E182 - Derek Wood, Senior Director Product Marketing, Duality Technologies

37:50

SUMMARY KEYWORDS

data, privacy, organizations, ai, solutions, regulations, analysts, third party, access, encrypted, risk, business, technologies, companies, security, fully homomorphic encryption, model, protect, teams, create

SPEAKERS

Derek Wood, Debbie Reynolds

Debbie Reynolds  00:00

Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations. Hello, my name is Debbie Reynolds; they call me "The Data Diva". This is "The Data Diva" Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world with information that businesses need to know now. I have a very special guest on the show: Derek Wood. He is the Senior Director of Product Marketing for Duality. Welcome.

Derek Wood  00:36

Thanks, Debbie. Glad to be on.

Debbie Reynolds  00:40

Well, I'm excited to have you on the show. You and I have had some great chats on LinkedIn and chats on the phone as well. I think you're really sharp, and you understand the challenges that we have right now in the world as it relates to privacy and how companies can manage data. I'm not a doom-and-gloom person. It's not as though I feel that companies can't use data; I'm not a data-abstinence person. But I think the challenge is, how can companies use data in ways that don't slow their business down but also respect the rights of individuals? But I would love your intro, so that you can let people know who you are, what you do, and how you came into your role.

Derek Wood  01:29

Thank you. I'm flattered to have such an introduction from you, and it's always nice to follow what you're putting out. The opinions you share, I think, are valuable not just for the privacy market but for people in general. Privacy to me is really important and interesting because the work you do as an advocate, you do because it's the right thing to do. It's important for people to understand, and I think about it every time I have a conversation about my company's security or privacy practices; I always try to remember that there are a lot of other companies having that same conversation. So I want to make the right decision and do it the right way, not just check the box, because we're all in this together: the data that gets breached because we don't know where it is, or because we didn't know what type of data it was, or the breaches that happen because of things that we're currently allowed to do but are still working on securing. We're all a part of it. My background comes from mostly being in startups and starting new markets. I was part of the team that brought online backup to the masses in the early 2000s, then cloud disaster recovery, then data security, and now Data Privacy. The thing I think is really interesting and unique about the movement of privacy is, when you look at the adoption of solutions like security tech, governance tech, and reg tech, these risk-based solutions typically just solve the risk problem, and usually to the detriment of the business teams, which is why we have so many dark spots on our map. What's unique about privacy technologies is that people are starting to realize that they're not just about compliance; they streamline the way that we work with data and allow us to work with more data than we could ever access before.
I think we're starting to see people understand: holy cow, if we can keep data encrypted while still being able to use it, we can now go across borders where we couldn't even consider that before. It's also tough because there are so many efficiency challenges in data flows, whether national or international, that we've just lived with for so long. You just assume that this isn't a problem to solve; this is just a problem that we have to work around. The direction of travel for regulations is that you're going to have to take security, privacy, and governance far more seriously than before, and you also have to make sure that you can collaborate with others. That was mentioned by the EU, the US, and the UK, largely because of the experience with COVID. We had all this data from a global pandemic, but international teams really struggled to work together. Part of our preparedness for the next time around is, let's make sure we can do that more effectively. That extends to basically any innovation or growth in any industry, as these cross-border data flows are extremely important. It's how we generate value from the data we have with different technologies like AI and machine learning. These models are only as good as the data you can get to train them, so when you're customizing, you need more and more data. The winners and losers coming out of this last year, going forward, are going to be those that can really get that speed and scale of data, and the awareness that privacy techs are the solution, the path forward, is really coming up big. We saw the ICO in the UK, in a review of privacy techs, recommend: hey, by 2025, if you're moving and using a lot of sensitive data, you need to use solutions that use things like fully homomorphic encryption.
Then, the IMDA in Singapore did a case study with MasterCard, finding the same thing: using a fully homomorphic encrypted solution allowed them to share data across borders far more effectively while satisfying all the regulations. And that was across the US, UK, India, and Singapore. That awareness, I think, is growing, which is exciting, because businesses don't adapt very quickly. If it's just a risk-based solution, they wait until they're forced by regulation. But now that we see that this is actually a growth vehicle, the adoption should come a lot faster than the regulation, which is really imperative, especially as we look at the spread of AI and the technologies that people have a lot of concerns about.

Debbie Reynolds  06:12

Excellent. Well, I have two things I would love for you to do. One is to explain for our audience, just in case they don't understand, what fully homomorphic encryption is. Then also talk a little bit about the shift in companies understanding that privacy needs to be more proactive as opposed to reactive.

Derek Wood  06:34

So fully homomorphic encryption is a cryptographic method that allows us to encrypt data on both ends without moving it. Say you have data sitting in your data center, and I have analysts who get value from analyzing your information. We can encrypt the data on your end, and then we encrypt the access for the analysts, and they're allowed to ask predetermined questions about the data. Then the encrypted results get sent back. So at no point does the analyst have raw access to the data, and at no point does the data owner get to see the subject of the query. In, say, financial crime investigations, that may be confidential information; we don't want to leak out that we're close on the trail of El Chapo, right? We don't want someone to leak that we're close to getting him. So the data owner can say, hey, these are questions that I, as a bank, would allow you to ask, but I don't need to know who it is you're asking about. One simple way to think about it is that it's encryption for data in use, which is one of the things that made me really excited about getting into this market, coming from the security side. Using data was always a big risk with all these third-party data arrangements; you do what you can with just-in-time access management and different two-factor authentication flows, but ultimately you just have to trust that they're going to take care of your information. Now we have a way where we don't have to have that trust; we can just know that that risk is eliminated. And it really speeds up interactions between private-public entities, or public-public, when they're trying to do KYC, AML, and these sorts of investigations.
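The "encryption for data in use" idea Derek describes can be sketched with a much simpler cousin of fully homomorphic encryption: the Paillier scheme, which supports only addition on ciphertexts. The toy Python below (tiny primes, illustration only, and in no way Duality's actual technology) shows how two values can be added while still encrypted, with only the final sum ever decrypted:

```python
import math
import random

# Toy Paillier cryptosystem: additively homomorphic encryption.
# Multiplying two ciphertexts yields a ciphertext of the SUM of the
# plaintexts, so a third party can compute on data it cannot read.
# Real FHE schemes (BGV, CKKS, etc.) support far richer computations,
# but the core idea is the same.

def keygen(p=101, q=103):
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    g = n + 1                      # standard simplification for g
    mu = pow(lam, -1, n)           # modular inverse, valid when g = n + 1
    return (n, g), (lam, mu, n)

def encrypt(pub, m):
    n, g = pub
    r = random.randrange(1, n)     # random blinding factor, coprime to n
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n * n) * pow(r, n, n * n)) % (n * n)

def decrypt(priv, c):
    lam, mu, n = priv
    x = pow(c, lam, n * n)
    return (((x - 1) // n) * mu) % n

pub, priv = keygen()
c1, c2 = encrypt(pub, 20), encrypt(pub, 22)
c_sum = (c1 * c2) % (pub[0] ** 2)  # multiply ciphertexts => add plaintexts
print(decrypt(priv, c_sum))        # 42, computed without decrypting inputs
```

The blinding factor `r` makes every encryption of the same value look different, which is why the analyst side learns nothing from the ciphertexts themselves.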

Debbie Reynolds  08:29

Right. Also, I think the reason why privacy-enhancing tech is having its day in the sun is that a lot of jurisdictions are saying, hey, you really need to embrace privacy-enhancing tech; you know that you want to use this data, and you know that there are risks involved. I think, in the past, and maybe there are some latecomers to the party, they didn't understand that privacy has to be more proactive as opposed to reactive. So it's not something that you can really do effectively at the end of a process; it has to be more foundational. What are your thoughts?

Derek Wood  09:13

I agree, and I think the privacy market benefits from the bruises and injuries of the security space. You look back at the evolution of security, moving more and more out of the dungeons of the service department, and you have to have security by design, just like you have to have privacy by design. Now, with AI, we're seeing governance, which is typically the back half of how you show your accountability, also coming together. So we're seeing a convergence of the GRC, privacy, and security spaces. It has a lot of organizations looking at how to restructure their teams of specialists, who really need to be tightly integrated, and integrating with the business teams, because you can no longer separate what you want or need to do as a business to grow and succeed from what you're allowed to do and what is technically possible. So we're starting to see the merge. I think everyone wants to see it go faster than it is, but such is life. I'm glad that we're not going through the same bumps that the security teams have been going through for the last couple of decades. We seem to be a little faster, and it's getting adopted more in line with all the other changes.

Debbie Reynolds  10:32

Yeah. Well, tell me a little bit about it; you and I chatted about this, and I like the way that you phrase it: the privacy imperative. How does Duality help with that?

Derek Wood  10:44

The imperative, I think, is because privacy is largely cultural in terms of how people understand it and appreciate it. As Americans, we haven't had a real stark violation of privacy like other regions have. When I lived in Germany, some of my East German friends were very nontechnical folks, and they had an extreme understanding and appreciation for privacy. I thought that was really shocking, but then it kind of made sense. Many of them grew up under the Stasi, so they've had the experience of what it's like. You said it earlier: are you an optimist or a pessimist about the future of privacy? It's always easy to be negative; I feel like that's the easy way out. But people have to realize that we have a lot to lose, which really means we have a lot to protect. And so the regulations are driving us toward that path. If you want to do business in Europe, you're going to have to address Data Privacy more seriously. So I think organizations, whether their country has those laws or restrictions or not, are going to be pulled into it.

Debbie Reynolds  11:58

Also, I think there's a separation now, or there should be, hopefully, for people, between cybersecurity and privacy. Some people feel like, okay, well, we solved everything for cybersecurity, so that covers privacy, and it doesn't. What are your thoughts about that?

Derek Wood  12:20

Yeah, they're two sides of the same coin. Security, ultimately, is about protecting data, and the way that we do that has changed as the way our networks are configured and where the attack vectors come from have changed. You used to just build a strong moat around your network, and that would keep all your users and everything in there secured. Now data has more of a tidal motion across users and across services, so you don't have the luxury of that moat. You have to deploy security measures in a different way, and you're really trying to follow the data, because that's what people are after. That's where everyone in the cybercrime space is making money: either restricting access to data, or stealing it and then selling it on the black market. Privacy is making a similar type of move, where it's no longer just, oh, I have nothing to hide. Privacy violations are really difficult to predict, and once they happen, it's too late; your data is out, and for some reason that data is bad for you, even if it wasn't bad before. There's a case involving Google search history where law enforcement was trying to get a list of names of people who watched a specific YouTube video. That raises a major privacy issue: it wasn't illegal for me to watch a YouTube video; it was posted on my feed and came in. But now, because I watched it, my name is on a list that's being investigated by law enforcement. So that brings up a similar type of privacy violation, where you're operating as usual, lawfully, and then, at some point in the future, someone decides that's no longer okay. Everything that we do, and all the data about it, needs to be private.

Debbie Reynolds  14:10

I agree with that. Yeah, I think those are called keyword search warrants; I did a video about this a year or so ago. Anyway, people should be shocked and alarmed by stuff like that. Especially for businesses: I think businesses have done a great job of protecting their own valued assets, like their business processes, their secret sauce, whatever that is. But now there's this push to protect the data of individuals in a certain way and be more transparent about how companies are operating with that data. What's happening in the world right now that's concerning you as it relates to privacy?

Derek Wood  14:58

What's happening in the world that's concerning in privacy: it seems that we're on the precipice of some really important decisions, and we don't have forever to make them. The move for more secure and trustworthy AI and the EU AI Act are good. They're coming through as quickly as we can push them through the legislative process, which is tedious, somewhat by design, so it's good to see that we're taking action now. For those that haven't read through a lot of the guidance and regulations on how you use machine learning and AI models, the gist is to incorporate the data protection policies that already exist, as well as the governance. So if you're training models on data, can you protect the data while you're doing that? Again, a data-in-use protection strategy. I think what we're going to see is that the limitations of process-driven means of satisfying the security and privacy requirements of using these advanced models are going to again force looking into technologies that drive efficiency. We've seen it in the security space: first it was, we have to secure things, so let's put up a bunch of checkpoints; checkpoints slow users down, and then we started getting more and more guardrails, because technology solutions provide guardrails where you can gain efficiencies and growth. What Duality is doing is we have a privacy-preserving technology platform to do just that. Your users need to use sensitive data that creates great risk for the organization, or maybe they're not allowed to use it, or maybe it's a very expensive and long process to actually get the data to even start working with it. With a platform that is designed to protect data while in use, you gain a lot of efficiencies, and you get a lot of options that you probably didn't consider before.

Debbie Reynolds  16:51

I love your analogy when you're talking about people securing data in different ways, your moat analogy. I like to use a castle analogy, where we're like, okay, let's close the gates to the castle, and no one can get in, but we didn't know someone was flying a drone over the top of the castle so they could see the data, right? But I think AI brings another level, another wrinkle, to that attack surface, where data is being used in so many different ways now. What are your thoughts about it?

Derek Wood  17:26

There are so many possibilities, and they seem to grow on a daily basis; we're still in such a discovery phase of what is going to be done. That's kind of normal: we develop new tools, and there are the good use cases, where this is going to help people, help businesses, etc. Then there are the malicious uses: how is this being applied to attack us? It's concerning to see generative AI used to spoof video; I think that is probably one of the scarier things out there. Today, we rely on only so many senses to understand if something's real or not. Some of the deepfake videos that are put out are just so terribly convincing that unless you're really trained, it'd be impossible to figure it out. You remember the old email days: oh, you can just look at the email address and know if it's a bad email or not. Today, you can't always tell; maybe that email was hijacked, and it's a legitimate address from a legitimate organization. Generative AI has a lot of scary things behind it, as much as the really good things. Now, the governance of how AI models are trained, I think, is going to be a big part of how we can improve things in that sense, because answering the question of what data sets your model was trained on becomes a really difficult thing if you're just opening it up to the whole web. I think we're still working toward a solution to the deepfake problem.

Debbie Reynolds  18:59

It's definitely scary. I think voice spoofing is really, really tough. Video may take a bit more effort and time, but voice is easier to spoof, unfortunately. For organizations that rely solely on voice as a biometric, it's terrible.

Derek Wood  19:22

Yeah.

Debbie Reynolds  19:23

Because it's so easily spoofed. I think I read an article recently where attackers can enter a call without anyone knowing, spoof your voice during the call, mute you, and then take over as if they were you.

Derek Wood  19:40

Oh, my goodness. That's terrible.

Debbie Reynolds  19:43

Yeah, this is like Star Trek-level crazy stuff that's happening now. But I want to talk a little bit about data lineage and data provenance. What I've seen is that what a lot of these regulations require, and what a lot of these business processes need to build in now that maybe they didn't have before, is a way to create more tracking around what goes into these data models and what comes out, tracking that data from the beginning stages to the end. So I want your thoughts on that level of transparency that's being asked of businesses.

Derek Wood  20:24

Data lineage, especially now, looking at machine learning and AI, is really tough, and it was tough before. For those that aren't so familiar, data lineage is basically knowing the entire life story of a particular piece of data. That becomes really difficult because of the way data gets used. First, you would maybe collect it into some production servers, so you're hosting it. Then part of your organization needs to use that data, and you're probably not going to give them access to the production database; you're going to copy it over somewhere else and secure it so that they have a secure way to use it, maybe with a scheduled sync or a real-time data flow. But now you have two copies of the data: the production data and the data that's being used. Then let's say you've hired a third-party data-as-a-service company, which is pretty common; you'd say, okay, this company is going to give us a lot more data science expertise, data, and analyst resources, and they're going to give us some really good insights from our data. So now you create another copy, because maybe you have to redact some of it; you can't just give everything to them, or it's slightly different from the data set you're using internally, and you want to keep the data separate. So now they have three copies already; it becomes very difficult to have a documented data lineage. I don't know how close we are to a solution to data lineage in general. But looking at some of the ways that privacy technologies can be used, we're probably not too far off from at least greatly simplifying it. If you had a production data set and you were using our platform, you could encrypt that into one new location, and then you could join additional analysts, whether internal or external, to do only pre-approved things with that information.
So now you can have maybe four or five analysts working on the same data set with different purposes and different restrictions, but none of them have access to it; they're just pre-approved to run different computations or queries and then get those results back encrypted so that they can provide value. That's one easy way where you can start to see not only data minimization but data lineage becoming a lot simpler of a problem, if you don't have to duplicate for every use case. I believe Gartner found that 40% of organizations have at least 10 to 15 copies of any particular data set, so it becomes very difficult. With AI model training, say organizations need to run inference on some real third-party data; they may struggle to even get a third party to agree to give them that data. And they don't want to give their model to the third party, because if that model is leaked, that's their IP, and that can be a problem. So they would need a flow to securely train on that data, run the inference, and then get the insights back, which would give you some governance and reporting: okay, at this point, this model was run against this data set, and now we know we have it. I think there are solutions yet to be developed that are on their way to giving us those answers. Some of the people pushing against these regulations will say, oh, how are we supposed to do anything with these rules? We can't go over here to get data; we can't go over there to get data. How are we actually going to do this? How do we even make money with our model if there's no secure way to prove value to a customer without exposing either the model or the data? So these solutions are going to be there to help the compliance part, but the adoption is going to come faster, because people realize that it allows them to do so much more in a safer fashion.
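The governance-and-reporting idea Derek sketches, knowing which model ran against which data set and how many copies exist, can be pictured as a minimal lineage ledger. The Python below is purely an illustrative data structure with made-up dataset and actor names, not any product's actual schema:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Minimal lineage ledger sketch: every copy of a dataset and every
# computation run against it gets an auditable, timestamped record.

@dataclass
class LineageEvent:
    dataset: str
    action: str            # e.g. "copied", "redacted", "model_inference"
    actor: str             # who or what performed the action
    detail: str = ""
    at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

class LineageLedger:
    def __init__(self):
        self.events = []

    def record(self, dataset, action, actor, detail=""):
        self.events.append(LineageEvent(dataset, action, actor, detail))

    def history(self, dataset):
        # The "life story" of one dataset, in order of recording.
        return [e for e in self.events if e.dataset == dataset]

ledger = LineageLedger()
ledger.record("customers_prod", "copied", "etl", "to analytics replica")
ledger.record("customers_prod", "redacted", "etl", "PII stripped for vendor")
ledger.record("customers_prod", "model_inference", "vendor_model_v2")
print(len(ledger.history("customers_prod")))  # 3
```

In a platform where analysts only run pre-approved computations against one encrypted copy, each run appends one event like this, instead of lineage fragmenting across many duplicated data sets.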

Debbie Reynolds  24:11

Yeah, as you were talking, it had me thinking about internal access and things like unauthorized access, where people within an organization maybe have too much access to certain things. I feel like privacy-enhancing tech can really help in that way, because organizations really need to think through why people need access to certain things and make sure that certain data is only exposed to certain people. What do you think?

Derek Wood  24:46

One hundred percent, and that's one of the big reasons that just-in-time access management systems have come out and there's more attention on them; it's almost like another workaround. We know we have to give this person a risky level of access to sensitive information or systems, so a way to protect ourselves is that we're not going to give them perpetual access; we're going to give them a quick way to be pre-approved at certain times. There was a breach last year involving Atlassian and a bunch of employee HR data with a third-party data-as-a-service provider. The analyst at the third party got phished, their access to the data went to the hackers, and the hackers were able to leak all the data. That was a big problem. Today, we don't have to accept those types of risks. You can use a platform like what we've built: you could have given the analysts encrypted access with pre-approved queries or statistics, so even if the hacker did get access to the software, they wouldn't have had access to the data. We wouldn't have had the big breach event, and you would have also had the governance controls to see, hey, there are a lot of questions being asked that weren't approved, so they're not running, and we'll reject them. Or there are a lot of queries happening way too quickly; the frequency of these requests is abnormal, so flag that and make sure everything's appropriate and this is authorized access; at least we've authenticated that this is the analyst who is supposed to have access to things. So again, I think the convergence of GRC, security, privacy, and the business operations is unavoidable, and we're starting to see it. I think those organizations that take big steps first are going to be well rewarded over the next few years.
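The two controls Derek describes here, only pre-approved queries run, and abnormally frequent requests get flagged, can be sketched as a simple gateway. Everything below (query names, thresholds, the class itself) is an illustrative assumption, not Duality's actual implementation:

```python
import time
from collections import deque

# Pre-approved query names an analyst is allowed to run (illustrative).
APPROVED_QUERIES = {
    "avg_balance_by_region",
    "txn_count_last_30_days",
}

class AnalystGateway:
    """Toy gateway: rejects unapproved queries, flags abnormal frequency."""

    def __init__(self, max_requests=5, window_seconds=60):
        self.max_requests = max_requests
        self.window = window_seconds
        self.recent = deque()      # timestamps of recent approved requests
        self.audit_log = []        # governance trail for reporting

    def submit(self, query_name, now=None):
        now = time.monotonic() if now is None else now
        # Control 1: reject anything outside the pre-approved set.
        if query_name not in APPROVED_QUERIES:
            self.audit_log.append(("rejected", query_name))
            return "rejected: not pre-approved"
        # Control 2: flag abnormal request frequency inside the window.
        self.recent.append(now)
        while self.recent and now - self.recent[0] > self.window:
            self.recent.popleft()
        if len(self.recent) > self.max_requests:
            self.audit_log.append(("flagged", query_name))
            return "flagged: abnormal frequency"
        self.audit_log.append(("ran", query_name))
        return "ran (results returned encrypted)"

gw = AnalystGateway()
print(gw.submit("avg_balance_by_region"))    # ran (results returned encrypted)
print(gw.submit("select * from customers"))  # rejected: not pre-approved
```

In the phished-analyst scenario, even stolen credentials could only ask the pre-approved questions, and a sudden burst of queries would trip the frequency flag and land in the audit log.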

Debbie Reynolds  26:43

I think so. As we start to see so many more companies really jump on this AI train and go full force ahead with digital transformation, I think it creates more complexity within the organization. So they really need a way to be able to manage some of this complexity and be able to do business without creating more risk. What do you think?

Derek Wood  27:11

Last year, I really thought that the big value we were going to get was going to be more on the front end of your app development and your general data acquisition strategy. I think there's still a lot of value there; now you can secure larger volumes of more diverse data, which every model needs. But what's been really interesting is we found that it's actually on the go-to-market side that a lot of organizations are finding they lack options to move at a speed and scale that really gives them growth from their AI or data strategy. For instance, we have a data partner who provides data to their clients, and they're getting a big influx of AI model developers who want to use their data to run inference or validation for various use cases: predicting investment trends, predicting real estate growth, and so on. So, predictive models of how you use data to minimize the risk of big, expensive business decisions. The rub came in: well, this data, we can't just send it to you, because it's quite sensitive, and if we do send it to you, it's too expensive; for that AI model company, a big bulk purchase can be very expensive. Then you have to ingest, house, protect, and govern, all the fun stuff that comes with additional data, and it flies in the face of any data minimization effort. A lot of these companies maybe have a small budget, maybe an R&D budget for a larger organization. The data provider says, well, it'd be a lot cheaper if you send us your model; we'll run it on the data, and then we can send the results back to you. But then nothing happens, right? The data provider just can't trust, even with a legal liability contract with a third-party data partner, that the data is not going to flow and get distributed elsewhere or leaked accidentally. Then they've created a second competitor with a very expensive data set.
That's their business. We developed a workflow where neither side has to make that compromise. That's been very exciting for these organizations, because they're thinking, man, data is our business, AI needs huge volumes of data, we should be raking it in, but we just don't have a solution that lets us do that effectively. So their engagements with these types of organizations are a lot smaller and less frequent and take more time, because now they have to go through a lot more trust exercises, data redaction, and different cleanup to protect both sides. Having a way to protect a model while running it, and at the same time protecting the input data, is a big opportunity for growth today.

Debbie Reynolds  30:02

You touched on third-party risk; I would love to talk a bit more about that. To me, first of all, the climate around data sharing has changed because of regulation and some of the bad things we're seeing happen with data breaches and unauthorized access. But what I think it has created on the business side is that businesses are creating contracts where they push more requirements down to third parties around their data practices. For some third parties, that will shut them out of opportunities to work with those first-party data companies, because they may not have the security, or the traction, to keep up with those obligations. But I think what you're discussing reduces that friction by using something that reduces the risk on the first-party data side and also makes it easier for those third parties to comply without creating more risk for the organization. What do you think?

Derek Wood  31:12

Absolutely. In any third-party relationship, you want to put your customer at ease. If I'm providing you with data services, IT, or security services, I'm constantly looking for ways to make the customer feel that not only do I have their best interests in mind, but I'm actually doing all the right things: you can rest easy with us as a data partner, because we're using a platform where we never actually get access to your data. We're no longer part of your assessment of third-party breach risk and supply chain risk management, of software supply chains and third-party operators.

Debbie Reynolds  31:53

Yeah, I think that question, or that issue, comes up so much. I've seen big companies say, well, we're not going to use your company as a third party because you don't have X, Y, and Z, whatever X, Y, and Z are. And I'm seeing it be a barrier to people adopting third-party companies' tools or services, and those companies don't really think about it, because they think, okay, I created a product, my product is awesome, this company really likes it. But then they can't really answer those questions about how they protect or encrypt data. What do you think?

Derek Wood  32:33

Yeah, the technology solutions are what give us the speed and scale of data that we need to move forward. Every advancement where we can provide technical guardrails versus process-driven solutions is going to be how we maximize data faster than everyone else. So for these third parties, even if you're just housing sensitive data, it's a major value add if you can say: we're not going to add more risk. We see the California Delete Act has now defined a whole new level of risk that organizations weren't previously responsible for. How can we continue to grow our business operations while satisfying these regulations? Again, that's the exciting part about the privacy technology segment: that's what these things are designed to do. You're increasing access to data while simultaneously improving its protection. That is a real path to growth. Some organizations are going to adopt faster or slower than others. In some cases, organizations are put in a position where they're thinking, oh, man, we never had this opportunity before; our data is not even ready for that; there's a lot of data cleaning we have to go through. As people start to realize this, we have some big organizations out there today that are leading the charge. I mentioned MasterCard already, but also Tel Aviv Medical Center and the Dana-Farber Cancer Institute; these are organizations that are already embracing this new set of technologies, in one part because it satisfies regulation. But for the most part, the feedback is: we're getting so much more value out of our data, spending less time figuring out how to work on it and more time just doing the work.

Debbie Reynolds  34:15

So, if it were the world according to you, Derek, and we did everything you said, what would be your wish for privacy anywhere in the world, whether that be regulation, technology, or human behavior?

Derek Wood  34:30

Debbie's coming in with the small questions. The direction of travel for privacy regulations is great; I'd like to see it speed up, especially here in the US. I think it's becoming more and more familiar why it's important to keep data safe, but time will tell how quickly we'll move. As for the adoption of these technologies, I don't care if it's only because of how much more money organizations are going to make with them. If that's the only motivation, the data is still much more protected than it was before. So I'm happy that we have a set of technologies with that motivation, because money talks way faster than doing the right thing or checking the compliance box. So I guess my big wish would be just faster adoption by all these organizations holding financial data and patient data. With insurance providers, we see breaches all the time. All these organizations that have large volumes of personally identifiable information in any context, I'd like to see them adopt the techs that are available today.

Debbie Reynolds  35:37

I agree; why swim upstream and make your life harder when you don't have to? I think the complexity of computing will force people this way. I see a lot of companies that either shied away from doing certain things or tried to do certain processes manually, and they figured out they couldn't. So I think this is the wave of the future, and you moving into this area really educated, which I feel is what you do: very much educating people about what's available, what's out there, and what they can do to actually accelerate their business and their growth.

Derek Wood  36:17

Absolutely, it's all about the speed and scale of data. These regulations are going to go right up against those efforts, so if those teams aren't looking for solutions today, they should be.

Debbie Reynolds  36:29

Excellent. I agree with that wholeheartedly. Thank you so much for being on the show. I loved having you on the show. Thank you so much for being a great sparring partner with me on LinkedIn; we always have interesting chats. You just have such a wealth of knowledge and understanding of tech in general and where we're going. So, I'm excited to see how things turn out.

Derek Wood  36:51

Yeah, thank you so much, and I'd love to see people come and visit us at dualitytech.com. One of the unique things is you'll not just see content about solutions we have in the market; we also have a very deep technical R&D team. So we are very much technology leaders in developing privacy techs and combining them, but we're also leaders in bringing them to market as solutions. Whether you're more on the technical side or you're trying to solve business problems, we have a lot to offer.

Debbie Reynolds  37:21

Yeah, check out your website, which is dualitytech.com. Excellent. Well, thank you so much, and we'll chat soon for sure.

Derek Wood  37:35

All right. 

Debbie Reynolds  37:36

Thank you.

Derek Wood  37:38

Thanks, Debbie.
