E243 - July 1, 2025 - Yogita Parulekar - Founder & CEO @ Invi Grid Inc.

[00:00] Debbie Reynolds: The personal views expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.

[00:12] Hello, my name is Debbie Reynolds. They call me “The Data Diva”. This is the Data Diva Talks Privacy podcast where we discuss data privacy issues with industry leaders around the world with information that businesses need to know.

[00:25] Now I have a very special guest all the way from San Francisco,

[00:29] Yogita Parulekar. She is the CEO and founder of Invi Grid. Well, welcome.

[00:38] Yogita Parulekar: Thank you. I am excited to be here, Debbie, to speak with you.

[00:43] Debbie Reynolds: Well, I'm excited to have you here.

[00:45] I'm very excited to talk to someone of your caliber about cyber and all the things that you work on. And we don't have enough women on the show. Doubly excited to have you on the show.

[00:58] But why don't you tell the trajectory of your career? You've had such success and just all the different things you've done in your career around cyber. But how have you managed your career thus far and how you became like the founder and CEO of your own company?

[01:16] Yogita Parulekar: Oh, gosh,

[01:18] that's a long story. I will try to keep it short.

[01:22] I started my journey way back in 1996, when the word cyber hadn't really been coined and no one was talking about it that way.

[01:34] It was very interesting. The very first question that people ask me when they hear that is: how has everything changed? And I will tell you in one sentence.

[01:45] In those days, people would ask us, what exactly do you mean you're going to do for us?

[01:51] And today they ask us, just tell us how to do it.

[01:55] And better still, if you can tell us how to do it most efficiently and effectively,

[02:00] better, faster, cheaper.

[02:02] So the question has gone from what to how,

[02:05] which is good.

[02:06] But I wish we were much further ahead in those discussions.

[02:12] But a little bit about my journey: I started in '96, and I have since become head of security multiple times.

[02:21] And now I'm a founder.

[02:23] And yeah, I'm super excited about this phase of my journey as well.

[02:29] Debbie Reynolds: Well, you've also been recognized by Cyber Defense magazine, Cybercrime magazine,

[02:35] SC Media,

[02:36] and so you've been very prolific in your career. You've been someone that other people can look up to in terms of cyber, being able to really be out there and get your voice heard.

[02:49] Tell me a bit about what is happening.

[02:52] And by the way, I love your story about how people were asking what was cyber as opposed to how to do it.

[02:58] What is your idea of what's happening now in cybersecurity that concerns you?

[03:05] Yogita Parulekar: Oh boy.

[03:07] Very good question.

[03:08] There's a lot of things.

[03:10] So over time,

[03:12] cyber risk has increased, honestly, tremendously, and at an unprecedented pace.

[03:21] Let's break it down: what is cyber risk?

[03:25] Cyber risk is the risk of anything negative happening to the company and impacting the company negatively as a result of loss of confidentiality, integrity or availability of information or systems.

[03:38] And that, at a very general, high level, is kind of the definition.

[03:45] Well, how is that changing? What is impacting cyber risk, and why is it growing exponentially today versus maybe 20 or 30 years back when we started?

[03:57] It's the pace of growth, the unprecedented, exponential pace of growth of technology.

[04:03] Never before have we seen something dramatic happen in our lives every two years. Right? Four years back, we had the SolarWinds breach and we were just grappling with that.

[04:14] And whether it was SolarWinds or the Colonial Pipeline, the gas pipeline, or all of those issues, and understanding how cyber is becoming a systemic risk,

[04:24] barely were we trying to understand that when AI was thrust in our faces, with OpenAI launching GenAI a couple of years back.

[04:35] And last week, or rather was it this week,

[04:38] we heard that Google has launched its quantum chip,

[04:43] right?

[04:44] So this pace is growing by leaps and bounds.

[04:50] And people would say, I mean, on one hand it is phenomenal.

[04:55] Perhaps we can solve all the healthcare-related, disease-related issues at a much faster pace than ever before in our human history,

[05:05] but it does come with certain risks that we have to be very careful about.

[05:11] And one of them is the cyber risk. And I'm saying loosely cyber, but I do include whether it is security or privacy because they're very interrelated. And I'm happy to talk about how security and privacy are interrelated as well,

[05:25] but it goes beyond that now.

[05:28] So it's not just an enterprise risk, not just a systemic risk; suddenly even the World Economic Forum lists cyber attacks and cyber risk among the top global risks in its latest report.

[05:46] It includes misinformation, disinformation,

[05:49] societal impact, environmental impact, biases, ethics.

[05:54] All of that has been thrust

[05:57] into our domain, and we have to think about it. Even the weaponization of information, the malicious use of AI primarily, as well.

[06:07] All of this, and the geopolitics: think about it, all of this is happening at the same time.

[06:15] We are at a very sensitive point in our history, with so many wars raging, which could turn at any time in hopefully a good direction, but possibly a bad one.

[06:25] So a lot of things are coming together that are exponentially increasing cyber risk as a general category, and we really need to be aware of that.

[06:38] And we as a community have to take a much higher role, I believe, in addressing some of these things.

[06:47] It's our time to really make a difference for the world.

[06:52] Debbie Reynolds: Yeah, I agree with that. All the things that you said are concerning. I think one of the big concerns that I have, and I want your thoughts, is that with a lot of what has happened in cyber,

[07:08] there hasn't been enough public discourse about cyber as a profession and why we need to really look to leaders like you about cyber. Because, let's say you watch a TV show or something where they're talking about something to do with the public:

[07:28] you always see people on TV talking about finance, you always see people talking about health, and you rarely see someone talk about cyber unless there's a breach.

[07:37] Right. So the idea,

[07:39] you know, I would love for us to get out of the idea that cybersecurity is something like the fire department that you call when something bad happens, as opposed to something more holistic in terms of how organizations manage data, and also helping people not create more risk for themselves or for organizations.

[08:01] But I want your thoughts.

[08:03] Yogita Parulekar: Oh my gosh. You named it, Debbie.

[08:08] Debbie Reynolds: Yeah.

[08:09] Yogita Parulekar: I mean, we are always remembered when something goes wrong, right?

[08:14] As long as the house is clean,

[08:16] no one really bothers about who cleaned it.

[08:23] You are absolutely right. We need a much, much higher level discourse. We need that discussion at the highest levels.

[08:32] And you know what? I think we are beginning to see it in some ways and hopefully it continues.

[08:40] So if you go back many years, a lot of us in the community have been talking about let's build systems, right?

[08:49] Let's build software, right?

[08:52] Security people, traditionally, unfortunately,

[08:55] are neither writing the software nor building the systems.

[09:01] The engineers are writing the software and the operations teams are building the infrastructure on which the software will be deployed.

[09:09] So this has been a fundamental problem,

[09:13] and it was one of the biggest frustrations for me in my role as head of security or otherwise. Fundamentally, cyber hasn't changed: we have always gone back to the CIOs or the CTOs and discussed all kinds of misconfigurations and vulnerabilities that exist.

[09:35] Who likes to be told the mistakes that they or their teams have made?

[09:38] Nobody.

[09:40] So there has always been this tension, and always this perception that security comes much later,

[09:49] Right?

[09:51] And sometimes it is also business urgency.

[09:54] I mean, think about it. The CIOs and the CTOs who are building that software or those systems are under constant and immense pressure from the shareholders, from the investors, to go ship the product,

[10:09] right? The important thing is just ship that product, get to market fast. Let's start generating revenue.

[10:17] And think about it. If there is no product, there's no revenue; and if there's no revenue, what are we going to secure?

[10:22] So that's there.

[10:25] What we need to fundamentally change is this: security is something you need to think about while you're building that software, while you're building that infrastructure. And by the way,

[10:41] we need to show them, and this is probably where we have not done a very good job,

[10:46] that we can help you do it at the speed the business wants.

[10:52] If we don't marry speed and security, or speed and privacy, and help the people who actually build the systems get it done at the speed the business wants, we probably won't see much change or much traction at a very fundamental, basic level.

[11:10] Debbie Reynolds: Wow, that's so much to think about. Oh, my goodness.

[11:13] I want your thoughts. To me, this is shocking, and it maybe goes toward resiliency and also just the way people think of cyber. So I guess two things.

[11:23] One is I feel like,

[11:26] and I think you've been in the tech and data world probably as long as I have,

[11:31] but I think one of the shifts that has happened that people don't realize is like back, let's say back in the 90s, people were doing things in manual ways and then they were moving towards digital systems.

[11:45] And in that age they felt like, I could do things manually, but if I did things in a digital way, it would be faster. And so they thought that the tech teams and the things they were doing in technology were kind of a take-it-or-leave-it thing.

[11:59] It was like optional, right? And now I feel like some people who are in leadership may still think of it as kind of an optional thing,

[12:08] but that's not true today. Today, you basically can't do your job if you don't have these systems up. And one of the things that brought this home, I've been looking at it over the years, but one of the shocking things that happened this year, is the UnitedHealth breach.

[12:25] When that happened, I thought, oh my goodness, because I didn't know exactly what was breached. Right. And so I thought, okay, wow, that's not good, obviously, because depending on what information is breached, it could be people's health information and things like that.

[12:38] But then I started hearing from people, my friends who were customers of that company. They were saying, I could not get my medication,

[12:46] they canceled my surgeries, the doctors weren't getting paid. And I thought, how is this possible?

[12:52] How is it that we don't have enough resilience in the way that we do our work?

[12:58] That a cyber breach like that can just really bring health providers and people who need healthcare, that whole process to a halt? So that's a big question. But I just want your thoughts about kind of the thinking about the criticality of kind of cyber and cyber resilience now.

[13:18] Yogita Parulekar: Oh, boy, Debbie. Exactly.

[13:21] You're getting me more and more excited about this topic. We need to create that level of excitement, I think, with everybody. Excitement about what? About cyber.

[13:32] Understanding the need for cyber, that it is a business enabler,

[13:37] it is what builds trust with your customers. Right? I mean, today,

[13:43] B2B companies,

[13:46] B2B software companies,

[13:48] they're pretty much forced into at least getting a SOC 2 done, right?

[13:52] So,

[13:53] but what about B2C? They might be having sensitive data as well.

[13:57] We don't have any forcing function there.

[14:00] I mean, look at so many. I don't want to name any company, but so many B2C companies will come to mind immediately. Right? We collectively are not asking for that cyber resiliency.

[14:12] How many of us read the privacy policy?

[14:15] Pretty much nobody reads pages and pages of privacy policies.

[14:20] So what do you think?

[14:22] I think you and I know that some amount of regulation will be required for that to happen.

[14:28] We don't want regulation, because regulation always means slowing things down. But honestly, in spaces like security and privacy, we may just need something that becomes a forcing function.

[14:41] So hopefully it will be self-regulation by the industry, with something like an energy seal; I think the U.S. Cyber Trust Mark is an example. Something like that.

[14:52] Debbie Reynolds: Very cool. Well, I want your thoughts on cybersecurity and data privacy. So so many people confuse the two.

[15:00] I mean, just a lot of people. I literally had a phone call with someone in Europe today, and we were talking about ISO. I'm like, the things you're talking about aren't even privacy;

[15:09] that's cyber. Right? So I had to explain to them why they're different, but why they have a symbiotic relationship and work together. But I'd love your thoughts on that.

[15:19] Yogita Parulekar: And hopefully I answered your previous question on resiliency. I love that word, by the way. Resiliency. You got that?

[15:27] That's critical, right?

[15:29] But going back to the current question, this is also another very interesting topic:

[15:34] security versus privacy.

[15:36] So I like to explain this

[15:40] using an analogy.

[15:42] I do the onboarding new-hire training myself with every engineer, and I make this statement to them as well.

[15:50] So something like this.

[15:52] A bulletproof glass gives us security,

[15:56] not privacy.

[15:58] A curtain will give you privacy,

[16:01] not security.

[16:02] You'll need those two to make confidentiality happen.

[16:06] Debbie Reynolds: Oh, wow, that's great. That's a very simple way to explain it. Wow, I love that. I'm sorry, go ahead.

[16:13] Yogita Parulekar: So that is how I explain it to anybody. And then it starts clicking: oh my God, you know,

[16:21] things that give me security may not necessarily offer privacy protections.

[16:27] Like choice,

[16:29] giving the customer the choice; data minimization itself;

[16:33] data retention and then data deletion: end-to-end processes that you need for privacy.

[16:42] Those you have to layer on top of your security practices. So once you have your data,

[16:48] once you get that data, how do you secure it so that you can keep it confidential?

[16:54] But on the other hand,

[16:56] security goes beyond privacy in certain respects.

[17:02] So for example, your intellectual property information,

[17:06] right, your code:

[17:09] all of that is sensitive data, though not personal data that requires privacy protection.

[17:15] So that kind of information does still need security because it is sensitive or competitive intelligence, which gives you an edge over other companies.

[17:26] So beyond the analogy, I try to explain that one is for personal information and the other is for any sensitive information.

[17:37] I like to think about it like a Venn diagram with those two circles:

[17:41] there's a lot of overlap, but then there are certain processes and practices that are for cybersecurity and certain ones for privacy.

[17:49] Debbie Reynolds: Very good.

[17:51] I want your thoughts, and this annoys me to no end, but I want your thoughts about the resistance of some organizations to putting cyber folks on the board, or at least giving cyber people that level of visibility within the organization.

[18:11] So I feel like there have been some.

[18:14] Actually, I think it was the SEC rule: they stopped short of saying that you have to have a cyber person on the board. And there are a lot of big-name organizations that lobbied against that for some reason.

[18:26] I have no idea why. But what is your thought about having someone on the board, or with visibility to the board, who knows cyber?

[18:39] Yogita Parulekar: I would put it this way, and I think this goes to your resiliency comment.

[18:46] If cyber resiliency is critical to you, you carry sensitive information,

[18:51] if that's your business,

[18:54] personal information, other sensitive information that needs protection,

[18:59] it behooves you to have someone with that knowledge and expertise on your board.

[19:05] Now, whether that person,

[19:07] which committee that person sits in, what kind of expertise that person has:

[19:12] preferably someone who has been in an operational tech role, a cyber role. Perhaps a CISO or a CIO will help here,

[19:21] right?

[19:22] But it also,

[19:23] I think, sometimes goes back to everybody's impression of a CISO's or a CIO's role, which comes down to: oh,

[19:34] are they simply the naysayers?

[19:36] Are they simply the obstacle?

[19:38] Have they been an enabler of business?

[19:43] Will they understand the risk in which companies operate today?

[19:49] So let's say for example AI.

[19:52] Adopting AI is almost like: if we don't do it today,

[19:58] the company might be at risk of going extinct via a competitor, depending on what business they are in. If they are a customer support call center,

[20:08] they might not exist tomorrow if they don't adopt AI, just as an example. Right? So depending on your business,

[20:18] I think it depends on whether cyber risk and cyber resilience are critical to you

[20:27] vis-à-vis what else is important to you.

[20:29] Correct. Now, you may be a manufacturing business making chips, or a food company;

[20:34] in a food company, obviously, the food quality people are probably more important than a cyber person, perhaps. And I'm just making the point that not every company will need a cyber expert.

[20:53] But where the cyber risk is high because you have sensitive information,

[20:58] because you are hyper-digitized to such an extent that a breach may bring down a service, like a hospital,

[21:05] a healthcare provider:

[21:06] while you possibly need doctors on the board,

[21:09] you should perhaps also think about something like cyber resilience.

[21:14] Because in the absence of good cyber resilience,

[21:18] perhaps care may not be given, as in the UnitedHealth issue you pointed out;

[21:23] that's the example you gave, right?

[21:25] People are suddenly not getting healthcare services. Or, the way I like to put it: credit card data can be changed;

[21:36] you can change your credit card if it is lost, or if it is hacked or used by malicious people and is available on the dark web. You can't change your health record.

[21:44] So in some

[21:47] of these companies, the cyber risk is high.

[21:51] The resilience required is very high for the business itself.

[21:57] For those companies, I think it behooves them to have someone with that expertise available on the board.

[22:05] Debbie Reynolds: I agree with you wholeheartedly. I guess the analogy I give is this:

[22:12] I don't think any company would think that it didn't need accounting software,

[22:17] right?

[22:17] So it's like, I feel like cyber is like that fundamental to companies that are doing things in digital spaces.

[22:25] So it should just be table stakes in my view. And unfortunately, I think some people think of it as back again to the fire department example. It's like, okay, well as long as nothing's happening right now,

[22:36] you know, then you know,

[22:38] it's fine,

[22:39] everything's perfect.

[22:40] Then if something happens, it's like, oh my God, let's get these people on the phone, we have a fire to fight, and different things like that. And then also I guess I want to talk a little bit about,

[22:51] you know,

[22:52] I think people have a false impression of what their true cyber risk is. For example, years ago there was the Target breach, which was really huge, right?

[23:03] And some people, maybe in small and medium-sized businesses, saw that and they were like, well, you know,

[23:10] Target survived, so I guess we'll survive, right, if we have a cyber breach.

[23:15] But we know that statistics show a lot of small and medium-sized businesses go out of business within several months of having a breach. So you're not Target,

[23:24] you're not these other big companies that can really bounce back from those types of things. But what do you say to those maybe small or medium-sized businesses that aren't really thinking cyber is important enough for their business?

[23:42] Yogita Parulekar: I would say numbers and facts state otherwise.

[23:47] There are a lot more.

[23:49] Be aware, be knowledgeable that yes, it could hit you as well. It could be something as simple as someone clicking on a phishing link and you having a ransomware attack.

[24:03] It could be simple things like that which could put a small company out of business. And you should absolutely be aware of that.

[24:12] And see, I think the biggest thing that happens is that some of these companies cannot really afford to hire a cybersecurity expert.

[24:23] I would say to them that there are now a lot of CISOs who are

[24:29] offering vCISO or fractional CISO services.

[24:33] Have one of them on,

[24:35] have some kind of an arrangement with them, so that at least your fundamental cyber risk is assessed for you.

[24:42] You will know whether you are at high risk or not, and what kind of commensurate or reasonable precautions you need to take. Because ultimately everything is about reasonable precautions: have you taken at least reasonable precautions commensurate with the cyber risk that your company bears?

[25:02] If you are a small company in the ice cream business,

[25:05] one ice cream shop, you're probably not that high of a cyber risk, or even 10 shops like that:

[25:13] possibly still not a higher risk.

[25:16] Right. But if you are a small doctor's clinic,

[25:21] or 10 doctors' clinics, it may be a small setup, but you might have sensitive information.

[25:27] So your cyber risk very much depends on your business:

[25:34] what kind of data you are carrying,

[25:39] how your systems are interconnected, how it would impact your business if those digital records are not available to you.

[25:47] So all of that put together,

[25:50] I would still urge every small and medium business to at least have, like, a fractional CISO go through and do this assessment for you.

[26:00] And have some kind of insurance. I'm getting into tactical stuff now.

[26:06] Debbie Reynolds: Well, that's. Yeah, that's all part of it. I think some people think about cyber, especially when cyber insurance first became a thing, they thought, well I don't need all this stuff, I'll just get cyber insurance and that'll take care of everything.

[26:18] But what we're seeing, what I knew would happen, is that cyber insurance is becoming very expensive because of the amount of breaches that are happening.

[26:26] Cyber insurance are asking more questions of companies around their maturity in those spaces. And some people find it unaffordable based on kind of some of the things that they're doing in their business.

[26:41] But I want your thoughts on that.

[26:44] Yogita Parulekar: So in terms of cyber insurance,

[26:46] that's a separate topic, Debbie,

[26:49] a whole discussion topic. Cyber insurance premiums are increasing; they started going up. We all know that some of these insurance providers were almost decimated when ransomware increased fourfold over the Covid period.

[27:06] Do you remember those days, when suddenly everybody went remote, and the CISOs had to scramble and deal with remote workers, and make sure that everybody was trained in how they access systems and not to click on phishing links?

[27:23] Everything that was probably taking its own sweet time to happen within organizations had to be done immediately, overnight: how to work remotely.

[27:32] And ransomware went through the roof. Some of these insurance companies were impacted negatively, their business was impacted, and we saw the rates go up for all of us as a result.

[27:45] And since then they have stayed up, they haven't gone down.

[27:49] Never happens right with anything, any pricing.

[27:53] So.

[27:54] And you will see, you know, if you have taken some insurance for your business, like we all have to do, that everything has become a line item,

[28:04] every little thing. You may think you're covered for one thing, but you're not necessarily covered for it.

[28:11] So everything in the cyber realm has been broken down into line items.

[28:18] So yeah, read very carefully what is being covered when you get that insurance. First, get that insurance.

[28:26] Then read that document very carefully for what is covered and what is not covered. Yeah, that's what I would recommend right now.

[28:34] Debbie Reynolds: Yeah, that's a good recommendation. I want your thoughts a bit about the complexity and I think you touched on it a bit at the beginning and that's around AI.

[28:46] So what has AI done to the modern enterprise as it relates to cyber.

[28:53] Yogita Parulekar: It has expanded cyber risk horizontally, vertically, and in every dimension that you can think of.

[29:02] Right. So let's talk about it.

[29:04] It has not just remained about the data

[29:07] or the systems; now we have to consider all the other dimensions of using AI,

[29:15] from ethics and fairness to misinformation, disinformation, biases.

[29:21] Let's take a few examples. I think that is what will make things very clear.

[29:26] Let's take accountability of actions from AI.

[29:30] Who's accountable for that car crash if it happens?

[29:33] Who's accountable for the decision of denying loans to certain individuals?

[29:40] Is the AI accountable for that? Who is accountable?

[29:43] Is there transparency in how those decisions were made?

[29:49] So, if you just break down AI:

[29:53] you start with data. Simply speaking, you have the data on which models are trained,

[30:03] then you have the models themselves.

[30:06] So is there transparency? Do we understand how the models are making decisions? At some point, even scientists will tell you it has gone out of our hands.

[30:17] Right.

[30:18] So that has increased the risk exponentially by itself.

[30:23] Correct. The data itself, what is the source of that data?

[30:29] Does the data have integrity issues? Did the data have biases?

[30:33] Correct. Where did we get the data? Did we have permission from a privacy perspective to even have that data in the first place to make those decisions?

[30:42] So everything that we have thought about from a cybersecurity or a privacy perspective now suddenly starts applying in so many different directions in so many different dimensions as far as AI is concerned.

[30:59] So the risk has grown exponentially. And I don't think we all understand that risk.

[31:07] A lot of people will come to me. Just a few days back, I was attending a group of CTOs, and this was a big topic of discussion amongst them.

[31:18] How do we,

[31:20] what kind of policy should we put in place?

[31:23] What statements should we put in those policies?

[31:28] So everybody is still grappling:

[31:31] when and how do we use AI? What do we tell our engineers about using AI? We want to use AI to improve engineers' productivity,

[31:41] but then how do we guide them into using AI safely?

[31:45] So one part is building AI, but using AI as well.

[31:50] And if you think about how exponentially this risk is increasing: if you go to model registries today,

[32:01] hundreds if not thousands of new models are added on a daily basis.

[32:06] How do we stop everybody in the company from basically saying, yeah, I can download that?

[32:13] If you think about it, there is a corollary.

[32:17] The corollary is open source software. Did you have a policy on open source software? Can you use that same policy for the models that anyone can download? You should,

[32:28] right? So there are corollaries, there are similarities, and you can adopt them. If you were fundamentally strong in your policies and practices,

[32:37] there are some things that you can adopt and apply to the AI world as well.

[32:42] Whether it is the data usage,

[32:44] whether it is the model usage and how decisions get made using that AI, how AI gets used.

[32:52] But there are many different dimensions.

[32:55] Hopefully you are fundamentally strong,

[32:57] and you can apply similar policies,

[33:01] but there will be more that you will have to add on to be more transparent,

[33:06] to show accountability,

[33:08] ethics,

[33:09] fairness,

[33:11] add all of that into the mix in addition to security and privacy.

[33:15] Debbie Reynolds: Yeah, I agree. It's definitely getting more complex for sure. And then from a privacy perspective, what we're seeing with a lot of the laws and regulations, what they're bringing up enterprises that they didn't have before is that for example, they may need to delete data,

[33:31] right? Or they may need to provide a level of transparency to consumers that they've never provided before. But how does that make it more complex from a cyber perspective?

[33:42] Yogita Parulekar: Oh boy. I mean, inherently AI needs a ton of data.

[33:48] Think about it: right there,

[33:50] your risk has increased exponentially,

[33:54] right?

[33:55] And if you are not the one training the model and you are getting pre-trained models from somewhere,

[34:00] then you want to know what data it was trained on,

[34:03] whether there were any copyright issues on that,

[34:07] and again, depending on your business: what model you're using, what data it has used.

[34:11] So not just whether it was trained on any personal information that may now leak into your data, but also whether your data, and the personal information that you have, may leak into the model: is data being sent out by the model that you are using, for training purposes?

[34:30] So I think the best way to think about AI is to take a step back and watch your data flows. I think going back to the fundamentals,

[34:40] correct, to watch your data flows, but take it a step further: watch your own data flows, whether that data is going out. Look at the contractual clauses with that AI provider, whether they state categorically that they are not training with your data and that no data is going out.

[34:58] You want that statement in there: that your data is not even getting out of your boundaries, not being used to train their models.

[35:08] That's on one hand. But on the other hand, when I say your data, that is probably your customer data, your employee data, very sensitive information,

[35:18] or it could be even your code,

[35:20] if it's an engineering productivity tool, your intellectual property, and your other sensitive assets. But you also want to know what data it was originally trained on, so that you don't become liable for perhaps a copyright violation, perhaps a privacy violation.

[35:38] So again,

[35:39] back to the fundamentals, very strongly, with data flows and threat modeling,

[35:47] and the same good old cyber practices applied with the knowledge of AI and how AI operates. That is very critical.

[35:58] So it would behoove every cybersecurity person to basically get up to speed on how AI works.

[36:04] Right? We may not know it the way the AI and ML engineers know it, but knowing a little bit of that doesn't hurt: how models get written,

[36:16] how models get used.

[36:18] I think we will have to learn as well.

[36:21] There's a lot of learning. And I always say the most dangerous statement today,

[36:26] Debbie, and this is actually not from me, but from Grace Hopper of the U.S. Navy.

[36:33] And she would always say that the most dangerous statement is, “It was always done this way.”

[36:39] Debbie Reynolds: Right.

[36:40] Yogita Parulekar: That today is so much more true,

[36:45] that it is truly dangerous in the world of AI, in the world of quantum, in the world of crypto, and at the unprecedented pace at which we are going. If there is one takeaway,

[36:55] is that the most dangerous statement today is that it was always done this way.

[37:00] Debbie Reynolds: I agree with that. I will tell people the future is not going to be like the past. So if we're relying on the past to try to map out what the future is going to be,

[37:10] It just doesn't work that way. So we need to really do a rethink about everything that we're encountering. And knowing that maybe those old solutions don't apply in the same way in the future.

[37:22] But if it were the world according to you, Yogita, and we did everything you said, what would be your wish for privacy or cyber anywhere in the world? Whether that be regulation,

[37:32] human behavior or technology.

[37:35] Yogita Parulekar: I think it's all of the above, right?

[37:37] We want all of the above in measured doses,

[37:41] in the right ways possible,

[37:44] so that we can adopt the technology in a way that benefits humanity.

[37:49] So with guardrails. That would pretty much be my wish.

[37:53] I hope we are able to adopt it that way as a human race. We are at an inflection point today, Debbie.

[38:02] What we do today is going to have an effect over generations. I don't know whether everybody realizes that,

[38:11] but for every generation it is true, and it is more true today with the unprecedented pace.

[38:17] So I hope we are very aware and very cognizant of that and are able to take advantage of the technology for the betterment of the human race.

[38:29] So we better have those guardrails necessary.

[38:32] And guardrails can mean many things.

[38:36] All of the things that you mentioned, Debbie.

[38:38] But it is very important,

[38:40] I think going back to, I don't know,

[38:42] Isaac Asimov.

[38:44] He has three laws of robotics,

[38:48] his three laws of AI, if you look him up. I don't have them memorized right now,

[38:54] but they go something to that effect: that AI should obey humans, to the extent that it doesn't cause harm to humans or to itself.

[39:06] So it goes something like that. Those three laws of Isaac Asimov, I think, are more true today than ever in the past.

[39:15] I hope whatever we do is for the benefit of humanity, for the benefit of the universe and the nature around us that we operate in. I'm sorry if I'm getting very philosophical here right now,

[39:26] but.

[39:28] But let's take advantage. There is hope. Let's take advantage of the technology.

[39:34] It is giving us an immense opportunity to do things that we have never been able to do in the past.

[39:40] And let's take advantage of it. Staying within the guardrails. That's my wish.

[39:45] Debbie Reynolds: Yeah, that's a big wish. And I agree. I think this is an unprecedented time in history. And so you're right. I think what we do today will have a huge ripple effect over the next, you know, many, many decades.

[40:00] So it is very important for us to think, take that long view about what we're doing and try to put the right things in place for the future.

[40:09] Yeah. Well, thank you so much. It's been great. Oh, my goodness. So it's been amazing to have you on the show. I love all the things that you're doing and the way that you're thinking about these issues.

[40:20] It's very important.

[40:21] So thank you again.

[40:23] Yogita Parulekar: Thank you. Debbie. It was fun chatting with you.

[40:27] I love The Data Diva.

[40:31] I love it. And you're doing great, Debbie.

[40:34] Great work in this space.

[40:36] So I'm honored.

[40:37] Thank you.

[40:38] Debbie Reynolds: Thank you. Well, it's my pleasure to have you on the show. This is information that people truly do need to know right now. So you're really hitting the mark in that regard.

[40:48] Yeah.

[40:49] Yogita Parulekar: Thank you.

[40:50] Debbie Reynolds: Yeah. Well, I look forward to us being able to chat in the future and possibly collaborate in the future.

[40:56] Yogita Parulekar: Absolutely.

[40:57] Debbie Reynolds: Yeah. We'll talk soon. Thank you.

[41:00] Yogita Parulekar: Thank you.

