E242 - Karina Klever, CEO and CISO of Klever Compliance, Governance, Risk & Compliance Centers of Excellence
[00:00] Debbie Reynolds: The personal views expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.
[00:12] Hello, my name is Debbie Reynolds. They call me the Data Diva. This is the Data Diva Talks privacy podcast where we discuss data privacy issues with industry leaders around the world with information that businesses need to know.
[00:24] Now,
[00:25] I have a very special guest on the show, all the way from California,
[00:29] Karina Klever. She is the CEO and CISO at Klever Compliance. Welcome.
[00:38] Karina Klever: Thank you. Such a pleasure to be here.
[00:41] Debbie Reynolds: Well, it's a pleasure to have you on the show. You and I have been connected for quite some time on LinkedIn.
[00:47] You had made a comment on something that Jeff Jokic had written and it was unbelievable. I was like, yes, yes, yes. Oh, I have to reach out to her. Like, she's totally like, you're hitting it on all fronts.
[00:59] So I wanted to make sure that we connected. And we did. And so we had a fantastic conversation. And to me, those are like the best where you feel like you literally could have recorded that and that would have been a podcast, but we decided to have you on officially so we can talk about all things compliance and governance risk and compliance related.
[01:21] So before we dive in, I would love for you to introduce yourself.
[01:26] Tell me about Clever Compliance and what you do. You're phenomenal.
[01:31] Karina Klever: Oh, well, thank you. Thank you so much. So, about me: I've been in IT since
[01:38] April 1989.
[01:40] I was a computer operator. I became an AS/400 programmer, and I got sick and tired of getting bad specs.
[01:49] So I became a project manager, because the project manager kept giving me the specs. And then back in '02, an old colleague, a C-level, called and said, you've got to get these auditors off my back.
[01:59] And I said, you must have the wrong number.
[02:02] But sure enough, he actually did intend that I start helping him with an IT audit program office, which I did.
[02:09] So I've been doing GRC related activities over 20 years now.
[02:14] Definitely love it completely. GRC's my jam. I've worked with all kinds of regulations and frameworks across biotech and banking and retail and pharmaceuticals and healthcare,
[02:28] all kinds of different environments, mostly SMBs and some enterprise-level clients. So I opened up Klever Compliance, now eight years ago, to help people kind of escape the monotony of the traditional approach to compliance, which usually consists of these checkboxes that are ethereal and vague and nebulous and don't even really apply to them.
[02:54] So it's just kind of about appropriating controls specifically to our clients.
[03:00] Debbie Reynolds: That's amazing.
[03:01] One thing I want your thoughts about, and I think this is so funny, so I think that we are kindred spirits because I've always loved true people who understood governance, right?
[03:14] Because I can't really do anything well in an organization where you're dealing with data if you don't really have governance. Right. So that's why I think we get along really well.
[03:23] But what is it that companies get wrong about governance, risk and compliance?
[03:32] Karina Klever: So there are these multi-thousand-row checklists, and all of a sudden the focus seems to shift onto the checklists and these spreadsheets as opposed to actually understanding your own environment and putting in controls for your actual operations. And when you get through these checklists,
[03:56] they're great reference points to kind of keep you straight. But you need to know that you will never follow them verbatim, and you shouldn't, right? So frameworks are built by design to be ethereal and broadly applicable.
[04:11] You know, frameworks are born when really smart people sit around a room and they decide to come up with statements that are broadly applicable. And that's the point of a framework.
[04:22] It's not made for your company specifically or your industry or your level of maturity or the size of your company or the technology stack that your company has. And so I think where people get really stuck is they hyper focus on these checkboxes to go finding these ethereal controls instead of actually appropriating to their actual operations.
[04:45] Debbie Reynolds: I agree with that. I tell companies a lot of times because a lot of times when I talk with them,
[04:51] they're concerned. Maybe they saw an article in the news or maybe there's some company got in trouble for doing something and they're like afraid, oh my God, like we need to jump on this.
[05:01] And it's like you don't even do that. So,
[05:05] so why are you even concerned about it? This is not even in the sphere of what you're working on or what you're doing, right? So a lot of it is really right-sizing it to
[05:15] what the company actually does, as opposed to, you know, like I say, staring at your own shadow on the wall and being afraid.
[05:23] What do you think?
[05:24] Karina Klever: Yeah, well, we have this thing called N to the fourth. N to the fourth is now, next, near, never.
[05:30] So, you know, it's a very non-sexy, non-AI, non-fun decision-making cycle of sitting in a conference room, trying to turn off the distractions, and looking at that big incoming document of 7,000, 12,000, 16,000 rows and basically coming in and saying, hey, what are you guys doing now?
[05:51] Right? Identifying the now and making sure that it's documented. Like people don't have a governance policy.
[05:59] Your governance manages your entire operations. How are you going to keep track of your operations along the path of the true GRC cycle, right? So making sure that your documents are all in place, the controls are documented correctly, the frequency on the controls is accurate.
[06:21] A lot of people like to say frequently.
[06:24] Well, what the hell is frequently, right? To me, frequently is once a month. To you, frequently is once a quarter. The auditor comes in and says, well, to me, frequently is once a week.
[06:34] You've already failed, right? So why don't you just put down that you do it once a month instead of using words like frequently, occasionally, periodically,
[06:43] sometimes. That's not an executable control.
[06:46] And additionally, you're never going to be able to actually automate that control if you use those ethereal words, right? So then we have the next: as you're going through this long spreadsheet, identify what you're going to do next.
[07:00] You have your bucket of what I do now. This is your big fat win. This is your huge low hanging fruit.
[07:07] Make sure that there's tangible controls that you can gather evidence for passively if possible, long term for stuff you're already doing. Get credit for that.
[07:15] Identify the stuff you're coming up with next, in one year, or half a year if you're smaller, two years if you're huge, right? Then there's near. Near is your three-to-five-year goal.
[07:27] Three to five years out, on your roadmap, on your quarterly roadmap, is a set of controls. And then there actually is a never. So I'm going to tell a really quick story. And I do want to encourage people to understand the difference between a regulation and a framework.
[07:43] They're very, very different, right? A lot of people combine them into one blob, and they shouldn't do that; they shouldn't treat them at the same level. But the never column is really the inapplicable column.
[07:57] So the example I'm going to give is from a biotech that made drugs, right? It's a QMS process: you put a pill in your mouth, and at the end of the day there's a serial number on it.
[08:07] So it's traceable for your health purposes, right? So pharmaceuticals have to abide by the FDA. The FDA releases 21 CFR to pharmaceutical companies.
[08:18] One of the sections, 135.115, says if you make goat milk ice cream, you have to use goat milk.
[08:26] Well,
[08:27] the pharma company I happened to be at didn't make any goat milk ice cream. So what we needed to do as part of this never exercise, this N-to-the-fourth exercise of now, next, near and never, is say: I acknowledge this title, but we don't make goat milk ice cream.
[08:44] So here's why we accept this risk. We're going to put it far, far away and not even really track to it because this doesn't ever apply to me. So that's your never.
[08:53] There is actually a hint of an N to the fifth. N to the fifth is new.
[08:59] That only comes with maturity. It means that you can now cycle all the way back to those influence documents. You've identified your now; you have your next, your project plan against that.
[09:10] You know what your near is on your strategy, your roadmap, and what you're never going to do. But if that regulation or that framework has changed and it may affect you, you want to loop back around. That's N to the fifth.
[09:23] That's your new. You want to acknowledge those new controls and feed them right back through that first N to the fourth: now, next, near, never.
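The now/next/near/never triage described above can be sketched in a few lines of Python. Everything here is an illustrative assumption: the `Control` class, its field names, and the two-year cutoff between next and near are invented for this sketch, not taken from any tool either speaker uses.

```python
from dataclasses import dataclass

@dataclass
class Control:
    text: str             # the framework statement, e.g. a CFR section or CSF row
    applies: bool         # does this ever apply to our operations?
    doing_today: bool     # are we already doing it (document it, get credit)?
    horizon_years: float  # planned horizon if we aren't doing it yet

def triage(c: Control) -> str:
    """Place one framework row into a now/next/near/never bucket."""
    if not c.applies:
        return "never"   # acknowledge it and record why the risk is accepted
    if c.doing_today:
        return "now"     # low-hanging fruit: document it, gather passive evidence
    if c.horizon_years <= 2:
        return "next"    # goes on the project plan
    return "near"        # three-to-five-year roadmap item

goat_milk = Control("21 CFR 135.115 goat milk ice cream", applies=False,
                    doing_today=False, horizon_years=0)
mfa = Control("Enforce MFA on remote access", applies=True,
              doing_today=True, horizon_years=0)
print(triage(goat_milk))  # never
print(triage(mfa))        # now
```

The "new" column (N to the fifth) would simply be a fresh batch of `Control` rows fed back through `triage` when a regulation or framework changes.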
[09:32] Debbie Reynolds: Yeah. One thing that I've experienced going into companies and looking at what they're doing in governance, typically they call me in because they want to find documents or find information or get to the bottom of something.
[09:46] And so typically the first meeting is people rushing towards me with documents, information, and people with big titles telling me about what they do. And so I let them talk and give me all the information, and then I say, what do you actually do?
[10:02] Karina Klever: Exactly?
[10:03] Well, my usually, my first question is, where's your data?
[10:07] That's usually what I start with.
[10:09] And nobody can answer this question.
[10:11] Yeah,
[10:13] just tell me where your data is.
[10:15] I mean, it's usually my first question out the gate and then I'll check in with them every month for like six months. And then they call me and they're like, well, I think I kind of found 40% of it.
[10:26] Right. So I mean, you have on one side this huge black hole called a vendor,
[10:33] right? You have these kind of CYA contractual obligations that almost restrict transparency.
[10:41] Right. So you need to be asking very, very specific questions of your vendors. You know, stick to NIST CSF 2.0: are you transporting my data? Are you storing my data? Are you processing my data?
[10:52] If so, where.
[10:54] Right.
[10:55] Make sure you're understanding how that vendor is handling your data. Because most of our breaches, it's because I've relinquished my data to my vendor,
[11:04] but then that vendor has gone and sent it to their vendor.
[11:09] Well, that vendor doesn't want to do any more on storage stateside, or whatever the thing is, and then they go send it somewhere else. The downstream controls continue to weaken as that data makes those chain-link hops downstream.
[11:27] And a lot of people honestly just don't know where their data is.
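The vendor questions listed above (are you transporting, storing, or processing my data, and if so, where, and who do you send it to downstream) can be sketched as a simple questionnaire gap check. The dictionary keys and the function are invented for illustration, not part of NIST CSF or any real vendor-management product.

```python
def vendor_gaps(answers: dict) -> list[str]:
    """Return unanswered or risky items from a vendor data questionnaire."""
    gaps = []
    for q in ("transports_data", "stores_data", "processes_data"):
        if answers.get(q) is None:
            gaps.append(f"no answer: {q}")                       # question was dodged
        elif answers[q] and not answers.get("locations"):
            gaps.append(f"{q} is yes but no locations disclosed")  # the "if so, where?"
    # The downstream-hop problem: data handed to the vendor's vendor.
    if answers.get("uses_subprocessors") and not answers.get("subprocessor_list"):
        gaps.append("sends data downstream but names no subprocessors")
    return gaps

answers = {"transports_data": True, "stores_data": True,
           "processes_data": None, "locations": None,
           "uses_subprocessors": True, "subprocessor_list": None}
for g in vendor_gaps(answers):
    print(g)
```

An empty return list would mean every question was answered and every "yes" came with a disclosed location and subprocessor list.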
[11:31] Debbie Reynolds: Yeah, I think that's true.
[11:33] One thing, as you're talking, I'm just thinking to myself of the statements I've heard people say that just crack me up. So I'm just going to throw these at you.
[11:41] One of them: a company basically said, so when are we going to be compliant?
[11:46] So I mean, what would you say? How would you explain compliance to them or someone who said that to you?
[11:52] Karina Klever: Well, what I would say is you never actually really arrive. It's a journey, right? You can do the heavy lift, you can do the design and that initial implementation, and you can get your controls up to date, right?
[12:04] So whether those controls are housed inside policies or processes or standards or templates or KBAs or a napkin, for God's sake, sitting in a filing cabinet, right? Wherever those actual controls live, under whichever published document, that is the heavy lift.
[12:21] That is the design of the actual operations that has to be identified. And you have to figure out how to gather that passive evidence long term.
[12:30] Then it becomes more of a: how am I adhering to my own operational controls? Because look, when you have a well-written control,
[12:40] if that control fails, it needs to instantly hit the risk registry. Now, risk registries that consist of ethereal controls: you know, there may be an earthquake in LA.
[12:51] Absolutely, there may be very soon.
[12:55] And that's an important ethereal control. Karina might be walking down the street in Boston and a brick might fall on her head.
[13:01] Sure,
[13:02] there's a lot of construction there. When I was there a few weeks ago, beautiful work happening in the downtown area. But the real risk I care about as an executive,
[13:12] the real risk I care about as the person responsible for the operations is what control is supposed to be there and why is it failing?
[13:23] I want to fix that. I want to know what's broken in my actual operations. The problem is, the way most of our documents are written right now,
[13:31] they're not conducive to measuring the break;
[13:35] we haven't defined the controls well enough, so when it breaks, we don't know. So we write the controls to say, hey,
[13:42] I know exactly what's expected of this control and I actually know exactly how to measure its failure so I can hit it, I can put it over into the risk registry.
[13:53] Now we can sit down around a table and say, is this risk really important for us to fix, or is it not? Maybe it's an easy fix. Maybe we haven't had a CAB once a week because we don't have the changes.
[14:07] Well, let's go update the policy to say we do a CAB every other week, not every week. Maybe it's just really not a big risk right now. But if we release something into production that causes multiple incidents,
[14:20] you know, without doing a BIA, like CrowdStrike did, as an example (that was just a change management fail, right?), if we go release that, well, now that's a heavy risk.
[14:31] Let's go do an RCA and figure out really what went wrong and who approved this change and how and what went wrong in the execution of this change.
[14:41] So you never arrive. You never do.
[14:46] Debbie Reynolds: I think that's true. I think that's true. One thing that I've seen, especially with small to medium-sized businesses. I don't see it as much with larger companies, because larger companies tend to have bigger teams.
[14:57] Well, it depends on the company, how much they care about compliance, right? They tend to have people who have specific roles within the company, and the part of their job that's related to GRC is more well defined.
[15:14] But in a small to medium-sized business, you have people who may not have the role, quote unquote, but they're kind of wearing that hat; maybe they have a task that they're given.
[15:27] And so the problem that I end up seeing is that when I read through people's documents, they say, oh, the person who does this does this thing. And it's like, so where is this person?
[15:37] Who is this person,
[15:39] who is this magical person that does things frequently or accordingly and stuff like that? So there are a lot of things that companies write into these policies and procedures, but there really is no person who is assigned to them. And especially if it's not documented really well, if that person leaves the company or goes to a different role,
[16:00] it just is like an orphaned control. But give me your thoughts on that.
[16:05] Karina Klever: So I actually came across this very recently, and it was kind of tragic to see. They had paid for one of those templatized, you know, get-your-SOC 2-overnight tools that have these canned templates in place, basically a fill-in-the-blank template.
[16:25] They were genuinely afraid to make any changes against that template because their logic was, well, a bunch of people spent years writing this out and they know what they're doing.
[16:37] So I have to keep this in here because when the auditor shows up,
[16:41] they're going to look for this verbiage and it has to be present inside my document for them to approve me and pass me on to the next level and say that I passed my audit.
[16:53] And my argument in reverse was if you don't really actually do this,
[16:58] you shouldn't be in there. It shouldn't be documented as a real live, actual control.
[17:04] Maybe, if you want to do this in the next year, that's your next column, right? Let's create a project plan on how we'll get there.
[17:13] But it's probably going to involve humans, because right now you have an extreme segregation-of-duties violation and zero least-privilege permission controls, because everybody's an administrator,
[17:26] right? And they almost have to be, in order to figure out how they're going to fulfill all of these sections in these templates. So the important thing to do, I think, again, comes back to now, next, near, never.
[17:40] And this is where I keep bringing a lot of clients: let's talk about what you're doing right now, and not ever have anything in the documents that alludes to you doing more than this.
[17:53] Because you've now made an auditable statement for yourself that an auditor could come in and say, oh, it says you do this. Show me the evidence for that.
[18:03] But if you don't do it, why do you have it in there, right? Put it on a roadmap and say, hey, I've identified this risk that I don't do this right now.
[18:12] Here's how I'm mitigating it in the next year.
[18:15] Put it on that risk registry and show that you're working towards actually getting to what that control said.
[18:22] Debbie Reynolds: Yeah,
[18:23] let's talk a little bit about just the lift of governance, I guess.
[18:29] And this is my perception of it.
[18:31] So first of all, governance has always been important. But governance hasn't always gotten the attention and the money that it needs. But now companies see AI, for example, as this new toy that they want to play with, and they want it really bad, right?
[18:46] They want it under the tree for Christmas.
[18:48] But now they have to go back because if they want to really be able to leverage AI and these emerging tech, they have to really do better or, you know, do something with governance.
[19:00] Right? Because, you know, a lot of people say garbage in, garbage out. It's more like garbage in, junkyard out, right? You throw more stuff in and you get more junk out.
[19:10] So tell me how AI,
[19:14] if at all, is changing people's ideas about the importance of governance.
[19:19] Karina Klever: So look, when I started in IT in 1989,
[19:25] we didn't have the Internet at all,
[19:28] right? And when the Internet first came out,
[19:32] it was so authoritative. It was very official and legit and,
[19:39] you know,
[19:40] real. And it was very important.
[19:44] And in the beginning, all of us thought everything on the Internet was true because it's just the way it came across. We all thought it was true, but it wasn't all true.
[19:55] Right? And it took us a little bit to figure out that the Internet was not all true.
[20:01] Right now, AI is the Internet magnified times infinity,
[20:07] right? Times bazillions,
[20:10] which is where we're getting this data set. We're seeing the incredible bias,
[20:14] we're seeing the incredible discrimination.
[20:18] We're seeing a lot of the characteristics that we find on the Internet
[20:24] showing up in AI, if you're talking about the raw data sets. Now,
[20:31] the EU AI Act was written right around the same time that ISO published 42001 as a guiding principle for AI.
[20:45] So consider that ISO 9001 was what was used to write QMS standards, and the QMS basics say, right,
[20:58] here's my input, I understand what it is,
[21:00] here's the process, what I'm doing with it, and here's the output. If we take a pharmaceutical, here's my recipe, here's what I throw in the cauldron at a certain temperature, here's how I rotate it, and at the end I have pressed pills that are packaged and serial numbered, right?
[21:17] So it's a QMS standard, right? Good manufacturing practices and good documentation practices fall under QMS, written by ISO 9001, globally recognized. ISO came out with 42001 actually recommending these same principles.
[21:33] When you're talking about AI, here's my original data set.
[21:37] Here's how I'm crunching these numbers. This is my intent and this is the intended outcome. Why is this important? Maybe I do want bias.
[21:46] Maybe I want to see that one drug is reacting to an African American woman versus a white woman differently.
[21:53] Maybe I want these data sets to see that balance. Right. Or how that result might be different.
[22:00] Maybe I don't want gender or race involved in my assessment and in my study.
[22:06] So, a bit of a long way of saying, sorry about that:
[22:09] ISO 42001 is now becoming the globally recognized standard for AI tools,
[22:18] which means that you've identified your source data,
[22:22] what you're doing with it and what your intended outcome is. If we can stick, as an industry, to this governance structure and make it very, very clear what our intent is with the AI, I think it can be incredibly powerful because of the computing speed.
[22:39] Right. I mean, we've had so many advances thanks to AI in fields of medicine,
[22:46] in fields of diagnostic and analysis. I think it's wonderful.
[22:50] But we have to put those parameters around it, is what I think. Now, from a governance perspective,
[22:57] you have to have an AI policy, because I can guarantee you, all your folks out there are playing with AI, whether or not you've whitelisted or blacklisted it.
[23:07] People are using your network and, hopefully, not putting your proprietary company information into AI and asking it for advice.
[23:16] Right. Or putting kids' pictures up. And I'm going to take a minute to do my plug: please don't put up videos and pictures of children online.
[23:25] You're giving them a digital identity and signature they have not opted into or chosen, and you're actually opening them up to identity theft as well. It's very easy, from a social engineering standpoint, to find out where they're at, how old they are, what school they go to and, based on your adult social media presence, probably
[23:45] where you live, the whole thing. So I beg people not to put up pictures of their kids, their neighbors' kids, their nieces and nephews,
[23:54] anybody under 18. Let them make that choice when they get to that point, right? But back to governance and AI: I implore everyone to write some sort of bumper guard and parameter for their staff, because they're trying AI and they need to know what your position is.
[24:15] And if you, as the executive leadership, are sticking your head in the sand like an ostrich and deciding that, la la la, maybe it'll go away if you ignore it, and that people know what to do and they all want to do the right thing,
[24:29] you're delusional. You really need to come up with that guidance for your staff.
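The input/process/output framing Karina attributes to ISO 42001 (name your source data, what you're doing with it, and the intended outcome, including any deliberate bias) could be captured in a record like the following sketch. The class and field names are illustrative assumptions, not text from the standard.

```python
from dataclasses import dataclass

@dataclass
class AIUseRecord:
    system: str
    source_data: str       # "here's my original data set"
    processing: str        # "here's how I'm crunching these numbers"
    intended_outcome: str  # "and this is the intended outcome"
    known_bias: str = ""   # bias may even be intentional, e.g. in drug studies

    def is_documented(self) -> bool:
        # A use of AI only counts as governed once all three legs are stated.
        return all([self.source_data, self.processing, self.intended_outcome])

rec = AIUseRecord(
    system="trial-outcome model",
    source_data="de-identified trial results, demographics retained",
    processing="regression of outcome versus dosage per cohort",
    intended_outcome="detect differing drug response across cohorts",
    known_bias="demographic attributes deliberately kept in",
)
print(rec.is_documented())  # True
```

An AI policy could require a record like this for every sanctioned tool, so the "bumper guards" Karina describes exist before staff start experimenting.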
[24:35] Debbie Reynolds: Excellent points. I'm gonna talk a little bit about the expense, I guess, of governance, of trying to get stuff right with data. And I think this is something that I've seen, and I'm sure that you've seen as well in your consulting work.
[24:48] And that is, to me, I find it very offensive when I see companies spending gobs of money on stuff and not getting a good result or good product. And we see a lot of these same companies end up in the news, and they're in trouble, right,
[25:05] for something that really shouldn't be that difficult or shouldn't be as expensive as it was. But give me your thoughts on that.
[25:14] Karina Klever: Honest to goodness,
[25:16] missing the very basics. So, super quick, easy math, right? Let's say I have 10 people in my company, and let's say, fully loaded, it's 50 bucks an hour. Not much, fully loaded:
[25:29] bennies, salary,
[25:31] rent, insurances, all of the stuff, right?
[25:35] 50 bucks an hour, fully loaded, and I have 10 people. What are these 10 people doing? They're liaising with the auditors, they're getting the documentation ready,
[25:44] they're updating documents, they're internal audit, they're doing the follow-up, the mitigating tasks, they're working with audit in some capacity. Okay, so if I have 10 people at $50 an hour, fully loaded, and I multiply that by 2,080 hours per year, which is a normal work year,
[26:06] that's over a $1 million spend just on the human who runs around like a chicken without a head gathering evidence for controls that may not even apply to you.
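The back-of-the-envelope math above checks out:

```python
# 10 people, $50/hour fully loaded, 2,080 hours in a standard work year.
headcount = 10
rate_per_hour = 50
hours_per_year = 2080  # 40 hours x 52 weeks

annual_spend = headcount * rate_per_hour * hours_per_year
print(f"${annual_spend:,}")  # $1,040,000 -- "over a $1 million spend"
```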
[26:20] And this is the craziest part,
[26:23] because when we continue to work off of those spreadsheets,
[26:26] those extremely long spreadsheets, and we have an auditing firm come in and say, well,
[26:33] we've multiplied the 12,000 rows by the configuration items in your environment and because of that, now we've got 12 million rows to go through.
[26:47] So now because of those 12 million rows, we're going to have to send in 70 people for the next, oh, three to five years because you have to use our proprietary spreadsheet and get ready to pay 10 to 15 million a year.
[27:02] But meanwhile, we're not any more secure,
[27:05] our environment isn't any safer. We're leaving these big, wide gaps in our security posture because we're so busy focused on these checkboxes, and we're spinning on these controls that have nothing to do with us, because we haven't appropriated them.
[27:23] That's the continual fail that we keep seeing in this,
[27:27] which is why so many companies are like, compliance? I'm going to take my chances.
[27:32] I'm going to pay my cyber insurance policy instead. And folks, I've got to tell you,
[27:37] oh my God, if you think your cyber insurance policy is going to cover your losses and reinstate you back to normal?
[27:45] They won't.
[27:47] I've seen so many situations recently. In one situation, it was actually the questionnaire. So the cyber insurance carrier sends you the questionnaire, and it always gets forwarded to the
[28:00] guy whose job is to fill in the questionnaire and to
[28:03] make sure to say yes on as much stuff as possible.
[28:07] So they say yes to stuff. Do you have MFA? Sure do. Yes is the answer.
[28:12] Breach occurs,
[28:13] the insurance company says,
[28:15] well, show me your MFA stuff that you said you had on your questionnaire. Well, I do. I have MFA for the sliver of my IT environment over here in this corner.
[28:25] But the breach didn't happen in that specific corner. The breach happened in a different corner. And all of a sudden the cyber insurance policy carrier says,
[28:34] yeah, not it. You lied on your questionnaire, so we're not going to help you at all.
[28:39] Good luck with that breach.
[28:40] 80% of small businesses who get breached end up closing, folks. So it's very, very dangerous.
[28:47] Debbie Reynolds: Right. And actually, I'm glad that you brought that up. That's a statistic that I've seen for many years. And I think people see these huge companies like T-Mobile or AT&T, and they get breached.
[28:58] Like, these companies have so much money, right? And they have these huge policies. But you're not that company, right?
[29:05] Karina Klever: So.
[29:07] Debbie Reynolds: So I think companies,
[29:08] some people see that, first of all, they think, oh, this only happens to big companies. Or they think, well, my cyber insurance policy will cover it, so then I don't have to do anything about it.
[29:18] And if you think about it, actually, in the US they say 99% of companies are considered small businesses. So unless you are Apple or Google or Facebook or something, you really need to heed this advice around governance.
[29:33] Karina Klever: Yeah.
[29:34] The other thing I would say is: drill, as part of your DR/BCP,
[29:40] that SEC filing, if you're publicly traded.
[29:44] So I see a lot of drills for reinstating the server under Susie's desk or whatever. Right.
[29:52] Kind of your classic kind of tabletop exercises.
[29:55] What's happening right now with the Securities and Exchange Commission's four-day requirement to do the breach notification is this: many of you know that if you drop something into legal, you can check back in about 45 days and see if they're maybe halfway through the request.
[30:13] If you drop something into finance or if you drop something into hr,
[30:17] there's a very long, tedious process of getting any sort of an answer back. So what's happening right now, what I'm seeing, is that a lot of security engineers, you know, the guys who work levels 1, 2 and 3, really want the system to continue to run, and they care about their jobs.
[30:35] They love being the firefighters and the guys who fix everything.
[30:39] These are really, really smart people who want that business continuity to be in place.
[30:44] And so what I'm seeing a lot of these SOC engineers and NOC engineers do is pick up the phone and say, hello, SEC.
[30:53] Because they're protected by the whistleblower laws and they're the ones that know about that incident before anyone else.
[31:02] And they know.
[31:03] So what I'm seeing a lot of right now are companies who are unable, from a workflow perspective, to quickly turn around that mandated reporting to the Securities and Exchange Commission within the four-business-day requirement, right?
[31:23] So because legal takes too long, HR takes too long, the vendor might take too long, finance takes too long, that process stalls between
[31:37] when the occurrence is actually considered material, a breach, an event significant enough. And there are people who will argue and write long articles about whether it's an event, an incident or an occurrence.
[31:52] Whatever you call it,
[31:54] you lost a whole bunch of data. Now this is a cybersecurity incident where you've been breached.
[31:59] So there's this event, and the other end of that workflow is: you've reported to the Securities and Exchange Commission, and it's been four days.
[32:09] Companies are finding it absolutely impossible to fulfill that.
[32:13] And because they're finding it impossible to fulfill that, the security engineers who work in the SOC and in the NOC are the ones picking up the phone and calling the Securities and Exchange Commission and saying, hey,
[32:27] we've had an incident, we've had a loss, we've had a breach, we have a ransomware attack, we're down.
[32:34] Because they know full well that in their company,
[32:38] their workflow, that four-day workflow, will not be fulfilled because of how many departments that workflow has to hop through.
[32:46] But they want to do the right thing. They know that it's the law.
[32:50] So they call the Securities and Exchange Commission and notify them, and they're protected by whistleblower laws.
[32:56] Shame on you, the company when this happens because you haven't provided instructions for the folks working the help desk areas,
[33:07] any of the support areas on how to handle themselves,
[33:12] how to escalate. And the other shame on you is you haven't drilled this workflow.
[33:19] You have to make sure that you've exercised this and you can run it with your eyes closed, in the middle of the night,
[33:26] half-awake from a dream, and you know exactly what that process looks like from an incident commander standpoint. You use a secret color, or a secret make and model of a vehicle, to be your internal company code,
[33:44] So that when you get an email that says,
[33:47] I don't know, Ford F-150 or Winnebago Travato,
[33:51] you've got to respond. Because the end of this workflow has got to be published within four days, you have to drill that internally.
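The four-business-day clock described above can be computed mechanically. A minimal Python sketch, assuming weekends are the only non-business days (market holidays are omitted for simplicity):

```python
from datetime import date, timedelta

def sec_disclosure_deadline(determined: date) -> date:
    """Return the date four business days after the day a cybersecurity
    incident is determined to be material. Weekends are skipped; market
    holidays are left out here as a simplifying assumption."""
    deadline = determined
    remaining = 4
    while remaining > 0:
        deadline += timedelta(days=1)
        if deadline.weekday() < 5:  # Monday (0) through Friday (4)
            remaining -= 1
    return deadline

# An incident deemed material on Thursday 2024-06-06 must be
# disclosed by Wednesday 2024-06-12.
print(sec_disclosure_deadline(date(2024, 6, 6)))
```

Drilling the workflow means every department in the chain knows this deadline is fixed the moment materiality is determined, not the moment legal finishes its review.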
[33:59] Debbie Reynolds: Yeah. Well, I want to know, since you're a data person, how does privacy play into your work?
[34:06] Karina Klever: Gosh, I love talking about this because we're really failing in America from a consumer privacy perspective in a very big way.
[34:15] I think one of the reasons I walk into companies and say where's your data?
[34:20] so that we can actually classify it and associate the data points that relate to the consumer who has CCPA protections? The consumer in Texas, who has protections for consumer privacy?
[34:36] The GDPR consumer. Right. So when we have data about consumers, we have to flag that data, whether it's PHI, ePHI,
[34:49] whether it's credit card data,
[34:52] whether it's county of residence, city of residence, state of residence. Because you have to be prepared at any moment to receive a consumer privacy fulfillment request, known in Europe as DSARs.
[35:05] So let me walk through that for a minute. DSARs are data subject access requests, because that's how the GDPR describes them.
[35:15] In US laws, we call it the consumer.
[35:19] So really what we talk about are consumer privacy requests.
[35:23] Right? So those consumer privacy requests, they all have different timelines. And of course, back in, I think, last October, we heard about APRA, the American Privacy Rights Act, right, which has unfortunately been at a standstill for a while now.
[35:40] We're all hoping that it gets moved forward and moved along, but you have to be able to accept a request from a consumer based on where that consumer lives.
[35:53] Now mind you, one state has a 45 day fulfillment requirement. California has a 15 day marketing opt out. One state has a 60 day fulfillment requirement. One state has a 30 day fulfillment requirement.
[36:07] So you have to know, based on your consumer profiles, how much time you have to fulfill that consumer privacy request, and make sure that you enact across all of your data storage the request that's been made by the consumer.
[36:25] Say you want to keep that transactional row, but you've received a delete-me request, and you need to keep the data of the transaction. So instead of Karina Klever, it needs to say user 123. Make sure you have processes in place that do that anonymization of that initial identifier for that consumer,
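That anonymization step can be sketched in a few lines. A minimal Python sketch, where the field names, the salt, and the "user-…" pseudonym format are all illustrative assumptions rather than a standard schema:

```python
import hashlib

def anonymize_row(row: dict, salt: str = "rotate-me") -> dict:
    """Replace the consumer identifier on a transactional row with a
    stable pseudonym so the transaction record can survive a delete-me
    request. The salt and field names here are illustrative."""
    out = dict(row)
    # Hash the identifier so the same consumer maps to the same
    # pseudonym across rows, without keeping the name itself.
    token = hashlib.sha256((salt + row["customer_name"]).encode()).hexdigest()[:8]
    out["customer_name"] = f"user-{token}"
    return out

row = {"customer_name": "Karina Klever", "amount": 42.50, "sku": "A-100"}
print(anonymize_row(row))
```

The transaction amount and SKU survive untouched; only the identifier is replaced, which is the point of keeping the row while honoring the deletion.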
[36:46] and that you have a mechanism to track. Remember, a lot of these tracking tools only create a template online, and you can do this yourself. They create the template, they grab the consumer info, and they just dump the data into your existing ticketing system.
[37:02] So just take your existing ticketing system, create a splash page,
[37:07] and feed it into your existing ticketing system. The difficulty will be whether you know which group is responsible for which type of data for that fulfillment exercise to actually execute that request.
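The splash-page-to-ticketing flow described above can be sketched simply. A minimal Python sketch, where the fulfillment windows, queue name, and field names are placeholder assumptions to be verified against current law, not statutory values:

```python
from datetime import date, timedelta

# Fulfillment windows per jurisdiction, in calendar days. These numbers
# are illustrative placeholders; the actual windows vary by statute and
# must be checked against current law.
FULFILLMENT_DAYS = {"CA": 45, "TX": 45, "EU": 30}

def intake_request(name: str, jurisdiction: str, request_type: str,
                   received: date) -> dict:
    """Build a ticket for an existing ticketing system from a splash-page
    submission, stamping the due date by the consumer's jurisdiction."""
    days = FULFILLMENT_DAYS.get(jurisdiction, 30)  # conservative default
    return {
        "summary": f"{request_type} request from {name} ({jurisdiction})",
        "due": received + timedelta(days=days),
        "queue": "privacy-fulfillment",
    }

ticket = intake_request("J. Doe", "CA", "delete", date(2024, 1, 10))
print(ticket["due"])  # 45 days after receipt
```

The hard part, as noted, is not the intake form; it is knowing which group owns which data store once the ticket lands in the queue.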
[37:21] Debbie Reynolds: I think it's so complex.
[37:24] It's so funny because I've talked to some companies,
[37:27] like, for example, where they have consumers all over the world, and a lot of them are going towards, let's just go with the most restrictive standards, because then we'll cover most everything and then we can nip around the edges.
[37:43] But what are you seeing?
[37:45] Karina Klever: There are over 120 countries right now that have consumer privacy laws.
[37:50] I think the most important thing,
[37:52] and in America I think we're at seven,
[37:56] with 12 enacted
[37:59] but not actually enforceable until later in the year, I guess, is this: when you store your data,
[38:06] make sure that you've categorized it the right way.
[38:09] It's really simple, right? I mean, and if you can be disciplined enough to force that into your own data set.
[38:18] But not only that,
[38:20] data purposefulness, data minimization.
[38:23] My dad passed 11 years ago. Five years before he passed, he went to a medical center here up the street. He only went there once. Hated the doctor, comes back all angry.
[38:35] He was starting to feel sick, but he didn't know what was going on.
[38:40] Never went there again.
[38:41] Here we are 16 years later,
[38:44] and I get a letter from that medical facility addressed to my dad saying, we lost your data. Why do you still have that data?
[38:52] Why are you not getting rid of it? So if that data is not actively associated to the purpose of your company,
[39:01] data purposefulness,
[39:03] and you don't have data classification in place that specifies retention guidelines for every classification type, why are you encrypting publicly available information?
[39:17] Do you want to know how many millions are literally thrown down the dumpster when people don't classify their data and end up encrypting publicly available information?
[39:27] Why are you doing that? Right.
[39:29] Debbie Reynolds: Yeah. Wow. Oh, my goodness. I know. I had a situation where I had not been in a CVS for probably 15 years. And I walked in and they knew who I was.
[39:39] And I was like, wait a minute. Now,
[39:42] wait a minute. This is a bit troubling.
[39:46] Karina Klever: Very troubling. Right. And so that business purposefulness and minimization is important because once you do that classification the right way,
[39:54] now you can say: public data, get rid of it in two weeks. You know, the stuff that's my most important golden egg, put parameters on that. Forever is not a parameter.
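Those retention parameters can be expressed in a few lines. A minimal Python sketch, where the tier names and day counts are illustrative assumptions; the point is that even the "golden egg" tier gets a finite number, because forever is not a parameter:

```python
from datetime import date, timedelta

# Retention windows per classification tier, in days. Tier names and
# counts are illustrative; each organization sets its own, but every
# tier gets a finite window.
RETENTION_DAYS = {"public": 14, "internal": 365, "restricted": 365 * 7}

def is_expired(classification: str, created: date, today: date) -> bool:
    """True once a record has outlived its tier's retention window."""
    return (today - created).days > RETENTION_DAYS[classification]

print(is_expired("public", date(2025, 1, 1), date(2025, 2, 1)))      # expired
print(is_expired("restricted", date(2020, 1, 1), date(2025, 2, 1)))  # kept
```

A purge job that runs this check is also what prevents the "we lost your data from 16 years ago" letter.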
[40:05] Data hoarding is not something we want to encourage, because look, the hacker, it's going to take them five minutes to get into your system.
[40:13] Whether it's a big data set or a little data set, which one do you think they're going to want to pick in the same five minutes? The bigger the data set, the juicier it is, right?
[40:23] Debbie Reynolds: Yeah, that's true. Well, if it were the world according to you, Karina, and we did everything you said, what would be your wish for data or privacy anywhere in the world, whether that be regulation,
[40:35] human behavior or technology?
[40:38] Karina Klever: I wish that companies would be more responsible with the data that they have about consumers,
[40:48] that they would protect it better,
[40:50] that they would tag it the right way.
[40:53] And I wish that data brokers would have less of a role in our everyday life because everything we touch and everything we do and everything we stream and everywhere we drive and every credit card transaction we have, and every time we click on a review,
[41:12] there are algorithms behind the scenes that will try to take all of our actions and upsell to us.
[41:20] And sometimes you make a consumer request of a company to get rid of your data and they do it at the moment, but then tomorrow's batch comes in and backs your data right back up into the database.
[41:33] Right. So I think that we all need to start just being a little bit more careful with consumer data and how we handle it at companies.
[41:43] We also have to, as companies, give people the option to opt out of our AI gathering, and not make it sly and sneaky. For the life of me, I cannot opt out. And this is on behalf of my financial team:
[41:58] Intuit is not letting them opt out of AI, and it's on every single screen.
[42:04] So we should have the capacity as consumers to decide whether or not our data is used in these algorithms and these calculations and data that's broadly available to a very mass audience.
[42:19] And I wish that we had true visibility into our vendors when we conduct our daily operations and we have vendors that actually we hire to fulfill some of our operations.
[42:33] It would be good to know with transparency, instead of CYA contractual language,
[42:38] exactly what they do with our data, and have those vendors be forthright with it.
[42:46] Debbie Reynolds: Well, I share your wish and your dream. Absolutely.
[42:50] I think what's happening, or what we're going to come up against now, is a situation where companies will start to lose money because people either give them bad data or won't give them any data.
[43:04] Right. And they want the best data, right, to get better insights. And so in order to get that, they may have to be more transparent and more responsible with the data, or pay more, or give you something in exchange that you feel is of value.
[43:25] Karina Klever: The thing that a lot of people don't know, and I personally know this, is, is there are people who sit there and actually pollute AI.
[43:34] I don't know that. I mean, that this isn't usually widely talked about,
[43:38] but there are bots out there who will literally, on behalf of a company,
[43:43] go out and insert millions and millions and millions of comments and rows,
[43:49] basically encouraging any searches with keyword matches to come to them.
[43:57] Right. So this pollution of AI calculators is actively happening right now to a great, great extent.
[44:06] And we don't know what somebody else can afford from a bot perspective to have us make our decisions and calculations based on their pollution of the AI tool we're using.
[44:22] Debbie Reynolds: Definitely concern for sure.
[44:24] Well, thank you so much,
[44:25] Karina. This is fun.
[44:27] Glad we were able to get a chance to talk and I will definitely follow your work and people. Please,
[44:32] Karina, if you can tell us how people can get in contact with you and your company.
[44:37] Karina Klever: Absolutely. So on LinkedIn we have Klever Compliance available.
[44:42] Also I have a very active account on LinkedIn. That's my only social media account is LinkedIn.
[44:49] We don't do anything else by design.
[44:52] Also, we have office numbers listed at Klevercompliance.com, Klever with a K and Compliance with a C. You can go ahead and contact us anytime, or info@Klevercompliance.com. So always excited to speak with anyone and help them tailor their GRC program to their actual operations.
[45:17] Debbie Reynolds: Definitely reach out to Karina and her company. She's the real deal. They're fantastic. So, thank you. Thank you so much for being on the show. We'll talk soon.
[45:26] Karina Klever: All right, take care. Thanks. Bye bye.