E257 - Gina King - Cyber vCISO and Communications Consultant, King and Company Capital
[00:00] Debbie Reynolds: The personal views expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.
[00:13] Hello, my name is Debbie Reynolds. They call me the Data Diva. This is The Data Diva Talks Privacy podcast, where we discuss data privacy issues with industry leaders around the world, with information that businesses need to know.
[00:27] Now,
[00:28] I have a very special guest on the show today,
[00:31] Gina King.
[00:33] She is the Cyber vCISO and Communications Consultant at King and Company Capital.
[00:41] Welcome.
[00:42] Gina King: I'm so happy to be here. I don't even know how to thank you just for all of your work. I'm excited to talk with you.
[00:49] Debbie Reynolds: Well, we've been connected on LinkedIn for a long time.
[00:52] You and I chatted on LinkedIn.
[00:55] You and I actually had an actual meeting before, just to touch base,
[01:01] and I'm super excited. I think your excitement,
[01:05] even in the way that you write, comes through when people talk to you. So you can tell you're very passionate about what you do.
[01:12] You definitely know your stuff. That's just bonkers. But I want you to tell me, what is your journey? What is your passion, the thing that led you along your path in technology and how you became such a just amazing operator at what you do?
[01:29] Gina King: Oh, there's so many things I could say, and this is only an hour. What I'll say is, I started off as a chemist,
[01:37] and how I got into technology is, while I was in school for chemistry, I happened to be working as a bank teller for a period of time. And because I was the youngest one there, when they would have computer problems, they would just send the kid.
[01:47] You know, I was the kid. They would send the kid to talk to the IT group.
[01:51] And I was hooked off of the immediate gratification of getting something to work. And it was something about that. So even though I finished my degree in chemistry, I wanted to get into technology because I loved figuring things out and helping the technical team get it to work.
[02:07] And as a result of that, I was able to communicate between the business and what was going on with the technical team. And I loved being in that space. And so over time, when the technical teams realized that I could do that, they would push me, go talk to leadership,
[02:21] go figure this thing out. Like, go do this and go do that. And so I started learning about technology to be able to communicate. So to me, it's like a new language that I'm using to bridge the gap between the business and the technology teams.
[02:34] But I'm really a people person, and I love seeing the light bulbs go off. And so that's how I got into it. And as I moved forward and started running into a couple of cybersecurity teams showing up at the last minute on one of my projects, I was like,
[02:48] oh, this is crazy.
[02:49] I need to start off with them to understand what they're looking for. When they realized I was paying attention, they thought, we have a project manager who's paying attention to what we want from a cybersecurity standpoint.
[02:59] Then they started latching on to me, like, well, go find out from leadership what this means and how we can get this done. Get this into your project. And when they realized I could get their stuff funded,
[03:09] I was in. Because they're like, here. This is the girl coming in from the business side helping us from a cybersecurity standpoint to be able to communicate how important these things are.
[03:20] And so to me, it's just helping my friends get things done. I can't believe I get paid to do it, to be honest with you.
[03:26] Debbie Reynolds: That's a tremendous skill. I'm going to go deep on my technical, geeky side right now, because I am a technologist, I'm a geek.
[03:37] I work a lot like you do, kind of bridging those gaps between those teams.
[03:43] And it's really funny because a lot of times when you have someone who's so deeply entrenched in technology,
[03:51] it's hard for them to communicate in ways that the business really cares about. And so tell me a little bit about this. You had told me some stories about this.
[04:01] You and I laughed. I mean, I can just hear it right now.
[04:05] Gina King: Yeah. So as an example to me, every time I'm working with somebody, it's like I'm hanging out with friends. I'm really trying to understand the emotion behind why they do what they do, especially from the business side, because often they have a really good reason why they're doing the wrong thing.
[04:20] And so instead of coming in and just saying, hey, you're an idiot. Don't do that. I really try to understand what's the value?
[04:26] Why are you doing this? And then when I talk to the technical teams, I'm able to connect the pieces and say, okay, we can do this in a secured way and still get them what they're looking for.
[04:35] So to the business, they don't care how we get it done. They just want it done.
[04:40] And so when I'm able to come back and tell them, hey, here's a way we can get this done, I leave out the technical piece. I just tell them, the increase in cost is gonna be X, or the increase in value is gonna be Y,
[04:50] or this will help us gain new markets, or this will help us navigate this particular customer base. Done. They don't care about the technology.
[05:00] They just wanna know, how can we get the thing done? And so I think often as technologists, we think we have to prove ourselves to the business. That's not what they care about.
[05:08] What they want to know is, can we gain a new market?
[05:11] Are we able to compete?
[05:13] So being able to make that transition from the technical side to the business side, I just happen to be really good at it. And then even going the other way, the business is saying that they need to accomplish A, B, and C.
[05:25] So I'll learn about the technical framework, what are our boundaries, what type of budget do we have, and come back to the technical team and say, okay, this is the problem, and help me develop a solution that will work in this particular scenario.
[05:38] And the technical teams can do it too. Because one of the things about me is I really respect people's talents,
[05:43] and so I let them tell me. So when people ask me how I like managing projects, I go, okay, so my job is to listen to people tell me what they want to do, when they want to do it, how much it's going to cost, and my job is to tell them when they're going to get it done.
[05:59] I want that job all day. That's what I have to do.
[06:03] You give me the list and I tell you if you met your requirement or not, it's all up to you. And so I open it up to allow people to be the experts that they are,
[06:12] figure out ways where they feel like they can be honest about what it's going to take and how they feel about it. And then in my mind, I reconcile. It's almost like solving a Rubik's Cube of how can I take what the business wants, the technical team, and get them to come together and align.
[06:26] And so that's what I do every day.
[06:28] Debbie Reynolds: That's so funny. It makes me smile. I think it's very similar to what I do or what I work on in privacy. Because I feel like what people don't understand about these roles is that really, the key of them is really communication.
[06:44] And so being able to communicate with all different groups and all different levels. Probably the most valuable thing I ever learned in college, it cost me a lot of money to learn this one thing,
[06:53] but the most valuable thing I learned in college came from one of my teachers. I was a communication major at one point, and they told me that if someone doesn't understand you, it is your fault.
[07:09] And people,
[07:10] people totally think the opposite. They're like, oh, I told this person, and oh, they're so this or they're so bad. And it's like, no, no, no, no. If the person doesn't understand you,
[07:21] you have to change the way that you communicate. And so understanding your audience, understanding what they want and what they need, I think is really important.
[07:31] I remember one time I was working with a very difficult person on a project. And a lot of times those people, I target them, right? And so I may have, like, side meetings with them or whatever, try to find out, like, what's their deal, what do they want?
[07:45] Or whatever.
[07:46] So one time I was talking to this one person,
[07:49] she was a new mother and she had a very stressful job. I basically said, do you want to go home at 5 o'clock?
[07:57] And she's like, yeah. I said, okay, well,
[08:00] we do it this way.
[08:01] Yes, go home at 5 o'clock.
[08:05] Okay, I'm all ears, but I want your thoughts.
[08:08] Gina King: Listen, that's excellent. One time I was working with an architect. When I start a project, I could either be the CISO, I can be the project manager, I could be the janitor, I don't care.
[08:18] Like, you call me what you want to call me.
[08:20] And one of the questions I ask is, what are the two, three, five key things I have to do or I better not do for this project to be something you can enjoy?
[08:29] And this architect, and I never forget the look on his face, we're really, really good friends now. He said,
[08:35] it doesn't matter what I say. Ultimately, you're going to run your meetings on Thursday.
[08:39] My daughter's dance recitals are on Thursdays. And for the last four years, no project manager has been able to get these meetings off Thursday. And I said, well, I probably won't be able to either, but just tell me a couple other things, since we already know
[08:51] I'm not going to get that. Then, when I tell you I got that thing off Thursday, and when I did it, he looked at me, he was just like,
[08:57] how did you...
[08:58] The reason I do that, and I try to do that up front, is because if I need to call somebody at 2 o'clock in the morning, I want them to say yes.
[09:04] So I go in giving and filling up that bank however I can. So in your scenario, like you were telling her, we both have goals, and I can help you reach your goal if you help me reach mine.
[09:14] And I think that that's the key of being in this industry, because they have goals, too. We have to recognize and respect those goals.
[09:21] And so often we think, since we have the answer,
[09:25] that people should follow us, and that's not true, they can revolt.
[09:29] Debbie Reynolds: Absolutely. The best case scenario is you want people to champion what you do, or the worst case is you don't want them to impede what you do.
[09:38] Gina King: Yes.
[09:38] Debbie Reynolds: Yes. So they don't have to love it, they don't have to like it. You just don't want them to interfere. Exactly.
[09:44] Gina King: And so the people who are difficult, those are the people that I really want to work with. Because here's the thing.
[09:50] The blank faces, the people who couldn't care less if it's successful or not, I can leave them. But the ones that are passionate, who say I'm not going to be able to do it because they remember the three other times it didn't work,
[10:00] I want to talk to you. I want to figure out, how can I be your... because I don't want to fall down the same flight of stairs. If you got all the intel, like, let me know,
[10:07] right?
[10:10] Do I got to send you some stickers, some Twix? Like, do I have to send you, you know, a DoorDash lunch card? What do I have to do to get on your calendar?
[10:16] Because you got the answer. You got the key.
[10:19] And so, so often we take those emotions from people and we project our insecurities onto them. When, for those people, that's their job. That's their legacy. And we're coming in saying, hey, follow me.
[10:32] And they're not sure if they can trust us or not. So I want to come in. I want to do a little bit of what you want me to do to show you I can do it and that I care.
[10:38] I'm listening. I understand your business.
[10:40] So later on I can say, in a scenario that I've been in before: even though I did what you wanted me to do, and I knew it wasn't going to work,
[10:47] this other way does work. Can I talk to you about that? And so now I've built that rapport, where they don't think I'm just coming in kicking people's babies in the face.
[10:57] I'm sure you've been on some of those projects where those consultants come in and kick somebody's baby right in the face, and like...
[11:02] Debbie Reynolds: Oh, yeah, right, right, yeah, exactly. The Al Haig thing. I'm in charge now. I'm like, that doesn't work. That doesn't work. There are many.
[11:11] Gina King: Those people are sitting there like you about to fall flat on your face.
[11:16] Debbie Reynolds: Exactly.
[11:18] What are your thoughts about cybersecurity and privacy?
[11:23] So my view is that because these things run parallel to almost any part of the business,
[11:32] they have to work together. They are not the same,
[11:34] but they have to have a symbiotic relationship. But how does privacy seep into your work?
[11:41] Gina King: Oh, goodness. So in my mind,
[11:44] cybersecurity, that's like a branding,
[11:47] what do you call it? That's a ruse. We're talking about business risk. Now, a part of that business risk may have to do with threat actors accessing data over the Internet, but by and large, we're talking about business risk in general.
[11:59] And I feel like sometimes when we say cybersecurity, some business leaders think, oh, that's some newfangled thing. That doesn't matter because 15 years ago it wasn't a topic because we kind of renamed it.
[12:09] And I'm like, no, we're talking about business risk and data privacy.
[12:14] Just the data in general. If it's not properly defined,
[12:18] whether the privacy standards are in control or not is of no consequence, because you don't even know what you're trying to protect. You have no idea what it is. So it's running rampant out of the building.
[12:26] So to me, the data and how the data is controlled is way more important than the overall business risk, because that's how you run your business: through information, through language and nomenclature.
[12:37] If you don't have that down, you don't have a business.
[12:39] You're working with gases and things that are intangible. And so often I think it comes down to these identities.
[12:48] And on the one hand, identity access management is a real thing. But I think there's another part. The identities of who we think we are,
[12:56] we get played by threat actors so often, because people are in their titles and they think that they're too good for this, or they know better than to do that, or that's not in their wheelhouse, or that's not in their job description.
[13:08] So we get exploited, through our identities, about the information that runs the business. And once that's exploited, you don't even have a business. And so in a lot of my posts, that's what I'm trying to get people to understand about artificial intelligence: once you sprinkle that stuff on your business,
[13:23] you can't unbraid that. Once somebody misunderstands you, it's done.
[13:28] Your secret sauce about how you got those customers: as soon as something comes through where they think you've changed your business, or you didn't communicate to them that you changed a policy or a standard,
[13:37] there are so many options that they can go to. People start doing true-ups. When I tell you, and I'm gonna try not to make predictions on your show, so many businesses are gonna have to close their doors, because they won't have a product, they won't have a service, because they've lost that trust with the customer, because they didn't protect the data.
[13:56] And it's.
[13:57] It's unfortunate.
[13:59] Debbie Reynolds: I think it's true.
[14:01] Companies already have a lot of complexity within their organizations. They already don't really understand what all they have.
[14:08] And then when you bring in something like artificial intelligence,
[14:12] you know, I say people pour it on stuff, like goo, right on everything, because they want to play around and they want to be like this other person.
[14:20] It's like, I mean, you still have to do the fundamentals. And I feel like companies have struggled for a long time to understand what that is. And a lot of times, even around privacy,
[14:31] when I'm talking with companies, I tell them,
[14:34] you don't have privacy problems, you have data problems.
[14:38] So your data problems raise all these other problems in everything else that you're trying to do. So really starting with the data and what you're doing with it is really key.
[14:48] Gina King: Yes. And for every area of your business,
[14:50] because it could be that each department is using that information a little different.
[14:54] And so when you come in with artificial intelligence that's not artificial or intelligent, it's like putting too much mustard on something. That thing is done. You can't rinse mustard off.
[15:04] It's mustard in the Kool-Aid. It's mustard on your peanut butter and jelly sandwich. Like that.
[15:10] Debbie Reynolds: Exactly. That's so true. I love the things that you write on LinkedIn, because you definitely are doing a really good job of educating. But people feel like you either know technology, or you know law, or you know business.
[15:24] In the future, we'll all have to know all of that, or a bit of all of that, right? Understand how those things interplay, because the enterprise is getting just a lot more complex, and understanding why you're doing things is really important.
[15:38] A lot of companies have set themselves up almost like Santa's workshop. Everyone has a very specific, narrow thing that they do, and they don't understand what the person to their right or their left is doing;
[15:50] they only know their part.
[15:51] And organizations don't work like that anymore.
[15:54] And so not understanding your part, and how it impacts something else, is a disaster. But I want your thoughts.
[16:03] Gina King: That's what I mean, about identities. People are so focused on that one box that they check that the space between their box and somebody else's box is unmanned.
[16:12] And so that allows threat actors to come in and pretend to be whoever they want because you don't even know what the person next door is doing.
[16:19] To me, it's a little bit surprising, but I think you're absolutely right about people having to do that. You can't just press one key on the keyboard. You gotta be able to play the whole thing.
[16:26] And no matter where you are, if you don't know how to do it, figure out the framework so you at least know the boundaries that should be in it. And you can at least have a discussion about what you do or don't know.
[16:36] And so often people have not done that. And it's unfortunate. So one thing that I think people underestimate about me is that I've worked in a lot of industries, and so over time I've become industry agnostic, tool agnostic, system agnostic.
[16:53] Because I know so many situations where something that should not have worked, worked. And I know so many situations where something that everybody thought it was going to work, did not work.
[17:02] And so I know a lot of the things that don't work and why, and they transcend industries. And it has a lot to do with what you're talking about,
[17:11] people not understanding not only their role and the boundaries of their role, but how it can be exploited as it moves from them to just that very next person or their very next department.
[17:21] And it is difficult to watch. So think about a lot of organizations when it comes time to true up.
[17:28] In a scenario where a company has two or three different products, and they're getting a really good price because of AI, and they're advertising that price, when they true up, they realize they haven't been keeping track of the industry's prices,
[17:40] and then they realize, oh my goodness, we're upside down in this thing, because they were so focused on their little piece, nobody looked around to see everybody's changing too.
[17:51] And you got to keep the pulse on that. So if you're not doing that, if you're not keeping the pulse on what everybody else is doing and you're not communicating the changes you're going to make, when the curtain goes up, nobody's going to be dressed,
[18:03] everybody's going to be naked,
[18:05] everybody's going to be standing there with no clothes on, like, oh my, like,
[18:09] what are we doing?
[18:10] Debbie Reynolds: This is going to be crazy. Yeah. Well, let's dig deeper into AI in general.
[18:15] What are you seeing about how companies are trying to approach AI, how they're trying to adopt it,
[18:22] and what are they doing that they should be doing, or vice versa?
[18:27] Gina King: So the thing is that to me,
[18:30] AI is not a new thing. I've been dealing with these newfangled, this-is-going-to-come-in-and-revolutionize-my-business things my whole career. I feel like I've been running the same 10, 11 projects my whole career.
[18:40] And one of the things that companies fail to do every single time is to look internally first, and to clean and sort what you already have, and to know, this is what I have.
[18:50] If I know what I have and I can define it, then when something comes in, I know where it's going to work and where it's not going to work, versus saying, hey, here's this great thing and it's going to work,
[18:58] and then figuring out it's not going to work with the customer or with a client, because that's the worst time to figure it out,
[19:04] when you have somebody who's purchased a product or purchased a service. And so I find that with companies, even with generative AI and some of the internal models and things:
[19:13] Let's look at your risk register.
[19:15] You gotta clean some of those things up because as a result of those things, when you try something else, those risks are gonna be realized.
[19:22] Let's go look at your change control process because the issues you have there, all of those things are gonna be amplified by the fact you're bringing in this language and nomenclature that is not your own.
[19:32] And so it's going to force you to finally do those things you should have already done. But my God, the cost is going to be bananas, because your partners' businesses and suppliers are going to be impacted, your clients are going to be impacted, your employees are going to be impacted.
[19:45] And by the time you figure it out, this thread of foolishness going through your organization is going to be difficult to understand. Especially when you bring in external consultants that sometimes don't really understand your business.
[19:56] They just know, oh, well, if we delineate this data like this, you should see this improvement over here.
[20:04] But compared to the industry, it may actually be a deficit. And I don't think a lot of people are looking at that real-time change.
[20:11] Because when the big guys come in, when the Nikes come in, when the Microsofts come in, they're gonna do it the way they want to do it. And now you got change on top of change.
[20:17] Let's see you get that spaghetti in order.
[20:19] Gonna be crazy.
[20:21] I'm gonna just be there with my pillowcase, like, put the money in the bag.
[20:24] Put the money, put the money in the bag.
[20:26] Debbie Reynolds: Exactly.
[20:28] So what you just touched on I think is a very important thing that a lot of people don't understand and that is if you're a third party to anybody,
[20:37] a lot of what you do will be pushed down from them.
[20:42] Right.
[20:43] So you have to be nimble in a way that you can really understand that. And so if you don't understand your data, what I call your data story, it's going to be a hot mess. But I want your thoughts.
[20:52] Gina King: In addition to that,
[20:55] often when those things come through, a question that nobody's really asking is, do I have the right to use AI?
[21:04] If I do this, am I breaking a contract? Am I in breach of contract?
[21:08] Did I tell people that James over here was going to do all of the data privacy things, but instead of James, it's AI? Did I communicate that? Do I have the right to do that?
[21:17] Am I now in some type of legal risk as a result of doing that? And multiply that by how much money, by vendors and suppliers, or if it's the government, the healthcare system, or the military. Like, do you have the right to use artificial intelligence in your organization?
[21:31] And there are some organizations who have no idea that employees are using non-enterprise-grade AI tools. I have some lawsuit templates if anybody's interested, and I can help you key up your lawsuit so that you can send them out, because it's going to be crazy.
[21:48] Debbie Reynolds: Yeah.
[21:48] There was actually a thing in the news, there's a lawsuit going on now with, I think, Photobucket. So Photobucket
[21:55] had collected images of people who put their pictures online, and now they decided that they wanted to use them for AI training. That was not the purpose for which they were collected, and it wasn't really communicated in the right way.
[22:09] And so they were trying to force people to consent to this new thing, and now they're in this huge lawsuit. And then not only the lawsuit, you lose a lot of customer trust.
[22:19] Right? People can make choices, people can choose different things. So I think that's like a cautionary tale. You really shouldn't do that anyway. So if you collected data for one purpose,
[22:29] you do need to really align with that. And I hadn't even thought about it from a contract perspective. If you had actually signed a contract with someone, you know, are you doing something that maybe is out of line or out of bounds from what you agreed to do in the first place?
[22:45] Gina King: Absolutely. Because if you have somebody, like a lot of contracts, you have to name your data owner, name your data steward. And if that now includes artificial intelligence, you probably need to communicate your intentions on using it.
[22:58] Where is it encrypted? Is it encrypted in transit? Where is it going to land? That other organization, or those other organizations, or those people, or those customers, they need to understand, like, your full architecture of how you plan to use it.
[23:09] And as a result, where is it going to go? Why are you using it? How is it supposed to improve my experience?
[23:14] Nobody's sending out anything. And then everybody goes, voila. When I tell you the madness that we're about to experience... So this is the time, I'm glad you mentioned that, for people to beef up their networks of people that they really know, faces you've really seen.
[23:30] Because what's going to start happening is nobody's going to trust anything, except if you can make a phone call and talk to somebody.
[23:36] It's going to happen like, no, I need to call Deb because this don't look right. I'm not signing that. Let me call her, because I don't know if she sent this out.
[23:43] It could be a bot. So tighten up your network, because you don't even know who's who, the way they're doing it right now.
[23:50] Debbie Reynolds: That's true.
[23:51] Let's talk a little bit about threats. I'd love your thoughts on insider threats as well. I'm sure you have a nice spicy thing to say about that.
[23:58] But I feel like a lot of times, when people think about cyber threats,
[24:03] they're thinking about Tom Cruise hanging from the ceiling, you know, the Mission: Impossible thing. And that's really not the biggest threat that companies have. Right. And so they sort of ignore things that aren't malicious.
[24:18] To me, a threat is like, Susan down the hall doesn't know how to use this tool.
[24:22] Gina King: Inadvertent. Yes. That's the number one.
[24:24] That's the number one. You change something. And she's like, no, this is what I always do. But she's wrong,
[24:30] right? Absolutely, absolutely. Especially the churn right now, with them trying to get AI in, and what they're doing with these employees, trying to bring on people that are cheaper or whatever.
[24:40] There's so much knowledge being leaked out of the organization that the threats are going to go up internally and externally, as a result of people just not knowing: the policy wasn't clear, the procedure didn't exist.
[24:52] It was a whole bunch of folklore and fireside chat talk of how things go. And so you add AI and it's going to be a run on the bank. You won't even know what's missing because you didn't even know to look for it.
[25:04] Years could go by.
[25:06] Debbie Reynolds: Yeah, yeah, that's true. That's true.
[25:09] Gina King: Yeah.
[25:09] Debbie Reynolds: I work with a company now on a lot of data retention projects or whatever, and those are very long and painful, because you're basically kicking down every door and opening every box and trying to figure out what people have and what they're retaining.
[25:25] And so during this process,
[25:28] what you find out is, as I like to say, I tell companies, I want to make sure that your walk matches your talk. Okay? So a lot of times you find out, okay, the stuff that you put on paper, that's not what's happening in your organization.
[25:44] It's like, why are you doing this? Or you find out that, you know, Susan down the hall was doing her own little process that nobody knew about.
[25:52] And this is, like, rampant within organizations. But tell me a little bit about that.
[25:56] Gina King: So I do a lot of assessments for different certifications, whether it's CMMC or CMMI or you name it. And there have been situations where the documentation has been impeccable, like, impressive.
[26:08] Like, can I get somebody's autograph? And then you walk around the corner and you see a bin full of backup tapes, and you're like,
[26:15] so where do these go?
[26:19] And who... because the tapes are just sitting here. I could just roll these out to my car right now.
[26:24] I've been in those situations or where the server was at somebody's house,
[26:29] and I'm like, what we doing?
[26:33] Debbie Reynolds: What are we doing?
[26:36] Gina King: So many stories. And so you're absolutely right. Your walk has to match the talk, because when it comes down to it,
[26:44] your clients' trust, your vendor and supplier trust, is all on the line. And there are a couple of huge companies, that I don't want to mention, that have had these egregious,
[26:55] horrible things happen. And then they'll just come out, like, the next day or the next week with the advertising for that exact thing, like, hey, here's our new and improved blah, blah, blah.
[27:02] And I was like, did everybody forget? Did everybody forget that they had this huge issue or this huge breach?
[27:08] And that's another thing that I think is going to start going up, where people are going to start having, like, a fatigue for the information, and as a result,
[27:17] nobody is going to trust what's happening. Unless I can talk to somebody and go and touch it and really understand it, people are going to back away from technology, and that's unfortunate.
[27:26] But they're unwilling to do what they need to do.
[27:31] It's like asking a kid what they want for dinner. They're going to tell you chocolate cake. You can tell them they'll get the bubble guts,
[27:36] they'll have a headache, da da da. For whatever reason, unless you have a parent that's involved, or somebody who knows that can help guide them,
[27:43] they're going to choose the wrong thing. And I think we have a lot of organizations that are using that, that are preying on that, thinking that they're going to make a lot of money.
[27:51] And that's unfortunate.
[27:53] Debbie Reynolds: Absolutely, absolutely. Well, what else is happening in the world right now that's concerning you?
[28:00] Gina King: About technology? Definitely the fake AI companies that are like, hey, you can implement X, Y, and Z and notice this change in, like, 30, 60, 90 days.
[28:11] That is ridiculous. No matter what size organization, because they're definitely leaving out communication with your partners, vendors and suppliers. They're not taking into account any of your legal responsibilities. Or maybe one of your key vendors is the government, maybe one of your key vendors is a huge organization.
[28:27] They're not looking at any of that. And then all of these fly by night AI tools that are just repositories to collect your data.
[28:35] So many of them out there where they just want to get your data because they want to use it for something else, they want to sell it somewhere else. And so they just got this storefront,
[28:43] like we can help you do this magical thing that makes no sense. Those types of things make me feel uncomfortable because I think about small business owners, I think about people who are a little bit older, may not necessarily understand all the intricacies involved.
[28:56] People who have no idea about the children they're going to impact with this thing, even up to and including some of the changes that are coming down from our government with no policy, just, hey, we're going to make this cut and leaving it up to everybody to interpret what that means.
[29:10] All of these things coming together give me a headache.
[29:14] There's no, there's no control at all.
[29:17] Debbie Reynolds: That's true.
[29:18] OpenAI recently updated their AI model to a new version, and a lot of companies and people just went bananas, because they had workflows built on some of those older models.
[29:36] And so when they changed that, things didn't work. And so I'm like, this is an example of why you shouldn't do that.
[29:43] Gina King: Yes,
[29:44] no communication, nothing. I don't understand how these medical facilities, these huge medical systems, are implementing AI with no communication to the customers, not only letting them know, hey, we're going to start using artificial intelligence, but also how to interpret what those changes are and where the changes exist.
[30:00] So say, for example, you have a healthcare organization and they have a requirement that if anybody shows up with an urgent or emergent situation, we need to come up with some type of solution within 12 hours.
[30:12] But with AI, they're like, if anybody comes up with an urgent or emergent situation on these lists,
[30:18] then we need to come up with something within 12 hours that meets these particular standards. You need to communicate that additional clarification. Because now I need to know, when you say urgent or emergent, what do you mean by that?
[30:28] What's the definition?
[30:30] These lists, what does each one of those things mean?
[30:33] Nobody's doing that. They're just assuming that the garbage that was on the Internet, that we already knew was garbage, is appropriate for health care. And that's not true.
[30:45] Debbie Reynolds: Right. Well, when I think about AI, before people went crazy with generative AI,
[30:54] a lot of those systems were very narrow in scope. They were purpose-built.
[30:59] There were things that it wouldn't do. You know, you didn't have a lot of risk, because it was very limited in what it could do. And so now that you have people trying to use a tool that really doesn't have the level of accuracy or stability that the company really needs to actually run their business,
[31:17] I'm like really concerned about what people are putting into these things. And especially,
[31:23] you know, one of the things that concerns me a lot is what they call catastrophic forgetting. These models, they can't just continue to grow; they have to purge.
[31:33] You know, some of the data is old or whatever, but some of them can forget.
[31:37] And that's the last thing you want to happen with maybe the foundational information that you run your organization on.
[31:45] But then on the flip side of that,
[31:47] you know, on the privacy side,
[31:49] you know, you're not supposed to hold things forever,
[31:52] especially around people's personal information. So I think it's going to be a really tough thing for companies to try to navigate that. What do you think?
[32:02] Gina King: Absolutely. Yeah. The things that make me feel nervous, that keep me up at night, are drift and lack of context.
[32:11] Because when you receive a result from one of these systems, it doesn't say, here's the result,
[32:18] and the reason I chose it is because here are all the possible results, and this was the highest percentage. But what if that highest percentage was only 68%?
[32:26] Who's validating? But we don't even get that. We don't even get context.
[32:30] So whoever's using it just thinks that's the answer, that's the only answer. And I could tell you a hundred ways, and I don't want to go into a whole other topic, but this is why I'm anti-certification: because for every certification question that has an answer,
[32:42] I can tell you 20 scenarios where that answer is absolutely incorrect,
[32:46] but without context,
[32:48] you're just stuck with that. And so people then go into these organizations using that certification type AI knowledge, a bank of information, and here's the answer. And they apply it in businesses and break things.
[32:59] And it's like, no, look,
[33:00] here are all the possible answers. That one was only 20, 21%. And it could be the highest percentage, but that doesn't mean it's right. We now need to do additional verification.
[33:08] We now need to look at context and figure out everything else that's going on. They're not having AI do that. It's like, hey, this is the answer. And every time I see somebody do that, I'll go, okay, so ask it
[33:19] all the assumptions it made to get to the answer. It's like 50 things that come up. And I go, and this is what's wrong with AI. It's not telling you that it's making those decisions for you.
[33:27] Are you okay with that?
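To make that point concrete, here is a minimal sketch, in Python, of the kind of output Gina is describing: instead of a bare top-1 answer, the system surfaces every candidate with its probability and flags a weak "winner" for human review. The labels, scores, and the 68% threshold are all hypothetical, invented for illustration; this is not any specific product's API.

```python
# Minimal sketch: surface every candidate answer with its probability,
# instead of returning only the single most likely answer.
# All labels, scores, and the threshold are hypothetical.
import math

def softmax(scores):
    """Convert raw model scores into probabilities that sum to 1."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def answer_with_context(candidates, scores, threshold=0.68):
    """Rank all candidates by probability and flag low-confidence winners."""
    ranked = sorted(zip(candidates, softmax(scores)),
                    key=lambda pair: pair[1], reverse=True)
    top_label, top_prob = ranked[0]
    return {
        "answer": top_label,
        "confidence": top_prob,
        "needs_human_review": top_prob < threshold,  # a ~23% "winner" gets flagged
        "all_candidates": ranked,                    # the missing context
    }

# Hypothetical run: six candidate answers, and the top one wins with only ~23%.
result = answer_with_context(["A", "B", "C", "D", "E", "F"],
                             [1.2, 1.0, 0.9, 0.8, 0.7, 0.5])
print(result["answer"], round(result["confidence"], 2),
      "review needed:", result["needs_human_review"])
```

The design point is the last two fields: the consumer sees the full distribution and a review flag, rather than a single unqualified answer.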
[33:28] Debbie Reynolds: Absolutely. And I love that you say that, because
[33:32] drift is something I talk about a lot. I call it the AI risk drift. And then also, context is key. Right?
[33:41] You're right. You don't have a lot of context. You don't understand.
[33:45] So one of the examples I give is, let's say you had a traffic system where they say, well, people on the south side drive worse because they get more tickets.
[33:55] It's like, well, the south side has more cameras than the north side.
[33:59] But they didn't tell you that. Right?
[34:01] So it is making judgments out of context without you actually knowing what that is. And that concerns me greatly.
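As a quick illustration of that camera example, here is a tiny sketch with invented numbers: the raw ticket counts make one side look worse, but normalizing by how many cameras are doing the measuring shows the behavior is identical. Every number here is hypothetical.

```python
# Hypothetical numbers illustrating measurement bias: more cameras on one
# side of town produce more tickets even when drivers behave the same.
tickets = {"south": 900, "north": 300}  # raw counts the system "sees"
cameras = {"south": 30, "north": 10}    # the uneven measurement behind them

for side in tickets:
    per_camera = tickets[side] / cameras[side]
    print(f"{side}: {tickets[side]} tickets, {per_camera:.1f} per camera")

# Raw counts suggest the south side drives "worse" (900 vs 300),
# but per camera both sides are identical (30.0 each): the difference
# is in the sensors, not the drivers.
```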
[34:10] Gina King: So important.
[34:11] I remember, one of my pet peeves is platitudes, when people have those little quotes that are supposed to give you some type of, you know, knowledge. And one of the platitudes that somebody had written a post about is, don't put all your eggs in one basket.
[34:24] And it was something about what they were writing that made me feel uncomfortable about that particular platitude. And so I went to AI and I was like, I want you to draw me an image of a woman holding 12 baskets with an egg in each basket.
[34:37] Because in my mind, having an egg in each basket increases the likelihood that one is going to get broken, because you're walking around with 12 baskets. Like, let's really look at this thing.
[34:46] And it drew me this model. It was a beautiful Asian lady,
[34:49] a human,
[34:50] and she has six baskets on one arm, six baskets on the other, with an egg in each basket. It came up with that.
[34:56] But in order to do it,
[34:57] I thought it was going to make the handles different sizes to get the six baskets on each arm. No, it put hinges in her arms.
[35:05] And I paused and I said,
[35:07] what made you put hinges in her arms versus making the handles of the basket longer? And it said to me, it would be easier to just put hinges in two arms rather than extend the size of 12 baskets.
[35:20] And I said, but she's human.
[35:22] She would have to go to the hospital and have surgery to get these hinges in her arm. Do you realize what that means?
[35:28] And the response was, well, if you wanted me to prioritize the woman over the baskets, that's something you would have to start with. And I went, oh, my God,
[35:38] girl, stop,
[35:40] Stop.
[35:42] I said, she could potentially die. And you're telling me.
[35:45] And it was like, you're absolutely right. So in the future, you need to let me know that you want to prioritize the woman over the baskets, because there's no way that I could know.
[35:55] And I'm thinking, in my mind, and the thing is, I know I'm not right to expect this:
[35:59] why don't you inherently know that the human is more important than the basket? I'm just saying. I'm just saying. But I know that I have no right to expect that.
[36:09] But how many other things are that way?
[36:12] Debbie Reynolds: Absolutely. And I think about things like,
[36:14] especially healthcare. I mean, it's frightening. Frightening. I mean,
[36:20] unfortunately,
[36:22] I had a relative that was 16 years old that died from cancer. And one of the reasons why her cancer had escalated so far is because they said from their model that someone her age would not typically have that type of cancer.
[36:45] So they never tested for it until it was like too late.
[36:49] So this is what concerns me greatly.
[36:51] Gina King: So that's what you name it after your family.
[36:55] Debbie Reynolds: Right,
[36:56] right. But I'm saying these are the types of things that concern me a lot about the way people are using AI. I mean, there are a lot of gaps there.
[37:05] And so not every solution is the most probable. And we're talking about probabilistic types of processing. And so it's very concerning, very concerning.
[37:16] Gina King: And there's male and female bias, because I was on a project where a supermarket organization was developing, like, a curbside pickup type model. And it was not my project. I was just looking at the plan that was on the wall, and they had a scenario where, if we don't have the particular size yogurt,
[37:33] we'll just give the larger yogurt at no additional price. And when I saw that,
[37:39] I went to the project manager and I said, I got a question.
[37:41] I said, you see this right here?
[37:44] Yeah. She said, we'll just give the bigger yogurt at no additional charge. And I said, but I need it for my kid's lunch. And let me tell you, if you send home this big dumb container of yogurt, I'm going to throw it right out the window.
[37:55] I'm getting a little one for a particular reason because I need that container. I'm not going to take a huge container and then find a whole bunch of little containers to put little dollops of yogurt.
[38:04] That's ridiculous. Give me the option to say I want to opt out of a change.
[38:09] And to me this is a level one thing. But when it comes to data, we don't give people the ability to opt out when we made a decision for them.
[38:17] We just assume that that's the only answer. And I'm just like,
[38:20] that's gotta be illegal.
[38:22] That's gotta be illegal.
[38:23] You can't just make a decision and say, well, if they want to change it, they need to let us know. They're assuming that that's the only option, because you wrote it like that.
[38:31] Right.
[38:32] Debbie Reynolds: And you're not thinking about why the person chose that, right? Yeah. So yeah.
[38:37] That also reminds me of something I always say. You go into a supermarket and you step on the mat that opens the door. Right? So let's say the person before you stepped on the mat and it opened the door, and then you stepped on the mat and it didn't open the door for you.
[38:53] And the other person could say, well, it's nothing wrong with it because it worked for me,
[38:56] but it didn't work for you. And so this is another issue that we have, because it's supposed to work for anybody.
[39:04] So the fact that it doesn't is a problem?
[39:07] Gina King: Yes,
[39:07] yes,
[39:08] absolutely. And it's only as good as the person's ability to use it. Because I can sit down with somebody and because they don't understand what I understand, I can get completely different answers out of it.
[39:21] And that's a problem.
[39:22] Not only is it discriminatory,
[39:24] I feel like it targets people who don't understand what they're doing.
[39:29] And that breaks my heart. I feel like there needs to be a coalition to protect the vulnerable among us because they have no idea what they're doing. And when they get so far down the road, it's going to be too late.
[39:41] Debbie Reynolds: Wow.
[39:41] You know, if it were the world according to you, and we did everything you said, what would be your wish, Gina,
[39:48] for the world as it relates to either privacy or cybersecurity, whether that be technology,
[39:54] human behavior,
[39:55] or regulation?
[39:58] Gina King: I think there needs to be a committee,
[40:01] not only of data experts, but experts in every area.
[40:05] And we need to have, like, a global GRC, some people who are really looking at this thing from all angles, because that's not happening. And some of these things we're doing need to have, like, a seal of approval showing they've been reviewed.
[40:17] But we are breaking laws. We are crippling people. We are stealing from people left and right.
[40:23] And here's the thing. If you think you're having a problem, where do you go? Who do you tell?
[40:28] You're just all by yourself thinking that you're the only one. And like you said, when you tell somebody else, well, it worked for me. What's your problem? What's wrong with you?
[40:35] That's something that keeps me up at night because I have three kids on the spectrum. So when things happen that impact them, it's kind of like I'm the advocate, but I shouldn't have to be.
[40:44] They just happen to be my children.
[40:46] All children who are disabled or vulnerable people. They don't all have people like me with them. That's not fair.
[40:53] It's not fair to other people that my kids have me. But what's happening to those other people? What's happening to those other children?
[40:59] And it really makes me feel upset.
[41:02] Debbie Reynolds: I agree with you. I feel like we say human in the loop,
[41:07] and that phrase seems to ring so hollow, because it doesn't really explain how humans need to be involved. Like, to me, these systems should serve humanity.
[41:23] I'm not, like, a bot, right? So we're not friends, we're not associates, they're not your parents. I think we need to have more of a human-centered focus and figure out how technology plays a part in that, as opposed to trying to make technology kind of take over and try to be human for us and try to make decisions for us without knowing us.
[41:46] You can't make a decision for me when you don't know me.
[41:49] You don't know what my life is like. Right.
[41:51] So just like the yogurt example, which I totally get. It's so funny, because when you said that, I was like, yeah, I don't want the big yogurt.
[41:58] Gina King: Don't give me that 25-pound bag of flour. I got this little sugar dish. That's why I'm buying a little sugar. I bought this beautiful dish, and if I want to pay $8 for the tiny bag,
[42:08] I will do so. Give me that big bag where I got to carry it upstairs? I would throw that thing right out the car window, like, this is garbage. I don't want this.
[42:17] I'll open it up and take how much I need and then throw it out the car.
[42:20] Debbie Reynolds: That's right, that's right, that's right.
[42:22] Gina King: It would take, like, a year to get through that little bit of sugar. You hear me? It's just there for show. I don't want a 25-pound bag of it.
[42:32] Debbie Reynolds: Oh my goodness,
[42:33] you're remarkable. You are remarkable.
[42:36] I just love talking to you, because you have such,
[42:39] such great insights. But then, you know, you have that practicality, which I love about you. If someone wants to reach out to you and they want to work with you, how do they reach out?
[42:51] Gina King: I think LinkedIn is great. I get ambushed out there because I also try to help executives figure out what they're doing with their careers and develop frameworks and things. I think LinkedIn is a great spot to find me.
[43:03] And on my LinkedIn profile, like, my number is there because sometimes I get people who call me. I just helped a guy, I have to have you interview him.
[43:11] I just helped him with litigation against a company that discriminated against him. Like out of the blue. He was just like, I saw this post that you did and this is what's happening in my company.
[43:20] And the EEOC, like, said that he's right and that they violated his rights. And now I guess we gotta find a lawyer.
[43:29] But people reach out to me for all of those vague, ambiguous type things. And usually those are also the projects I get, because, you know, like you're saying with AI, these bots and these systems, they don't know you.
[43:41] And so it's very difficult when you have a lot of disparate pieces of information to find where to go. And I tend to draw those types of projects and people.
[43:52] So LinkedIn, things are good.
[43:54] Debbie Reynolds: Fantastic. Well, I love talking with you, and I look forward to us being able to collaborate. Yeah.
[44:01] Gina King: Anything. If something happens and you need...
[44:03] Debbie Reynolds: Somebody to show up with a van.
[44:04] Gina King: And some garbage bags for some kind of reason, maybe some sunglasses and a wig, like, call me. I'm your girl. I'll show up.
[44:09] Debbie Reynolds: You may get that call.
[44:10] Gina King: You may get that call if we get there. Right.
[44:17] Debbie Reynolds: All right. I'll talk to you soon. Thank you so much.
[44:19] Gina King: Okay. Same.