E284 - Michelle Finneran Dennedy, Chief Data Strategy Officer, Abaxx Technologies
[00:00] Debbie Reynolds: The personal views expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.
[00:14] Hello, my name is Debbie Reynolds. They call me the Data Diva. This is the Data Diva Talks Privacy podcast, where we discuss data privacy issues with industry leaders around the world with information that businesses need to know.
[00:27] Now I have a very special guest.
[00:30] This is not understated in any way, shape, or form.
[00:33] Michelle Dennedy. She is the Chief Data Strategy Officer of Abaxx Technologies.
[00:41] This title does not encapsulate you at all in any way,
[00:47] shape, or form.
[00:48] So,
[00:49] first of all, I have been in awe of you forever.
[00:52] And I remember the first time you ever, like, liked a post that I did or something.
[00:57] Like, you know my name. I was like, oh, my God.
[01:00] I felt famous. Like, oh, my God. I had no idea that you knew.
[01:03] Michelle Finneran Dennedy: Mutual.
[01:04] Debbie Reynolds: I had no idea you knew who I was. But when people call you a leader, that's an understatement, right? So you, for me,
[01:12] you are a pioneer in privacy.
[01:15] So before privacy was a thing,
[01:18] before privacy was a real, like,
[01:22] industry,
[01:23] and you cover all the bases. So you are a lawyer,
[01:27] you are a technologist, you are a strategist. So you worked at companies you've never heard of, like Sun Microsystems and Intel,
[01:37] Cisco. Right. But your impact is extraordinary. So I said all that to say thank you so much for being on the show. I mean,
[01:48] who are you? Who are you?
[01:49] Michelle Finneran Dennedy: How did you.
[01:50] Debbie Reynolds: How did you do all these things?
[01:53] Michelle Finneran Dennedy: Mostly by accident, if I'm honest.
[01:55] And thank you so much. I mean, Debbie, you are everything,
[01:59] everything, everything.
[02:00] And talk about. I'm writing a book right now called the Lighthouse Leadership, and you really are one of those lighthouses. I remember our first interactions where you used to do five minutes, no more, no less, and just bring us up to date.
[02:12] What do you pragmatically need to know about what's going on legislatively or with best practices or standards? And I think that kind of practice is how you and I first got together, because it is, to me.
[02:24] My meandering way of answering a question is, once you sort of break things down into these little pragmatic nuggets and you start to communicate those things,
[02:34] the universe reflects it back to you. So engineers say, tell me more.
[02:38] I don't really know what you're talking about. Instead of running away from you,
[02:41] other lawyers say, I don't have to wait for the next legislative cycle to get ready for the next wave. And now it's a tsunami deluge of new requirements. You can't build in the requirements from the last legislative session because they're already being usurped by some other things, somewhere else.
[03:02] So that's really how I started out as a patent litigator. I was a paralegal and then patent litigator.
[03:07] And then I fell into privacy, mostly because my boss at the time, McNealy, said, You have zero privacy. Get over it.
[03:14] And so all of the grownups ran away because the boss said, it's not important. You're probably not going to get budget.
[03:21] If you lean into that,
[03:23] it's probably not going to be something that gets you promoted.
[03:26] All of those things turned out to be very wrong.
[03:28] And it's not because I had some great foresight. It really is because I love a lost cause, and privacy seemed like a lost cause at the time. And also I love to break things down and sort of say, what's really going on here?
[03:44] As an undergraduate, I studied psychology, actually,
[03:47] and I worked with a population of handicapped kids using robotic arms.
[03:52] So it was really using technology for a very human purpose.
[03:56] And half the people in the room were from the engineering school and the mechanical engineering facilities. And all the software in the 80s is homegrown, right? So everything is Fortran.
[04:06] So it broke down a lot. And my job was to work with the kids.
[04:12] And so I actually sat right at the centerpiece where we sit now, where humanity meets technology, and everyone else could worry about whether the machines were working, right? And that's where most of the money was going.
[04:24] The action happened at the interface, where a child who was bound to a wheelchair was able to play, or a caregiver got to see their kid making a mess or breaking something, and that had never happened.
[04:39] Your life is so heavy when you have a severely handicapped child.
[04:43] So I've always brought that forward to all of what I've done since then. When I could no longer push the agenda at Sun in legal, I created a role called Chief Privacy Officer and sold it into the business.
[04:55] My first boss was under the CFO.
[04:58] And I've moved around. I've worked for a group called People,
[05:01] Places and Policy.
[05:03] So we had standards, we had public policy,
[05:06] as well as traditional HR and facilities management and physical security.
[05:11] I've worked in sales. I was a sales business development leader when Oracle bought Sun Microsystems. And that was really telling the story of things like encryption,
[05:21] identity management, system segregation,
[05:25] systems administrator separation of duties.
[05:27] How do we actually programmatize those and put them into sales plays to sell?
[05:32] All these beautiful technologies that exist that people were not deploying because security simply wasn't sexy enough before now, when it's everywhere. And then finally, like, my most recent chapter has been as an entrepreneur. Crazy McNealy again called me up when I was still at Cisco and said, there's a,
[05:50] a data BI company, kind of like a Tableau lite, and they need a CEO. And I said, you're crazy, because I'm a geek.
[05:58] And he said, no, no, he said, no one ever knew what a CPO was until you invented that and filled it in with your friends.
[06:05] And so everyone knows what a CEO is, and I'll teach you how to do that. So, a long-winded story for a long-winded journey. I did end up selling my company, Privacy Code, to Abaxx about a year and a half ago.
[06:18] So that's where I am now. And it's a commodities clearinghouse and exchange as well as an identity platform. So we're really putting it together. It's really back to my roots of how does the physical energy transformation world fit with the AI data transformation world?
[06:34] And it's not a very long leap: if you want an electron to move and pulse across to make that data, or whatever's been digitized by Google, move somewhere, it's got to be powered from somewhere.
[06:45] So I'm sort of sitting in that place now. And then my sort of final role is, I now advise a bunch of startups, and the ones that really work, you'll hear a lot from me about. Like right now, because cookies are everywhere and they are not automated.
[06:58] There's not a good tool without hiring a bunch of services.
[07:02] That's why I've partnered up with Ian Cohen at LOKKER, because the stuff works and it solves a discrete problem.
[07:08] So I have a number of those. And there's a couple more AI companies I'm working with that will be out of stealth mode soon. So I guess that's what I do more than who I am.
[07:16] That's a very long one. That's way beyond your five-minute limit.
[07:20] Debbie Reynolds: No, that's absolutely perfect. Well, you're such a singular individual in terms of your background, so you deserve a lot more than five minutes to tell that story.
[07:31] Michelle Finneran Dennedy: I should say Privatus too, because you've had my partner too. My business partner was at the BI company with me, Brian Lee, Lieutenant Colonel (retired) Brian Lee.
[07:39] So we also consult, so we do business consulting, value consulting and data consulting as well. So we're sort of a package deal. And I would be in very deep doo-doo if I didn't mention Privatus and Brian.
[07:51] Debbie Reynolds: Brian is phenomenal.
[07:53] Michelle Finneran Dennedy: He really is.
[07:54] Debbie Reynolds: Oh my gosh. Well, I think you are. First of all, we have been trying to get together for years and we finally did it. I don't know how we did it, but I think you are the most name dropped person on the podcast.
[08:07] Like so. There are so many people and leaders,
[08:10] not just in privacy and just other realms of technology, who really look up to you.
[08:16] They adore you as I do. They think of you as a pioneer.
[08:20] Michelle Finneran Dennedy: So I will call you more often, Debbie. Like, I'm sitting here with a dog that has zero respect for me right now.
[08:27] Debbie Reynolds: Call me. You can call me anytime.
[08:29] Michelle Finneran Dennedy: I will. Somebody pick me up, I'm a mess.
[08:33] Debbie Reynolds: Oh my gosh. Well, what's happening in the world right now in terms of privacy or technology that's concerning you most?
[08:42] Michelle Finneran Dennedy: So I actually just came back, a couple weeks ago now. I keynoted at a Human in the Middle, human-to-human network, and it really is a coalition. Tricia Wong, who you should talk to at some point.
[08:55] She's just a brilliant technologist bringing together the worlds of identity and security and AI and tokenization and remember us, humans in the middle.
[09:07] So beyond just writing down thou shalt not be biased and then watching everyone fly by that standard, pulling together and saying privacy really is the heartbeat of AI. We are the humanity in AI.
[09:21] We are the why,
[09:23] we are the value creation in AI. But we're not really focused enough, I think, on human centricity as we're figuring out not just how to make sure the LLMs under the desk are not leaking your intellectual property.
[09:38] Well,
[09:39] intellectual property, I started out there, I was a patent litigator doing a lot of medical devices and things like that.
[09:45] It only matters when it's in motion. It's so much like electricity or data even, or just currency.
[09:52] It only really matters when you've got momentum behind it. Same thing with ethics.
[09:57] Same thing with how we're setting these requirements for privacy and ethics engineering in a world of quantum, where a qubit allows for a very context-driven, rich query that may or may not have a clear answer.
[10:15] So all of these really hard questions I always like to anchor in navigational terms. You're not going to own the ocean,
[10:21] you're not going to own all the data centers, you're not even going to be able to quote unquote, control all of your individual digits that have been observed or things that people are saying about you or thinking about you and thinking about you in private on their ChatGPT self therapy sessions.
[10:39] So the biggest thing I'm really thinking about now is how do we have enough controls that we at least can navigate and sense both where things are going right and things are going wrong?
[10:50] And as I think we were texting back and forth about, I think in a world where we now have proven that we can have near infinite compute and different incantations and ideas and pretend conversations,
[11:04] we need to have nanoscale controls in terms of provenance,
[11:09] in terms of identity,
[11:12] in terms of self sovereignty.
[11:15] And then on top of all of that, if that's not hard enough,
[11:18] people,
[11:19] early on, when I started using the term ethics Engineering about 15 years ago, people were like poo pooing it and saying, oh, ethics are just too mushy and they're too personal.
[11:28] And my answer is no.
[11:30] And there's real... Moralists and philosophers can differ with me, but my sort of view on this is that morality, what I think is right or where I think a moral line is,
[11:39] and what you may think, sure, that can be an individual thing. And so we've got 8 billion permutations of what morality can be. First of all, quantum compute can handle a billion different versions of things.
[11:49] So that's kind of a hopeful aha for later idea.
[11:53] But the reality is, ethics by definition are community standards. So as we decide what is acceptable in the community of AI,
[12:02] is it okay to have a simulated woman in a business suit giving a keynote when she's not getting paid? Probably.
[12:09] Yes or no, Is that capitalism? Is it okay for that same woman to be nude?
[12:15] That sounds like a different line that our community standards can decide not to cross.
[12:21] So these kinds of trolley car questions, these kinds of the blending of home life and personal life and professional life, how we're judging a student that's propelling themselves through university right now using AI,
[12:34] these are the questions I am really thinking about functionalizing as much as humanly possible right now so that we can remember again that privacy and humanity is the heartbeat. It is the why behind all of this work that we do.
[12:47] Debbie Reynolds: Wow, that's incredible.
[12:49] I want your thoughts about governance.
[12:52] So I feel like we need to reimagine governance now because of AI and the rapid pace of emerging technology.
[13:04] And I feel like,
[13:05] and you correct me if I'm wrong, a lot of times when we thought about governance, at least we did in the olden days (you know, I'm probably as old as you, maybe older), in the olden days we assumed that everything worked out perfectly.
[13:20] You would know what a company or organization was going to do. You would set parameters in place and process and procedure, and then they would be off to the races.
[13:29] Right. They kind of go and do whatever, and governance now, because of the technology,
[13:35] can't do that.
[13:37] There are many starting lines now, and they aren't linear; they don't go in a straight line.
[13:43] So you have technologies that are doing loops and different things, and it's like, how do you govern that? So I love your idea about the nanoscale controls, because it needs to be a lot more granular, and there need to be a lot more spaces or places within a life cycle for that, like, gut check to happen.
[14:04] But what are your thoughts?
[14:05] Michelle Finneran Dennedy: Yeah, no, we're so aligned. I think it used to be we'd have an annual audit. When I first came to Intel,
[14:13] it was very funny because there was a gentleman there who was constantly, I don't know, he appointed himself my nemesis. It was very strange.
[14:20] And from like day one he said, you have to raise your hands in meetings. And I was like,
[14:24] oh, you cute young thing. That's so silly. I'm never doing that, by the way. And I was brought in as a vice president, so I was not a junior person being out of line.
[14:31] I was actually running my org as I was supposed to run it.
[14:34] So I always kind of thought about him because he, like, called the dogs on me. And he called internal audit. And so internal audit shows up in my office and they said, well,
[14:45] we're looking for gaps in your program.
[14:49] And I said, welcome, come on in. I have cookies because of course I had cookies, real ones. If you're looking for gaps, you've just met the Grand Canyon.
[14:58] I don't have a program yet. I don't have anything functionalized yet. We had BCRs that were approved and yet we didn't have privacy engineering. We maybe had one or two spreadsheets and we were sort of auditing by sampling.
[15:12] And maybe once a year then they'd call in the auditors.
[15:17] And a lot of those people from that initial meeting have actually gone on to become either privacy officers themselves or other sort of engineers and ethical engineers. Because what we found is by looking at the problem through a different lens of that once a year, naughty auditor,
[15:34] what they're doing is they're actually looking for drivers. Like what are the assets that a company has already invested in, in data, in people and practices that are starting to cause them liability.
[15:45] They're looking for gaps.
[15:46] Well, when you don't have the people, you don't have the process and you don't have any Technology in place, you're the Grand Canyon.
[15:52] So by leveraging people who are looking for risk vectors and measurements in other things like fraud or intellectual property leakage or, you know, sort of gaps in culture that are starting to cause toxic behaviors, all those other things.
[16:09] Now I can mine their brains and say, how do I have continuous improvement while I am building from core principles?
[16:18] And so that's where I feel like exactly as you're saying is it can't be a policy and then the dust blows in the wind on that policy.
[16:28] It can't be some magical technology. Even my favorite technologies are not set and forget.
[16:34] They are monitor,
[16:36] review,
[16:37] test,
[16:38] and then change. Peter Drucker said, you manage what you measure, while I say you protect what you treasure.
[16:44] And so if you kind of put those things together,
[16:47] you start to think about,
[16:48] I have limited resources to monitor.
[16:52] I have very limited attention span to train and educate my workforce to keep up with just AI and other information.
[17:02] They're certainly not going to stop and take some mandatory class that doesn't bring them value.
[17:08] So our people, our process, our technologies, they have to be much more granular. They have to be able to sense and they have to presuppose failure.
[17:18] We're going to make a lot of mistakes.
[17:21] And if our answer is we just get rid of a leader as soon as they make a mistake, not a moral crumple zone like we're seeing globally right now, but a true mistake.
[17:31] They put stuff on an AI that they shouldn't have, for example, or they haven't, they skipped over their training session. Like, that kind of stuff is, like, we should own and celebrate that stuff and say, look, we now have data that says you didn't train your people.
[17:46] We've had 34% more complaints.
[17:49] That's not a punishment, that's an improvement vector. And start to look at things as a system,
[17:55] as flow,
[17:56] rather than the whack-a-mole we used to be, the "no" people in legal. Like, when you say that you're a lawyer,
[18:04] people will say one of two things. One, oh, I almost went to law school.
[18:09] Well, I almost was Cindy Crawford, except that I'm only like 5 foot 5.
[18:15] So we're off by a good 5 inches and a whole lot of genes.
[18:19] So people say that to you or they'll say, I hate lawyers right to your face. It's like we're like one of those last few things that you're allowed to make fun of.
[18:27] What they should say is, wow, you're someone who understands how to interpret patterns and use cases.
[18:34] That sounds an awful lot like a software engineer. We're sort of the prompt engineer OGs over here in legal.
[18:41] Like, what would happen if this scenario happened?
[18:44] Here's all the stuff I know about. My knowledge of how these laws interact and work and who gets to appeal to whom and who gets to make the call.
[18:52] What a magical power we have.
[18:54] Magical. We are the prompt OGs in the legal team.
[18:57] Debbie Reynolds: I love that. The prompt OGs. That's amazing. I think that's true. I think that's true.
[19:02] Michelle Finneran Dennedy: They don't realize how good we are at use cases, but that's what we do.
[19:06] Debbie Reynolds: Yeah.
[19:07] I want your thoughts about controls and this.
[19:11] I've talked to so many people, you know, we know ISO, NIST, 1,001 different frameworks and stuff like that. And when I'm working with people, like, my eyes glaze over when they tell me about all these frameworks.
[19:23] Right. So when you get in and you talk with them and they trot out all the paper and all the policies and all the procedures,
[19:31] and then I ask,
[19:34] so what are you actually doing?
[19:38] That's what I want to know. And that's what we need to document. So it's for me, I always say that your policies or procedures should not be aspirational, they should be operational.
[19:49] Michelle Finneran Dennedy: Absolutely. I mean, this: you've just described what Privacy Code was when we were purchased by Abaxx. You basically, and this was before ChatGPT unveiled itself, you upload your policy or your guideline or even your contract or your 10-K, and those documents tell on you.
[20:08] We respect your privacy. Well, there should be (we used Jira because it's kind of the developer's drug of choice),
[20:15] there should be a ticket that falls out of that that says, what does it mean to respect privacy?
[20:20] So that's something for the marketing people.
[20:22] What does that mean? What story are you telling about that? And then it says, somewhere in the 10-K,
[20:28] we have predicted a 35% risk of loss,
[20:32] because now data loss, and this is actually a fact,
[20:35] data loss in an actuarial sense for insurance companies has surmounted ransomware loss.
[20:42] You wouldn't know that if you're just listening on Twitter and reading your LinkedIn.
[20:47] But the math tells us how important that privacy is.
[20:52] So that shrinkage number, of how much loss we are predicting this year, is, to your point: here's our policy on how we govern our stuff, and here's the number that we have to beat to improve upon.
[21:03] We want to start predicting a lower percentage of loss.
[21:07] How do we do that? We've got eight layers of compute. Do we have something? Are we minimizing?
[21:14] Are we looking at anonymization?
[21:16] Are we looking at deletion? The beautiful deletion of, like, only keep the data that sparks joy, my friends. Delete, delete, delete. You're not going to find a pony in poo.
[21:26] You're going to find cholera.
[21:28] So what do we want to do? We don't want to look through poo anymore. That's all the data centers and the data lakes and the data swamps. What we want to do is have another ticket that goes somewhere, and goes into marketing or sales or biz dev and says,
[21:41] where are the apples?
[21:43] Because you're not going to just get a magical LLM. Because we want to feed our ponies and we want to attract our ponies. Where's the fresh data coming from?
[21:52] So each one of these things that you're proactively doing to your data can be orchestrated from the front end,
[21:59] your UX and your choices from the back end, your deletion and then that full stack of awareness that is modern day cybersecurity.
[22:08] Each one of those things notices
[22:11] data flows. Well, I gotta call my Debbie and get myself a RoPA. It'd be so much better if I could plug you in to an actual data source and not a human face,
[22:22] because my face will try to tell you the truth, but my face will lie. The only person telling you the truth about your data is the API, because it doesn't know how to lie, it just sends things.
[22:32] So if you have that kind of stuff into a stack,
[22:34] then over time it's hard to populate these stacks. So I have to applaud my early customers at Privacy Code. Like, you should talk to Rosemary Cooperberg at Demandbase, for example.
[22:44] She's one of my favorites.
[22:46] She was one of the first ones that said to me, this is gonna take a lot of setup,
[22:51] but it's going to be evergreen in value. And that's what I'm talking about. Exactly. What you're saying, Debbie, is when we look at controls and governance and we look at what kind of company are we, do we sell stuff?
[23:02] Do we sell services,
[23:03] do we sell ideas, do we sell data?
[23:06] Now we look through and we actually measure everyone in the company in those little buckets. Business divisions, sales divisions, marketing people, operations people, legal people, they're all spending their time touching the bottom of that data boat.
[23:20] So what is the measurable thing that we want to put into, whether it's a Jira or a Monday or HubSpot, whatever your flow machine is? And now we can, as governance people.
[23:33] They're looking at the outcome: how many tickets have closed, how many deals have I closed, how many contracts have been completed? We're not looking for that. What we're looking for is how much of that flow is attached to governance.
[23:45] How many contracts contain language that recognizes the AI governance requirements coming out of Canada and the EU and the other places, until the US finally gets its act together?
[23:57] Those are signals for you.
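A minimal sketch of the governance-coverage measure described here: instead of counting closed tickets or signed contracts, count how much of the workflow is attached to a governance requirement. This is illustrative only, not Privacy Code's actual implementation; the record fields, tags, and example items are hypothetical.

```python
# Hypothetical work items pulled from a flow tool (Jira, Monday, HubSpot, etc.).
# The question is not "how many closed?" but "how many carry governance signals?"
from dataclasses import dataclass, field

@dataclass
class WorkItem:
    key: str                  # e.g. a contract record or ticket id
    kind: str                 # "contract", "ticket", "campaign", ...
    governance_tags: set = field(default_factory=set)  # e.g. {"EU-AI-Act", "RoPA", "retention"}

def governance_coverage(items):
    """Fraction of workflow items attached to at least one governance requirement."""
    if not items:
        return 0.0
    return sum(1 for i in items if i.governance_tags) / len(items)

items = [
    WorkItem("CONTRACT-101", "contract", {"EU-AI-Act", "data-processing-addendum"}),
    WorkItem("CONTRACT-102", "contract"),             # no governance language yet: a signal
    WorkItem("PRIV-7", "ticket", {"RoPA", "retention"}),
    WorkItem("MKT-33", "campaign"),
]

print(f"governance coverage: {governance_coverage(items):.0%}")   # prints 50%
```

Tracked over time, that ratio works as the improvement vector rather than a punishment: it shows which divisions' flow is touching governance and which are not.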
[23:59] Now I'm saying, aha, we've got all these requirements here. How do I leverage them to go talk to my developers or my purchasing people for my third parties and say, who can meet this standard?
[24:12] And that cuts away the other people who are not good enough for you. It's like finding the right man.
[24:19] Like, you're dating all these dum-dums who are going to treat you like garbage.
[24:23] Well, if your requirements are, I need somebody who shows up,
[24:26] I need someone who's here in the hard times. I need someone who is self aware enough to know where their defects are.
[24:32] That's a sexy vendor.
[24:34] You want to go out with that one and you want to make sure you have a good contract in case you got to get rid of that guy later.
[24:40] So that's how I look at this kind of live system of systems, where every activity is an opportunity to think about how it impacts data as currency, just like we do for every dollar that we spend.
[24:51] A dollar on its own is nothing. Like, a dollar in my hand is getting sucked towards Target to buy garbage I don't need. But a dollar in the hands of, like, a John Chambers or some Jamie Dimon, that dollar's going to some big deal.
[25:08] So I'm going to follow the flow.
[25:11] And not necessarily just a little green piece of paper that says 1.
[25:15] Debbie Reynolds: Follow the flow, follow the flow, follow the flow. That's incredible.
[25:20] What's your thoughts about this thing that we have? Right. And so you're the perfect person to ask this because you play in all these different spaces where some people, they're like,
[25:32] you know, all we need is a lawyer who reacts to regulation.
[25:37] Michelle Finneran Dennedy: We need to be that good. We're not that good.
[25:40] Debbie Reynolds: Or technology solves all our privacy problems. Or,
[25:44] you know, one of the ones I hate the most is people think that, oh, if we solve for cybersecurity, we solve for privacy.
[25:50] Michelle Finneran Dennedy: But all they want to do is talk about cops and robbers over there. Love them, but they're limited in their imagination.
[25:57] Debbie Reynolds: What are your thoughts here?
[25:59] In my view, always.
[26:00] You needed all of these people, right, to be able to do this work. But I think that because of how rapidly we're moving in emerging tech, especially AI,
[26:12] it is not an option,
[26:17] so it is not optional now. It's like, you need all of these people to be thinking in all these ways and communicating with one another.
[26:24] But what are your thoughts?
[26:26] Michelle Finneran Dennedy: You're so spot on. And this is where Brian and I really had our aha moment. The kind of practice, I mean, we do data valuation stuff now. We do, we'll help people prepare for their boards, we've set up data councils and that kind of stuff.
[26:40] That's our kind of day to day if you want to hire us to do consult.
[26:44] The reality is what brought us together was a concept that we call wicked privacy.
[26:49] And exactly as you're suggesting, a wicked problem. I had never heard of this. I had been practicing law for 25 years or 30 years at that point, and privacy for 25, I'd never thought of this concept.
[27:03] And Brian comes along and he says, you know what? Privacy is a wicked problem. So you can look it up and look up, search for how to make toast. There's a TED talk about how to make toast.
[27:12] It turns out that there are problems that are, A, never solved;
[27:16] B, can only be solved by starting to solve them;
[27:19] C, create new problems as soon as you start to solve one problem.
[27:23] And they're resource-limited.
[27:25] And to your big point, which is the most important one, they have a number of different stakeholders with different needs and desires. So think about a wicked problem as like think of a lovely field, a nice green field with a river running through it.
[27:40] And there's nothing in the field. Well, along comes a farmer and he's got a couple cows. Well, cows have to poo and the cows got to eat. And you got to make sure the cows have something to drink.
[27:51] So off you get into the stream. Well, soon enough you're selling your milk, you're maybe making some of them into burgers.
[27:58] And now you got a town.
[28:00] Now a wicked problem says you can't just build willy nilly because you don't want people living beneath where the cows are pooing.
[28:08] You want to go upstream for that.
[28:10] So you're looking at the same limited resources.
[28:13] And now you've got population.
[28:15] So now you got to figure out where there's enough room for everybody. And that creates a new problem of who owns stuff. Well, that creates a new problem of how much and how fairly it's distributed.
[28:26] And that creates a new problem. So now we got all these little people, and we're going to educate them, so you see how we're getting on.
[28:32] And so there's all these people and the smallest child in that town and the littlest calf in that town all have a stake.
[28:52] So think about it. And it's the same thing if you look at it as just your corporeal self. As we age, our needs differ. We gotta eat every day. You're not gonna solve the problem of eating by having the perfect sandwich.
[28:52] So when you think about systems and you think about data,
[28:55] you think about, you need fresh data for certain things.
[28:59] You need long standing data to sort of prove veracity, prove identity,
[29:05] prove credit worthiness, for example,
[29:08] and/or prove who these different captains are. As you've said, you've got a marketing creative person, you've got a legal person looking at how to interpret this stuff, you've got a technical person who can build things, you've got another person over here who can bring in the crowds to buy your stuff.
[29:23] Each one of these people are rewarded by something slightly different.
[29:28] They're driven by something slightly different. And yet we're looking at the same limited resource to achieve something together.
[29:37] This is the essence of wicked problem solving. You need these constituents, at least within an organization to come together and say, here's what I have to offer here of value.
[29:48] And you have to build sort of matrices of trust between these various factions so you don't get lawyers saying, I'm so smart, I was at the top of my class and that's why I went to law school.
[29:58] And then you get the sales guys going, slicking back their hair with, like, whatever Brylcreem they got going. Yeah, I got Cs and Ds, but I could always get the prettiest girl.
[30:06] These skills are important and if you compete them with each other, you break a system. If you instead look at a wicked problem as understanding how to feed all of these pieces in a constituency and most importantly, human at the heartbeat.
[30:23] How do all these humans get served in the best, most fair,
[30:28] appropriate way that everyone in the system benefits as much as possible.
[30:32] And so I think that's why that metaphor is so powerful to me, and the means and ways that we've been solving things like education and poverty and health, and in Brian's case, anti-nuclear proliferation, which he did for 25 years.
[30:46] These are very complex things. And if you walk in there and you think you're one superhero,
[30:51] you're going to fail.
[30:52] If you walk in there arrogantly, saying your view is the only view, you're going to fail. If instead you walk in there with a bit of humility and a lot of humor, and you look at it as a team project to solve, even with the biggest supercomputers in the world that can make us talk smart,
[31:10] having actual humans that have experienced things or haven't experienced things,
[31:15] anything. So they're bringing you fresh perspective.
[31:17] That's how we solve these problems and then create more interesting problems that we'll solve later.
[31:23] Debbie Reynolds: I agree with that completely. And I love the way that you put that together. So very, very vivid storyteller. I love that.
[31:32] Michelle Finneran Dennedy: Maybe we should have left the grass alone. Send those cows home.
[31:37] Debbie Reynolds: What's your thoughts about the deletion of data?
[31:42] So I always say, and this is true, I'm a technologist. So like data systems are made, created to retain data, they're made to remember data.
[31:53] And what we're saying with privacy laws and regulations is that, okay, now you have to forget data.
[31:58] And so it's like swimming upstream.
[32:01] What are your thoughts about these two different things that happen? So one, it happens naturally. And I tell people when they say, oh my God, this company, they kept my data and they did that, it's like, well,
[32:12] without any intervention or interaction, that's what it will do because that's what the technology is made to do.
[32:20] So we have to work with companies to figure out how they can do this thing that the technology was never intended to ever do. What are your thoughts?
[32:30] Michelle Finneran Dennedy: I have been a bit of a heretic in this for years, so I'm glad that you share my religion, Debbie. Because when we first started talking about the right to be forgotten, I thought,
[32:40] oh boy, right?
[32:42] We've never had that successfully happen in the span of human history.
[32:47] And I have lived in places like New York City, where you think you're anonymous, but you're always running into someone you know, because it turns out you kind of go to the same kinds of places if you're that same kind of person.
[32:57] And I've lived in a 1,900-person town where the cops picked us up at the river and my boyfriend went in the back of a squad car for a special escort home.
[33:07] And I was told that he was going to call my mom, Vicki. He knew her. If I wasn't home within 10 minutes... It was a different sort of thing: everybody knew everybody. There was no privacy back then.
[33:17] You had to move out of town to get your privacy.
[33:20] So it's never happened in human history. So if we think that we're going to somehow magic systems that were never designed to delete things into suddenly doing something that humanity has never asked of itself before...
[33:32] Of course we're failing. Of course we're failing.
[33:36] So I got very mad about the whole DSAR-palooza for a while there, because it really just sucked all the energy out of real privacy engineering for a good 10 years.
[33:46] I understand the impetus for it. I understand things like the delete act will force change.
[33:54] I think the unfortunate thing is, as you say, we need incentives because no one was ever promoted for not inventing a new system to retain more data.
[34:05] Never.
[34:06] People are barely,
[34:08] barely promoted for making systems actually work together and interoperate together.
[34:14] So how in the world are we going to take these legacy systems that are only getting more and more dense to store stuff,
[34:23] to suddenly give up stuff? So that's thing number one is the professional, the human, the economic incentives are so stacked against true deletion.
[34:34] Now the good and bad news is the leverage news. I won't give it a good or a bad moral character. The leverage news is it's become excruciatingly expensive,
[34:46] both from a liability perspective, like a maybe gamble, because a lot of these guys, well, some of these guys truly have no emotional EQ. So if you think you're going to scare a CEO,
[34:57] you're probably not. Odds are that that person doesn't have the scare gene and that person also thinks that they're God.
[35:04] So if you think you're going to shame a CEO unless he loves his mama,
[35:08] you need to take a different tactic. You need to get his money. How are you going to get your CEO's money?
[35:12] Well, you can say that all the firms around you are getting sued and I'm like looking at a demand letter right now.
[35:18] They won't want to lose their money.
[35:20] What you can also do is say this is costing you so much money over here that you're not spending over there.
[35:27] And so that's where I think deletion comes in with our old fashioned riding on the pony of quality.
[35:34] And that's where I get into talking about cholera versus sua sponte ponies.
[35:40] We have all these piles and piles of data because we have told our marketing people that they should be paid for more clicks, more attention,
[35:49] more activity.
[35:50] That system has been gamed for at least the last 10 years where you can automate clicking, you can automate stuff and there's no wallet at the end of any of it.
[36:00] But they're still being incented to get more and more and more engagements. That's where you get all this tracker stuff, this goo. If you look at Yandex in Russia, it's gone from 2 billion to 14 billion in the last five-ish years.
[36:14] Do you really think everyone's, like, going to Yandex.ru? No. What they're doing is they're putting trackers on every website that you can think of. When you're typing in, what is that weird mole with the hair sticking out of it on my butt? Is that okay?
[36:27] That's going to Yandex.
[36:29] If you're like going to your school website and saying, what time is the gym assembly? Do I need to bring brownies?
[36:34] That's getting shipped off to Russia.
[36:37] So if you're going to start minimizing,
[36:39] you can't go straight at the marketing guys and say we're going to like delete all your contacts from your last conference. But what you can legitimately say is not only are you getting a 7 to 10 second delay in your data renderings, it's slowing down your site because all of these trackers have to compute.
[36:58] The computers take energy, the computers take time. These scripts actually sometimes conflict with each other, so you're not getting the search result that you want.
[37:07] When you look at it that way, you can start to minimize from the crust,
[37:11] get rid of stuff that you can clearly explain
[37:14] is giving somebody else billions of dollars.
[37:17] So someone's put a dollar value on that data, and it's not 50 cents per user. Like we used to say, we'll pay our Facebook people, but we'll give them 50 cents a year.
[37:25] No, no.
[37:26] Cumulatively,
[37:27] you,
[37:28] my friends who have websites, are paying a Russian website billions of dollars every year.
[37:35] That seems like a good place to minimize.
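A rough first pass at the kind of tracker audit implied here: pull a page, list the third-party hosts its script tags load from, and put a count in front of the marketing team. The URL below is a placeholder, and a real audit needs a headless browser, since many trackers are injected dynamically after page load.

```python
# First-pass audit of third-party scripts on a page. Static HTML only; dynamically
# injected trackers (the majority, in practice) require headless-browser instrumentation.
from collections import Counter
from html.parser import HTMLParser
from urllib.parse import urlparse
from urllib.request import urlopen

class ScriptCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.srcs = []
    def handle_starttag(self, tag, attrs):
        if tag == "script":
            src = dict(attrs).get("src")
            if src:
                self.srcs.append(src)

def third_party_script_hosts(page_url: str) -> Counter:
    first_party = urlparse(page_url).hostname
    html = urlopen(page_url, timeout=10).read().decode("utf-8", errors="replace")
    collector = ScriptCollector()
    collector.feed(html)
    hosts = (urlparse(src).hostname for src in collector.srcs)
    return Counter(h for h in hosts if h and h != first_party)

# placeholder URL; point it at your own site
for host, count in third_party_script_hosts("https://www.example.com/").most_common():
    print(f"{count:3d}  {host}")
```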
[37:37] Now we're starting to look at cloud. Cloud used to be cheap. One of my roles at Sun was chief governance officer for cloud. I was in charge of figuring out whether we should support ****.
[37:48] We did,
[37:49] because that's how new technologies get paid for. But we didn't have child **** and we didn't have illegal ****. So it was this quagmire of ethics.
[37:59] But when we looked at the governance of the actual cloud back then, and we're talking 2008, this is early,
[38:06] we started to say there was a competitor that you may have heard of called Amazon,
[38:12] and it was offering compute for $0.05 per CPU unit per hour. The cheapest we could do it securely was 10 cents, because we weren't reusing some naughty things that they were back then.
[38:24] All this stuff is old news, so nothing new to see here. But back in the day we said secure compute was about 10 cents per CPU. Then we started this tokenization thing on top of that. Long-winded story to say, as we've evolved here,
[38:38] the lore that storage is cheap because you can buy highly dense compute systems, the lore that there's not a spinning platter somewhere on a mainframe server,
[38:49] the lore that you don't have to hire human beings, or somehow figure out your agent upon agent, to manage all of this stored data,
[38:58] that all of this is cheap,
[39:00] that lore is a lie. That's like saying an all-donut-all-the-time diet will suddenly get me healthy.
[39:08] So when I actually break down where the real risk is lying,
[39:12] you do have some lower hanging fruit than others in monitoring the crust.
[39:17] Then when you start to go into these programs,
[39:20] if you understand identity about humans and how they interact,
[39:23] you'll find, first of all, you cycle through your CMOs, like, every couple of years; same thing with CISOs. It's a Game of Thrones job. If you've got a C in your title and you're not playing Game of Thrones, you're kind of wimping out.
[39:37] That's a different discussion for a different day. But typically those people do change seats a lot, because they're there to be change agents. They're there to drive and do hard things that don't always make for a happy, peaceful land.
[39:51] Attached to those human beings are often vast stores of unloved data.
[39:57] So that's another easy low hanging fruit. What are the programs? What are the things from years past?
[40:03] So if you really get serious about making your data center efficient,
[40:08] higher quality,
[40:09] better training materials,
[40:11] possibly having provenance so that you can resell that data as training data.
[40:16] Now we're talking about deletion, but we're not talking about it because we're like good little sunshine girls. We're talking about it because we are a girl who likes a Gucci.
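One way to functionalize the "unloved data" heuristic above (data attached to departed executives or dead programs, untouched for years) is a simple inventory pass that ranks deletion-review candidates by size. The fields and records here are hypothetical; in practice they would come from an asset inventory or storage telemetry.

```python
# Flag "unloved data": datasets whose owner has moved on, or that nobody has touched
# in years, ranked biggest-first as candidates for deletion or archival review.
from dataclasses import dataclass
from datetime import date

@dataclass
class Dataset:
    name: str
    owner: str
    owner_active: bool        # is the owning exec or program still around?
    last_accessed: date
    size_gb: float

def unloved(datasets, today, stale_years=3):
    cutoff_days = stale_years * 365
    candidates = [
        d for d in datasets
        if not d.owner_active or (today - d.last_accessed).days > cutoff_days
    ]
    return sorted(candidates, key=lambda d: d.size_gb, reverse=True)

inventory = [
    Dataset("cmo_2019_campaign_exports", "former CMO", False, date(2020, 1, 15), 840.0),
    Dataset("prod_orders", "ops", True, date(2025, 6, 1), 120.0),
    Dataset("legacy_bi_staging", "retired program", False, date(2021, 3, 9), 2200.0),
]

for d in unloved(inventory, today=date(2025, 7, 1)):
    print(f"review for deletion: {d.name} ({d.size_gb:,.0f} GB)")
```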
[40:27] Debbie Reynolds: That's right.
[40:28] Michelle Finneran Dennedy: Very long-winded, I tell ya. You got me in a mood today, Debbie. I can't tell you, it's tremendous.
[40:33] Debbie Reynolds: Oh my God.
[40:34] Michelle Finneran Dennedy: Make your data Gucci. For God's sakes, people get Gucci with it.
[40:39] Debbie Reynolds: Oh my gosh.
[40:40] Well, Michelle, if it were the world according to you and we did everything you said, what would be your wish for privacy anywhere in the world?
[40:49] Whether that be human behavior, technology or regulation.
[40:53] Michelle Finneran Dennedy: God, that's such a good question.
[40:55] If you had asked me that question in 2000, when I really got into this and became obsessed, and you said let's go forward a quarter of a century, I would have said there'd be a treaty system.
[41:05] I really thought a lot about riparian rights,
[41:08] water rights,
[41:09] space law,
[41:10] admiralty law.
[41:12] We have a common enemy of pirates.
[41:14] We have common goals of, you know, not freezing to death in the darkness, as our board member says at Abaxx,
[41:21] you know, we need to get liquid natural gas where it needs to go.
[41:25] Not just because someone's paying a lot of money for it, but it's actually keeping people fed and keeping people warm.
[41:30] So when we start to think about how we govern and how we create standards, and how we make them not just standard... someone said something to me the other day that I thought was genius, about the way we talk about standards.
[41:43] It's really like, here's what I'm doing and I'm going to stick a sticker on it, and that's a standard. What we need standards to become is: this is the way that we are all behaving in an ethical manner.
[41:55] So how do we have these treaties so that we understand that water needs to get to the animals, it needs to get to the humans.
[42:03] The fish should have a say in it at some point in time.
[42:07] How do we have data that is high quality, that doesn't lie about us the way many people do?
[42:13] I wish more nice people would actually reach out to me and say, hey, I dropped your name today, because I would love to hear that. Because let me tell you what, the nasty ones,
[42:21] they're pretty prolific.
[42:23] They're willing to say bad things,
[42:25] mostly anonymously. I don't really get that anymore because I'm not on Twitter anymore. But man, oh, man, like, people get creative in their rage if they can find a person.
[42:35] So what if instead we had people who actually knew what they're talking about?
[42:39] And that kind of data is honored and garbage data,
[42:43] we are able to assess or at least predict kind of like a FICO score,
[42:47] that this is someone who's like, talking a lot of stuff and may not have the right stuff, or this is likely to be a bot, and that's why they're so prolific online.
[42:57] Maybe we don't follow that signal to design our next campaign.
[43:02] And I think on top of all of that, really, the heart of it just goes back to the beginning of this conversation, which is privacy really is the heart of AI and quantum.
[43:12] And it's our why. Every individual deserves to have their own autonomous story long after they're gone. All that's left of my parents now is their story.
[43:21] And I'm so glad and grateful that they lived when they lived, because their story is so clean and so clear. It's known by fewer and fewer people now as time goes on.
[43:32] But I wonder if that will be the case when I go. What will be my story?
[43:36] And I deserve to have a good story. I've lived. I've made a lot of mistakes like everyone else, but I've lived a good life and I've loved as hard as I could love, and I've done it with high wire risk and taken a ton of chances.
[43:48] That's valuable. And I think people that live their lives the way that you and I have, raised them, and gone out to try to teach other people and bring them into, you know, having exciting lives, that works everywhere. That works for B2B, that works for B2G, that works for B2C,
[44:04] or works at home, if you're a mom who's, like, wanting to get into privacy.
[44:07] We don't get paid as well as we should, but we have a lot of fun and we can make a lot of money.
[44:13] And we Gucci.
[44:14] Debbie Reynolds: We Gucci all the way. All the way. All the way. Oh, my gosh.
[44:19] Michelle Finneran Dennedy: I've been a little off script, I think, but no, I love it.
[44:24] Debbie Reynolds: I love it, I love it, I love it.
[44:26] Oh, my gosh. I could talk to you for hours. This is so much fun. Thank you so much, Michelle, for being on the show.
[44:32] I literally.
[44:33] Michelle Finneran Dennedy: This is like a dream for me. I was like, I'm talking to everybody today. I got my scarf out. I was all ready.
[44:39] Debbie Reynolds: This is phenomenal. Thank you so much for sharing so much, not only on this episode, but just in everything that you do. So I'm definitely fangirling you as much as I possibly can everywhere that you go and with anyone I talk to.
[44:52] So. You're incredible.
[44:54] Michelle Finneran Dennedy: Thank you. As are you. I'm so, so grateful that you're still hanging in this crazy industry. I know a lot of us are dropping out and. And a lot of us are retiring now.
[45:02] So I really. I'm so appreciative that you're still out here every day sharing and teaching and getting people to see the value in everything that we do. So thank you so much, Debbie.
[45:13] I really appreciate it. You are the Diva for a reason.
[45:17] Debbie Reynolds: Thank you. Thank you. Thank you. Well, we'll talk soon.
[45:20] Michelle Finneran Dennedy: Yes, ma'am.
[45:22] Debbie Reynolds: All right. Thank you so much.
[45:23] Michelle Finneran Dennedy: All right, have a good one. Bye.