E249 - Marlyse McQuillen, IntegraConnect LLC - Vice President, Regulatory Compliance, Privacy and AI
[00:00] Debbie Reynolds: The personal views expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.
[00:12] Hello, my name is Debbie Reynolds. They call me the Data Diva. This is the Data Diva Talks Privacy podcast where we discuss data privacy issues with industry leaders around the world with information that businesses need to know.
[00:25] Now,
[00:26] I have a very, very special guest today on the show, Marlyse McQuillen. She is the vice president of regulatory compliance, privacy and AI for IntegraConnect. Welcome.
[00:38] Marlyse McQuillen: Thank you so much for having me, Debbie. It's really a pleasure to be here.
[00:42] Debbie Reynolds: Well, it's exciting to have you on the show. You and I have been connected on LinkedIn for a long time and you always send me really cool things,
[00:52] maybe things that I may not have even seen or been aware of. So it's really great to be able to have you on the show.
[00:58] Marlyse McQuillen: Well, I'm happy to be here. There's so much to discuss.
[01:01] Debbie Reynolds: It's a lot to discuss. Well, let's first talk about your career journey and what got you interested in privacy as a career.
[01:12] Marlyse McQuillen: I was working on an IPO as a young lawyer at Greenberg Traurig for a client who was in the middle of diligence by an investment banker. We brought in an expert to talk about the Children's Online Privacy Protection Act.
[01:27] And it just kind of came up, and I was really interested in it. So I stayed in private practice for another couple of years.
[01:37] Privacy has always been part of my job as in-house counsel.
[01:44] And I've come full circle: I've started to volunteer for the Plunk Foundation with John Kavanaugh, which is dedicated to the online privacy of marginalized communities,
[01:56] human trafficking victims trying to get help, people living below the federal poverty line. When I was first in house,
[02:08] I started at Onity Group, which had bought out the distressed loans of most of the major lenders. Part of President Obama's refinance plan to get the country back on track involved trying to get borrowers to refinance under the HARP program.
[02:38] And so I worked really closely with our marketing folks and essentially became the privacy expert.
[02:38] Built out the privacy program there.
[02:41] And as you know,
[02:42] Gramm-Leach-Bliley Act and FCRA/FACTA privacy notices were part of it. I put together matrices. We were a state-licensed entity, so I was tracking everything from voice recording issues to state breach notices dealing with privacy.
[03:00] And then I put on my commercial hat, moving over from regulatory, and went to Johnson Controls, and in that capacity was negotiating security agreements as a vendor for pretty much all the Fortune 500s.
[03:16] Because at that point the companies were preparing for GDPR and CCPA to come in.
[03:21] And as you know, part of the requirements are to have physical safety and safeguards in place to protect your company's data centers, in order to protect the data that you're collecting and processing.
[03:37] And in that capacity I also took on the general counsel role of a company they had just acquired, a blend of a surveillance device manufacturer and a SaaS company that provided retail analytics as brick-and-mortars were trying to fight for their margins competing against Amazon.
[04:00] And so the physical device was essentially like a clicker, like a ticket taker,
[04:05] and it had a pseudonymized image of people coming in. Basically, none of that was shared with the people.
[04:17] But we had analytics and retail experts who were teaching stores when the best staffing times were. So it was a great experience. I got to negotiate against,
[04:30] you know, Meta and other organizations that wanted to engage in data sharing, pilots, et cetera. And then I had an opportunity to come in to IntegraConnect and help the general counsel and chief compliance officer build out the HIPAA program.
[04:46] The company does various things, and we're HITRUST certified at this point;
[04:53] I helped out with that.
[04:54] We have patient data, we de-identify it, and we do analytics to help pharma understand the efficacy of the products they're building and inform that.
[05:07] And then we also had an EHR that was for specialty practices.
[05:12] And so we had a lot of sensitive data. I'm just figuring out my place in the sun in terms of where I want to go next.
[05:22] There's so much to do there.
[05:24] Debbie Reynolds: Yeah, there's a lot of ground to cover and a lot of very sensitive things. I think it's really interesting that you have worked on probably some of the harder privacy issues we have in the US, like COPPA and HIPAA and what I call HIPAA-adjacent, where it's not really HIPAA,
[05:42] but it's still covered by some consumer laws in terms of data handling and transparency.
[05:50] Marlyse McQuillen: Well, it's really critical because, I'm not sure how aware you are, but Medicare, which in a few decades
[05:59] I'll be the recipient of, and I want this program to last a very long time, is changing the way that it compensates doctors from fee-for-service to value-based care.
[06:10] And so they're taking a more proactive approach to preventative medicine. And, as you say, there are the HIPAA-adjacent industries and platforms that aren't really covered by HIPAA, because it's not a privacy law,
[06:21] let's be honest;
[06:22] it's got privacy rules that have since come into play. But that said, there's the digital health data that we're all dealing with,
[06:31] particularly in the 23andMe bankruptcy. To the extent that the shift to value-based care is going to be effective,
[06:39] the doctors and providers are going to have to rely on tools that make people take more of a proactive approach to their health, and there's going to be low adoption if people think that that data is not going to be protected and that ultimately their insurance premiums are going to go up, or who knows what.
[06:58] We know we're being surveilled.
[07:00] Despite my best efforts, my aunt, uncle, and cousins all wear Fitbits. They're competitive, they try to get their steps, and they're very healthy people.
[07:12] But ultimately I've tried to tell them that this isn't protected. You've got to make some choices.
[07:18] And there isn't really a lot out there indicating that these wearables' digital health data is covered.
[07:26] Debbie Reynolds: Yeah, exactly. The point that you brought up is a really good one, because I think a lot of people have a mistaken view about health: they think that almost anything about their health has the higher protection people expect with HIPAA when they go to the doctor.
[07:47] And it really doesn't.
[07:49] But I want to talk a little bit about 23andMe.
[07:52] And so this is a case you've been covering and watching very closely, so you're very aware of what's happening there. To me, 23andMe is the ultimate example of what can go wrong with your data, especially when it's outside of that HIPAA-protected environment.
[08:13] But can you tell me a bit about what happened, where we are now, and what's going on with the lawsuit and their financial status right now?
[08:22] Marlyse McQuillen: The bankruptcy was declared in March, and the timeline has been very, very quick.
[08:29] For those who aren't in the legal field,
[08:33] when a court moves really quickly, we call it a rocket docket.
[08:37] And there's been a lot of speculation about why the bankruptcy was filed in Missouri, in St. Louis.
[08:44] And part of that is a rocket docket. So I think that the sale has moved forward much faster than people expected.
[08:52] The bankruptcy law actually changed as a result of a Floridian, formerly a GT colleague,
[08:59] Luis Salazar, who helped change the bankruptcy law so that a consumer privacy ombudsman would be appointed to advise the court in cases where data was the primary asset.
[09:13] And in this case there was a little bit of back and forth. But conveniently, Neil Richards,
[09:21] who's at Washington University in St. Louis and is a premier scholar in the area,
[09:27] was appointed, and he put forth a report indicating that the sale did not pass muster, for various reasons, based on the different iterations of the privacy policy in place. And generally, they had been very eager to assure people that they had control over their data,
[09:48] which I would still say is right.
[09:50] Of an initial 15 million customers, I believe at least 2 million have now deleted their data since the bankruptcy was announced.
[10:00] But Professor Richards worked really quickly and got the report in on a very tight deadline, saying that under state-specific genetic privacy laws there needed to be express consent. You couldn't just sell it.
[10:14] And so the court
[10:17] looked at it, and there were some issues with it. There were two main bidders:
[10:21] TTAM Research Institute,
[10:24] which is Anne Wojcicki's company; she's a co-founder and former CEO of 23andMe,
[10:32] and Regeneron Pharmaceuticals.
[10:36] There were some irregularities in how the bidding went, because ultimately it was reopened. After Regeneron won the initial bid, she, at TTAM Research Institute, won the second round: they went back and issued a higher bid, which, you know, under bankruptcy law you have a duty to the stakeholders and the stockholders and creditors to maximize the recovery.
[11:05] So anyway, that aside,
[11:07] the judge pretty much rubber-stamped the sale of 23andMe to TTAM Research Institute and essentially disregarded the report while saying he was giving it due regard. He
[11:21] said that regular notice and consent wasn't meaningful, that there are people who haven't logged in in years and years since they initially got their results, and that there were people who were deceased.
[11:33] And so essentially, in a bit of legal contortion,
[11:38] the judge said this isn't a true sale because, you know, the data is residing on the same servers.
[11:46] There's, you know, a succession of leadership in terms of the CEO of 23andMe and now TTAM, but it's treated as a sale for the purpose of allowing the debts and liabilities to follow.
[12:00] And I haven't even mentioned the data breach that happened in 2023 that basically careened them toward the bankruptcy. I mean, there's a lot of speculation about why they
[12:13] were in this financial position. Part of it was really that their business model was
[12:19] kind of one-and-done, not a SaaS model, which the market seems to like more. And so there was a concerted effort to try to take it private. And so now she's been successful:
[12:30] Anne Wojcicki has an approved sale.
[12:34] But the five states that have genetic privacy laws on the books are understandably upset at the legal contortion that the judge engaged in,
[12:46] saying this isn't actually a sale, so these laws don't apply.
[12:51] And so they're appealing it. Eyes are peeled. I'm trying to do my best for LinkedIn and the community to keep tabs on what's going on, but those appeals are out there and being considered for the rest of the states.
[13:08] The successful bidder, let's say,
[13:10] made a lot of concessions that spoke to what Professor Richards put in his report, in terms of it being a nonprofit and in terms of future bankruptcies, because
[13:26] the class actions related to the cybersecurity incident continue.
[13:31] And so this might end up back in bankruptcy, who knows?
[13:35] But I think it's really something we need to keep our eye on. There have been discussions about a federal law that would prevent this,
[13:45] but essentially, ever since the dot-com boom, you know, data has been part of company assets.
[13:52] Once the digital economy came into being and that data was constantly collected,
[13:58] this has been part of it. And so with all of the big bets that venture capital is making now on AI-powered instances and chatbots and the like,
[14:09] we're going to see a lot more of this data out there. And so we'll see.
[14:14] Yeah, I would expect a lot more people to start filing bankruptcies in Missouri for the moment instead of Delaware.
[14:23] Debbie Reynolds: So a couple of things strike me about this particular case. One is that, because of the sensitive nature of the data they had, people were really losing trust in this company. For example,
[14:39] I had a family member who did the 23andMe kit, and what they really wanted to know was more about their heritage, right? Especially for Black people and people of color, where we may not have a long lineage we can trace, learning about our ancestors has been very attractive to people that I know.
[14:58] And when people gave their information, what they didn't expect is that law enforcement would be using it for different things, to track down different people. And then the breach that happened, which you talked about, that's another hit to people's confidence in the company.
[15:15] And as that was happening, I could see the valuation going down. People wanted to use them less. And then when the bankruptcy hit, like you said, a lot of people wanted to delete their data, and a lot of people had questions.
[15:29] And so I think what this case did was really bring up that huge loophole or gap that we have around data, especially if a company goes bankrupt or gets sold.
[15:40] Right? Because people are like, well, if it gets sold to company X,
[15:45] I don't know company X, and I don't trust them, and I don't want them to have my data. So yeah, I think it's a huge, huge issue.
[15:53] Marlyse McQuillen: Well, I mean, big tech. Anne Wojcicki is the ex-wife of one of the founders of Google, so she's in the Silicon Valley world. So people have a lot of concerns about whether this data is going to be sold, de-identified, to basically train the LLM models, because it's big bucks.
[16:13] They can scrape the Internet, but for clinical data there's a level of protection and a limited amount of data that can actually inform these systems. But your point is well taken.
[16:25] Regarding the people who submitted: I myself submitted, not to 23andMe but to Ancestry.com, and had a huge revelation that my father wasn't my biological father, that
[16:40] my parents had fertility issues, and it had a huge impact.
[16:47] And with fertility issues particularly, there are a lot of people who are single mothers by choice, and they want their kids to also have some idea of their medical histories. Consumer expectations are paramount here, because people think that because they're doing this in service of their health, or because part of a sample is given, it's somehow covered as more sensitive data.
[17:17] And it's not. And I think we needed a federal act yesterday to cover this.
[17:23] Yeah,
[17:24] I've got some ideas on that front.
[17:30] I think we had discussed this in the past; Heidi Saas has too, and I agree. I think that the Fair Credit Reporting Act needs to be amended so that essentially anybody who's selling or sharing data has to be treated as a presumptive data broker or credit reporting agency unless they,
[17:50] you know, meet a safe harbor of minimum standards.
[17:53] Debbie Reynolds: I agree. Yeah, that would be the easiest way to do it.
[17:59] Marlyse McQuillen: Yeah, it's much easier to amend an act than to create something out of whole cloth. So that's what I would do. I mean, I've called Lois Frankel to try to suggest it; she's my representative here in Boca Raton, but the Democrats don't have a lot of power now.
[18:19] Her assistant said, I can suggest it, but I don't know how this is going to get any traction.
[18:27] Debbie Reynolds: Well,
[18:28] I want to talk about something else.
[18:30] I typically don't do pop culture, but this is something that's taken over the Internet, and this is Coldplaygate.
[18:37] This is where a couple at a Coldplay concert in Massachusetts had a reaction to being on a kiss cam that made people pay attention and start investigating these folks.
[18:53] And so there's been some fallout already.
[18:56] The CEO of the company resigned. And now that the company has had all this attention, they're actually riding the wave of it and so forth.
[19:07] But this is,
[19:09] I don't know what to say. I've seen a lot of memes here. I guess I want your thoughts:
[19:17] is this a privacy story?
[19:19] Marlyse McQuillen: I think it's a cautionary tale,
[19:24] but it's not necessarily a new privacy story.
[19:30] For my own part, I think I posted recently that I went to high school with the CEO,
[19:37] and this was at Foxborough Stadium, a place where we've been going since we were kids in high school.
[19:48] Kiss cams aren't a new thing.
[19:51] Perhaps some of the issues related to the advances in facial recognition technology might be a new story,
[20:01] but essentially this has rippled through the Internet. I think I even saw a report connecting things, because I looked up my high school.
[20:13] I didn't know Andy Byron well,
[20:16] but it was a small high school that we went to, so people knew to be discreet. Right? Otherwise the entire town would know all your business.
[20:24] And so I think it's just another cautionary tale: at the scale of the world we have, it's a tale as old as time that there's gossip and you need to be discreet.
[20:36] But it's based on how connected we are and how much data and metadata is being collected, and how these stories can then spread:
[20:45] you know, one moment essentially caused the picture to be picked up. And then people can easily link that to where you work,
[20:57] and then they can check your LinkedIn and see what your position is. And then people who get really in the weeds can connect to people that are connected to them and get dirt on them.
[21:10] And you know, I even saw something where somebody who lived on Lori Lane with them said that he got into a fight with him when he was younger.
[21:19] The person wasn't named. But everyone can pile on, and it just becomes way more of a conversation and occupies the attention. I don't know how much this is going to continue.
[21:33] Hopefully it will die down. But there are the ramifications that the Byron family are going to be dealing with, and
[21:39] the divorce.
[21:41] There's plenty of data for his wife to act on in the divorce. So I do see it as a privacy story, based on the scale and the number of people involved and how global it was.
[21:54] But it's also sort of
[21:57] a sense of what the town platform is, right? You come from a small town; now the world's become a small town.
[22:06] So you've got to be way more careful.
[22:09] And part of that is not necessarily kiss cams, but now everybody has a kiss cam in their pocket. And so I looked through my yearbook. It's not digitized, thank God, but there's a lot of detail that people shared back then.
[22:23] And I think that it's really,
[22:26] especially as a parent, an instructive lesson on being discreet,
[22:31] making good choices, and also realizing that a moment, a lapse in judgment, can impact so many more people than you think it can. So act accordingly.
[22:45] Debbie Reynolds: Yeah, I hadn't thought about that in terms of the ripple effect on all the other people who were not participants in the kiss cam but are impacted by it. So it's the company, it's the families, it's the town, in your case.
[23:01] And so it is really interesting. But then, in my view,
[23:06] people need to know that when you're in public,
[23:11] you don't really have a reasonable expectation of privacy. And I think the thing that has happened that people
[23:19] have been kind of shocked about is that it kind of snowballed. Right? It went from this public display to all these other tentacles of their lives,
[23:30] whether professional or personal.
[23:33] And then the story doesn't seem to be ending; it's tabloid-esque at this point. So it'll be the divorce and all this other stuff that leads on from that.
[23:43] Right,
[23:44] exactly.
[23:45] Marlyse McQuillen: I mean, I think it's the discourse: a lot of people just want to pile on with
[23:51] what they would have done in that situation. And it really exacerbates the discussion and, I think, some of the pain that these people have in their personal sphere.
[24:04] I think my brother said that there was an image of someone else from the company, the second in command in HR, and she had some sort of grin when she noticed someone else had taken a picture of her.
[24:16] And so there's speculation about, oh well, she knows she's going to step into that position, or, you know. It just makes us all sort of voyeurs.
[24:27] And so,
[24:28] you know, there's
[24:29] the reasonable expectation of privacy. Arguably, and I don't want to be a legal nerd here, but when we talked about cars, that was the beginning of it, with the Fourth Amendment and
[24:43] the lack of a reasonable expectation of privacy in your car.
[24:47] Debbie Reynolds: Right? Yeah, well, cars are totally different. That's a whole other ball of wax. But I guess the moral of the story is, don't go to stadiums if you're doing something you shouldn't be doing.
[24:59] Probably.
[25:01] So this is not necessarily a kiss cam story, but over the years,
[25:06] you know, I'm in Chicago and we're a sports town, so over the years we've seen things happen to people who were in stadiums and ended up on camera.
[25:14] Someone saw them. Like, John called in sick, but he's at the Cubs game. So these things happen.
[25:21] But this is probably on a massive global scale. Still, don't call in sick and go to a stadium, because someone's going to see you on camera. You're going to get caught.
[25:30] Marlyse McQuillen: So you have to use common sense. Right.
[25:35] Debbie Reynolds: Gotta wear a mustache and glasses or something, hide your face or something like that. Exactly.
[25:44] Marlyse McQuillen: Yeah.
[25:46] Debbie Reynolds: What's happening in privacy, technology, or the world right now that's concerning you most?
[25:53] Marlyse McQuillen: Well, I think the HIPAA-adjacent space is concerning me.
[25:58] But with respect to that, I think there's also a question of, you know, even for people who are trying to do it right, the technology is moving at such a fast pace. With de-identification under different standards, there are questions about how you
[26:16] de-identify the data in a way that it can't get re-identified.
[26:20] And so that's something I'm sort of just professionally keeping my eye on.
[26:25] Personally, I have a son who just turned 11, and he's not allowed to use any social media. But I'm very concerned, as he goes into middle school, and he's going to a high-tech middle school, about what kind of technology they're being exposed to and what kind of data they're going to be collecting.
[26:51] And so I'm always that mom, right?
[26:54] I've got the personal hat on, but I'm always the person saying, no, you can't take a picture of him for the yearbook,
[27:00] I'm not giving my consent for this or that. And what I'm really concerned about now, particularly in light of Trump's executive order on AI in K through 12, is
[27:13] what kind of data is going to be collected and what kind of safeguards are going to be put in there.
[27:18] And so I've been working on curriculum. My son's in Scouts,
[27:23] so I've been working on a curriculum, actually with John from the Plunk Foundation, on AI orienteering, to teach them about safe, responsible use of AI and evaluating the output while being considerate of the input. They already have, like, a digital online safety merit badge.
[27:42] And so that's another thing I've been working on, because it's really hard to think about all the things that are going on. And I'm always that person at the party talking about the last book I read, whether it's Careless People or Empire of AI, telling people to read it and to wake up and think about it.
[28:01] But I'm tired of complaining,
[28:03] so I want to do something about it. I actually wrote a song,
[28:07] The Rights, They Are Fading, that is supposed to raise awareness about what's going on, urging people, as potentially an anthem, to take a look and get more involved.
[28:20] Because ultimately, I think, people hear privacy's dead or privacy's on life support. We need to be careful, because history repeats itself. There's a reason Europe is the way it is. I don't know if you've been over to Europe or gone to any of the concentration camp sites,
[28:39] but I've been to Dachau, and I looked at all of the detailed information that was kept. History repeats itself. Bad things can be done with this data.
[28:51] And I think that we're all engaged in a cost benefits analysis constantly about whether the convenience is worth giving up the data.
[28:59] I got a new phone the other day, an Apple phone, and I haven't done it yet, but I was going to do a YouTube series going through the terms and conditions, because there's a South Park episode that jokes about what you're agreeing to.
[29:12] Yeah, aside from what I do professionally, personally I'm really trying to step outside of the echo chamber of data protection professionals and, just on a larger scale,
[29:25] educate people as to what we should be concerned about. Because we've got writers and creators, particularly, putting out their ideas and their thoughts, and then all of it getting scraped and then used in the algorithm, with no value being provided in kind or in return.
[29:43] So I think we're at an inflection point, and I think that the more people get involved, get active, and start asking questions, particularly with the cognitive offload of people's reliance on generative AI,
[29:58] the better. We've gotta be asking these questions.
[30:02] You know, I think there's a sense of defeatism.
[30:05] But it's not done, the digital coup;
[30:09] it's not a fait accompli. So we've got voices and we can use them. We've got pocketbooks and wallets, and we've got our eyeballs, and we can move them. We can move our feet to different platforms.
[30:23] A friend of mine from college, Baratunde Thurston, is big on using citizen as a verb. We're citizens, and we've got to think about the society that we want and the society we want our kids, nieces,
[30:40] nephews, and friends' kids to inherit.
[30:42] And so I think we're at a very important time, and we've got a responsibility, particularly us privacy professionals, to speak up. I mean, the certifying authority, the IAPP,
[30:54] is policy neutral, and we've already seen the social media devastations of the 2010s.
[31:02] So I think a lot of people have been quiet,
[31:05] quietly working in the background, heads down, trying to keep their companies on the right side of the law.
[31:11] I think that it's kind of a full contact sport. And so whether you're a parent, whether you're an aunt or an uncle,
[31:19] we've got to be protective of our kids. I mean,
[31:22] you see a lot on social media; there are kids whose data has been recorded from day one.
[31:28] I mean, I'm ashamed to admit it, but my brother even took a picture of my son's crib in the hospital, and it had his full name,
[31:37] his birthday. He put it on Facebook and I snapped at him,
[31:42] said, take that down.
[31:44] But ultimately, it's not a difficult inference when people are putting these long Facebook posts about their kids and pictures through the years, and there isn't protection for us in the US against,
[31:57] like, those pictures being used. You're maybe gonna see a video made from them, and there's nothing out there yet. So I have a long privacy wish list. I know that's one of your questions.
[32:09] Debbie Reynolds: Yeah, you know, I was going there.
[32:13] So if it were your wish,
[32:16] if you had your privacy wish, what would be your wish for privacy or data protection anywhere in the world?
[32:23] Whether that be regulation,
[32:25] human behavior, or technology?
[32:28] Marlyse McQuillen: Just one, or can I do two?
[32:30] Debbie Reynolds: You can do more. As many as you want.
[32:32] Marlyse McQuillen: Okay. Well, I think there's one, right, because the way that data is currently being processed, the legal bases in the US and in other regimes, is putting a lot of privacy work on us, the individual.
[32:50] And so I think, in order for people to make those cost-benefit analyses,
[32:56] in tandem with the AI K through 12 order, there should be a federal education mandate that teaches basic consumer protection,
[33:05] including financial literacy, right,
[33:08] but also privacy, digital hygiene, and kind of AI orienteering.
[33:13] I'm trying with the Scouts, but I think that should be on a federal level, so that they're being taught to ask the right questions.
[33:20] Inasmuch as they're being taught to use AI in a way that amplifies their productivity and their ability to create and add value to the world, I think you need to continue to spark critical thinking and empower people to not just take things at face value.
[33:42] So that would be like my number one.
[33:45] And then, along with amending FCRA and FACTA,
[33:49] I would say that, with respect to cyber incidents,
[33:54] all over the world, for the companies that run afoul of this, I mean, everybody says
[34:02] it's a question of when, not if, you're being targeted.
[34:07] But I think apologies and credit reporting are not enough.
[34:12] And so I would just make a call-out to those enforcement authorities to take a look at some of the HIPAA settlements with the Office for Civil Rights and include more things, like data inventories, in tandem with credit reporting.
[34:28] Because once the data is out there, it's out there.
[34:32] So is two years enough? Maybe it needs to be lifetime.
[34:36] Right. That would be a lot more expensive.
[34:39] Debbie Reynolds: Right.
[34:39] Marlyse McQuillen: So you would see a lot more companies get things in line.
[34:43] Debbie Reynolds: Um, yeah,
[34:44] that's true. That's true. Especially because of the harm in something like 23andMe. I'm sorry, we're totally talking about this company, but it's just such a stark example of the bad that can happen.
[34:58] I think when they had the breach, they were saying, well,
[35:04] we'll give you one year of credit reporting, or no, three years; they said we'll give you three years of credit reporting. I'm like, but if my DNA information has gone out somewhere, that harm could be for a lifetime.
[35:18] Right? And plus, is DNA on your credit report? I don't think so. So what is credit reporting going to help me with?
[35:29] Marlyse McQuillen: I think there was some speculation that the hackers were trying to find out whether people were Chinese, like post-pandemic discrimination, in order to target them, and figuring out whether people were Ashkenazi Jewish. So the harm could be way bigger.
[35:50] I think that specific data was really mined to put on the dark web. And I don't even want to start my overactive imagination on what that kind of data could be used for.
[36:02] But sorry. Credit reporting is the usual sort of privacy theater; I think Neil Richards calls it that, privacy theater, or data protection theater. You go through it and then that's it,
[36:17] and that's said and done.
[36:19] And honestly, it goes back to health data,
[36:23] HIPAA-adjacent data.
[36:25] A risk-based analysis, right?
[36:29] A risk-based analysis under the HIPAA Security Rule
[36:35] would have stopped the credential stuffing that essentially, you know, resulted in the breach.
[36:41] Right. I'm Monday morning quarterbacking on this,
[36:44] but I think that companies,
[36:47] whether it's soft law or they're being guided by hard laws that have enforcement impact,
[36:55] need to take a look at this case to see what it means for consumer trust, especially in the digital health platform space, and, even if it's not covered by HIPAA, act accordingly.
[37:07] I mean, the FTC recently awoke from a slumber of, I think, 10 years and used the Health Breach Notification Rule to go after a couple of key figures,
[37:20] but I haven't seen much there since.
[37:22] Debbie Reynolds: Yeah, that's a hidden one that they don't use and they should use. Right. So definitely.
[37:28] Marlyse McQuillen: Well, the FTC, I think in general,
[37:32] I'm trying to be optimistic; it doesn't cost extra. So I have a lot of faith in them, that they're going to start some cases and investigations that really protect us.
[37:46] And whether new laws come out or not, there are a lot of anti-discrimination and consumer protection laws on the books. For instance, in Massachusetts, the Attorney General's office has come out and said: we may not have an AI law, but to this technology,
[38:00] our old laws apply.
[38:02] And so I think that data privacy and protection professionals and AI governance professionals need to brush up on some of the consumer protection provisions in the existing framework.
[38:16] And that's just talking about the U.S. I mean,
[38:19] globally, in terms of having a framework, we're lucky with privacy. GDPR is kind of the gold standard and,
[38:28] together with the fair information practice principles, the FIPs,
[38:31] gives us kind of a sense of what's at stake.
[38:33] And so I don't try to track all of it globally, because there are little differentiations, but we've got that. I think that the EU AI Act is going to serve as a similar model.
[38:47] Texas's initial AI act came out and was set to be EU AI Act light.
[38:56] But through the legislative process it got pared down substantially, and it basically became just a list of prohibited practices.
[39:05] But that's somewhere to start.
[39:07] And hopefully before the midterms we can get some base minimum protections in place,
[39:13] hopefully for deep fakes as well, because I think that that's going to have a huge impact on our elections.
[39:20] So.
[39:20] Debbie Reynolds: I agree. Lots of work to do. Lots of work.
[39:23] Marlyse McQuillen: So much. It's such an exciting space to be in.
[39:27] Yeah. I feel blessed that this is my job and I get paid to do it and track it. It's a lot of work,
[39:34] the privacy work as a parent,
[39:36] as a professional. But I love it.
[39:39] Debbie Reynolds: Yeah, me too. Me too. Well, thank you so much. I'm so excited that we were able to chat, and I hope people definitely follow you on LinkedIn. You always post really thoughtful and really encouraging things.
[39:53] And I'm gonna wait for your song to come out.
[39:56] Marlyse McQuillen: I can't wait.
[39:57] Yeah. But I hope somebody actually sings it. For now I have it on Suno, which feels a little weird. I hope you guys enjoy it.
[40:19] Debbie Reynolds: Yeah. Thank you so much. It's been a pleasure to have you on the show, and we'll talk soon.
[40:24] Marlyse McQuillen: Sounds great. Bye, Debbie. Thank you.