E246 - Aparna Bhushan, Co-Host, Rethinking Tech Podcast

[00:00] Debbie Reynolds: The personal views expressed by our podcast guests are their own. Hello, my name is Debbie Reynolds. They call me the Data Diva.

[00:08] This is the Data Diva Talks Privacy podcast where we discuss data privacy issues with industry leaders around the world with information that businesses need to know. Now I have a very special guest on the show today,

[00:21] Aparna Bhushan. She is the co-host of the Rethinking Tech podcast.

[00:26] Welcome.

[00:27] Aparna Bhushan: Thank you. I'm so excited to be here.

[00:29] Debbie Reynolds: Well, I'm excited to have you here. You and I connected on LinkedIn and we actually had a chat and it was very insightful. And I love the fact that you intersect with tech policy and also I really adore the fact that you try to bring in kind of geopolitical parts of what you're talking about.

[00:48] But why don't you give us a preview of your background and how you became the co host of this podcast?

[00:56] Aparna Bhushan: For sure. And I will say shout out to LinkedIn for connecting us. It's one of the beauties of this hyper connected world we live in.

[01:04] But a little bit about me. I'm a data protection and governance lawyer by background and licensed in the US and Canada.

[01:11] But more than that, over my career I've really focused on helping organizations think critically about how they collect,

[01:19] use and govern data in a way that actually serves people. So I started my career helping organizations of all sizes. I was at a startup,

[01:28] then I went to a bigger organization like Ford Motor Company and I've also worked across tech companies to global organizations.

[01:36] And I've always done it with the purpose and the goal of bringing structure to how we handle data in ways that are meaningful, ethical and practical.

[01:45] And with that lens, it just felt like a natural next step and leap to starting Rethinking Tech.

[01:53] As you mentioned, very much focused on that intersection area of gray between geopolitics,

[02:01] data ethics,

[02:03] and I would argue in today's world, especially given changing governments around the globe, it's become all the more relevant to see how these things are connected.

[02:15] My co host and I, we met while we were consulting for UNICEF and we started having these conversations on a daily basis and we're like, you know what, let's just hit record.

[02:25] And if there are people out there who are as interested in these topics as us, let's start this conversation with them. And we really strive to provide that non judgmental space because in this ever divisive world, it's nice to not add extra fuel to fire and just talk about things from a practical perspective.

[02:45] That's what we try to do.

[02:47] Debbie Reynolds: Well, I love that lens and as we talked about before, I love the fact that you bring that in. So what I tell people, because I talk about geopolitics as well, not as much as I need to.

[02:57] I'm glad we're able to talk about it today. But it's kind of the things that are happening that are not written down that influence everything else,

[03:07] maybe the things that are not said. And I think also one of the really unfortunate things that happens in some of these heated debates, where people are so polarized, is, I have something to say,

[03:19] I'm not going to listen to what you have to say. And I want you to think like me. And I don't think that's the way we should be. Right. So it should be more of an open space dialogue, more accepting space.

[03:30] But I want your thoughts on the whirlwind of things that are happening today.

[03:36] And obviously they impact politics,

[03:39] they impact how people or companies are working in tech. And also there's that human factor. So, you know, things like whether organizations are going to regulate or enforce things,

[03:50] whether tech companies are going to take seriously not just regulation, but how consumers feel about how their data is handled. I think it's a really big deal. And then all the other things in between.

[04:02] Give me your thoughts about what's happening today that have caught your attention and has your focus right now that you think we should be thinking about.

[04:11] Aparna Bhushan: Debbie, such a good question. And literally what I grapple with every single day, because there is so much in this news cycle that I feel like things that I record one week are irrelevant two days later.

[04:22] But you touched on some really interesting stakeholders. So why don't I break it down that way? Let's look at it through the lens of companies.

[04:30] Let's look at it through the lens of users, you and I or consumers.

[04:35] And then let's look at it through the lens of government.

[04:38] And let's start from bigger, all the way down, because I like to talk about users, the most important ones I think last.

[04:44] So let's look at government,

[04:45] at the government layer.

[04:47] Everything is not making sense anymore because the rules that had been written have been completely thrown out the window.

[04:55] And it reminds me a lot of having a bully on the playground sometimes and the bully gets to decide what's being done,

[05:04] even if other people disagree, and no names to be named. But right now, countries worldwide are figuring out how to properly respond to this completely new way of playing on a completely new field.

[05:20] So what I think we're going to start seeing from a government perspective is governments are now going to either start redefining their alliances and redefining their priorities. You're either going to see them double down on things like freedom of expression,

[05:36] which is interpreted differently by different governments, making it even harder for companies and users to figure out.

[05:44] And you're also going to see them realign themselves within alliances that might seem strange or that might make perfect sense because that's what's been done historically. You're seeing this come out of the China, South Korea, and Japan relationship, which has had some negotiations, and now you're starting to see some fraying at the edges between Canada and

[06:09] the U.S.

[06:10] and the U.S. and Mexico. So we're in a weird but also really interesting time to see how these puzzle pieces are now going to fit in this whole new puzzle.

[06:20] And then you look at it from the company perspective and geez, to be a company that's global.

[06:26] And it really makes no sense right now because how do you appease one government without upsetting another one?

[06:35] So for companies to now develop internal policies and try not to rock the boat,

[06:42] it almost hurts them to have that public facing space. And it's almost better to just step back and almost hope you get swallowed into the shadows until things settle a little.

[06:51] Because the worst thing you can do right now is trigger a government or trigger an entity to then target you. And then you'll see things like the FTC coming after you.

[07:01] You'll see things like deals not being allowed, or you'll see perhaps your company not being allowed to operate in a geography altogether. We've seen this with DeepSeek in Italy.

[07:13] We're seeing this maybe with TikTok, which is a really interesting thing to talk about.

[07:17] So I do not, I do not envy companies, especially as they then look at their employees and what is it that they're going to allow their employees to do? Are employees allowed to speak out about these things?

[07:30] Because then doesn't that represent the company's interests? But if the company tells the employee not to,

[07:35] what does that mean for the company's overall culture? And we saw a lot of this coming out of Silicon Valley around election time and around the Me Too movement and especially around the Black Lives Matter movement.

[07:47] But now you're seeing companies that had one perspective that you thought you aligned with completely change.

[07:53] So culture is completely changing.

[07:56] And then the last layer, arguably the most important layer, is what does this mean for you and I, for users and when using certain social media platforms, for example,

[08:08] you have to be careful. Now you're hearing these stories about people trying to cross borders and not being careful and not knowing, and their laptops being searched and their phones being searched.

[08:19] And I mean, I have a podcast. You have a podcast. I have been told by several people to be careful about what it is that I'm saying online.

[08:26] And I believe I'm just reporting on facts, but it's a weird time. And so I don't know if I'm giving you the answer you're looking for, other than the fact that there's a lot of unknown right now.

[08:37] Debbie Reynolds: That's a great answer. Something that I talk about is something I think about that I want your thoughts about.

[08:45] So we're moving into a space politically where people,

[08:51] you know, are thinking that maybe certain administrations want to slash regulations, don't want to regulate, don't want to create new laws, especially around privacy,

[09:01] and may not be interested in enforcing laws. I think it caused some companies to pull back on privacy a bit, to their peril. And the reason why I say that is because,

[09:15] well, a couple things. One is that there's still that consumer element where if people aren't happy with what you do, you know, that could be worse than a regulation. I tell people, like, regulation is not your biggest problem.

[09:27] Like your consumers are your biggest problem. So if you don't think of it that way, you're gonna be in a bad position.

[09:33] But then also in the US on a state level and then in other government levels in different countries,

[09:40] you know, we're still seeing enforcement, right? Maybe enforcement that people don't like because it creates more complexity, because it's coming at you from different angles. So, like the example you gave, like OpenAI got fined in Italy.

[09:54] Right. Or GM has this big case against them in Texas. Right. So all these things are still happening, right, even though in the US we don't have federal regulation.

[10:06] But I think, you know, I do a talk called Beyond Regulation. That's one of the first things I talked about this year. And so,

[10:13] you know, and I talked about don't be so focused on regulation because you have all these other risks that you need to think about. And so I'll tell you a story quickly.

[10:22] I want your thoughts. So I think Kashmir Hill from the New York Times did a story about GM and them selling people's data to data brokers. And then their insurance went up and stuff.

[10:33] And so that was a blockbuster type of article. And so a couple things happened. First of all, I read through all of her articles, and I went through all the comments,

[10:43] Cause I wanted to see what people were saying. People were outraged. They were like, I've been, I've been a GM car owner for generations and I'll never, you know, use this company again and all this stuff.

[10:55] And to me that seems like a huge risk to take to make what, a couple hundred dollars a year from a person, right? You're gonna lose all this money.

[11:04] And so I actually ended up at an event, a mobility data event, with a lot of automakers and a lot of insurance companies, and it was so funny because they were like,

[11:14] after that happened, we were all saying, okay, we're following the law, quote unquote. Right? But there's more than that, right? So there's a gap between what companies do and what consumers expect.

[11:26] And a lot of them were like, well, we threw out our playbooks and we're kind of doing something differently. So you need to really think about that consumer angle. But I said all that to have you give your thoughts on that.

[11:38] Aparna Bhushan: Yeah, I mean, you hit the nail on the head and you can see it from two different angles.

[11:43] One is that laws are reactionary.

[11:47] So let's say you're doing something, you're an innovative company, you're deciding to resell your customers' data. Maybe there's a law out there against it, but let's say there isn't.

[11:54] Cool,

[11:55] fine, you're not breaching the law yet.

[11:58] But A,

[12:00] that's because laws are reactionary. So that means you might be the case-in-point example; you might be the one taken to court where, ooh, actually, new jurisprudence is going to come out of it, and how embarrassing would that be?

[12:12] But,

[12:13] and this is the most important thing, it always comes back to the consumer.

[12:17] I am always telling the organizations that I consult for that compliance is not the goal.

[12:24] You should be making data protection your superpower. Why would you not? You're looking to improve the user's UI/UX experience. You're looking to improve their ability to access their data, to understand what the heck you're doing with their data.

[12:38] You're looking to make them confident in the way that you process their data. This is only a good thing. If you are confident in what it is that you're offering, then you should be showing that off.

[12:49] So make it as easy and as simple as possible.

[12:52] If you do not want to do that, it's probably because you're not doing something that you feel all that great about, in which case, is it worth doing? Like you said, is it worth a couple hundred dollars a year per user, max?

[13:05] Or would you rather look at that lifetime value of a customer and reevaluate what your actual play will be?

[13:12] Debbie Reynolds: Yeah, totally. I want your thoughts on something that's happening in the news right now. Something's happening in the news, but I want your thoughts on this particular one. And that's the 23andMe situation.

[13:23] So, just the short version: 23andMe was a company that

[13:29] did DNA testing; people wanted to know their ancestry, stuff about their health. And then it became kind of a honeypot for governments and different companies to try to get data, try to get insights, and use it for stuff that people didn't expect, like chasing down your cousin because he stole a car or something like that.

[13:46] Or it's also the good stories about maybe family members that found other family members, which is fine. But I think and I did a video about five years ago about this and my concern was like, what would happen if this company went bankrupt or what would happen if this company got sold to another company?

[14:03] And they decided, well,

[14:05] we don't care about protecting your data, we're going to use your data for something else. And the laws currently aren't written to prevent that; basically, they can sell that data to anyone.

[14:15] Like you really have no control over that. And so that's just kind of concerning to me. It's kind of like a nightmare scenario in terms of what's going to happen next with people's data that they gave to this one company that may go to this or some other company.

[14:31] What do you think?

[14:33] Aparna Bhushan: Well, first of all, kudos to you for calling this five years ago. I wish that I had talked to you and known you then because back in 2015 confession, I fell for the marketing,

[14:45] I fell for the Black Friday deals. I was like, this is going to be a fantastic Christmas present for my family. I bought it for all of them. It was a terrible idea.

[14:54] But that being said,

[14:55] it was a shoo-in for best Christmas present that year. So at least I won in my mom's eyes that year.

[15:01] That being said, when I actually started learning about the data protection implications of this, I was flabbergasted. I mean, you just mentioned that yes, they've declared bankruptcy and now, with it, the genetic data of 15 million users is up for grabs.

[15:17] Fortunately, I'm not one of them. Neither is my family, because a few years ago, when I realized what a treasure trove of data was now at risk,

[15:26] I got us to have all of our data deleted, to the best of our knowledge. That's the other piece. Like, yes,

[15:32] we hope that the companies that say they're doing these things are actually doing these things. Although I'm fairly sure there was a security incident just a couple of years ago with 23andMe.

[15:41] So in terms of what data was actually deleted, whether it was pseudonymized, anonymized,

[15:47] you never really know unless you yourself are going to audit this organization, and good luck doing that as a user.

[15:55] But I digress. Your question was how do I feel about it? I have so many feelings about it.

[16:01] As you said, your DNA will be sold to the highest bidder. And when this company's bankrupt, that's really the only asset they have that anybody will want that's worth anything.

[16:11] And best case scenario, it goes to an organization that won't do anything harmful with it. But worst case scenario, I mean, it can go to a foreign government with questionable motives,

[16:23] it can go to those insurance companies that you just mentioned in the GM case.

[16:29] If I see my insurance premiums going up,

[16:32] I will naturally suspect that it has something to do with this data that's out there.

[16:37] And that's really, really scary. Can this lead to a different form of discrimination? Something that you and I have dedicated our careers to fighting,

[16:47] and now it's just out there. I mean, this data is just out there.

[16:50] How are we ever going to get it back? We're not. Our data, our most sensitive, valuable data is out there.

[16:56] And as best as I think I've deleted that data, I will never know for sure whether it actually is all gone, which is terrifying.

[17:05] Debbie Reynolds: Well, I'm just super tinfoil hat about stuff like this. And so I think mostly just because my career in technology started when I was trying to move people from paper or analog systems to digital.

[17:19] And I saw some of the things that people were capable of doing that you couldn't do.

[17:25] And so it got me frightened about some of the things that people are doing. And I was like, oh my goodness, you know, it'd be terrible if they did this or they did that.

[17:33] But then what I find, and I want your thoughts on this, is that I think people just don't imagine what could you possibly do with this data. Right. Cause it seems so innocent, like the DNA example.

[17:49] I don't think anybody ever thought of the use cases that they have, or that they would be allowed to even use this data for these other things. But I think part of the issue with data protection and privacy is that,

[18:04] for people who say, and a lot of people say, oh, I don't care, or I have nothing to hide,

[18:09] You know,

[18:10] those people, if I talk to them for five minutes, they're like in uproar, right? They're like, their face is red, they're like upset.

[18:17] Hey, I gave this company my data for one reason, and you're using it for another. That's totally not what people expect, right? So with the DNA example,

[18:26] people didn't expect that that data would ever be used for anything other than that. So the idea of that kind of scope creep or mission creep is so prevalent, unfortunately, in kind of a data space.

[18:38] And I think that's the part that people don't really think about. But I want your thoughts.

[18:42] Aparna Bhushan: Yes, 100%. And 23andMe is just one example. Is it highly sensitive data? Yes.

[18:47] But let's think about how users are interacting with this exact same use case in other aspects of their life.

[18:54] Social media posts,

[18:57] anything that people have posted on LinkedIn with their job updates.

[19:01] I've seen people get into issues when they're going through any online dating forms. And even if you are using things like ChatGPT or any other AI model and you haven't specifically turned that feature off, that they learn from your data,

[19:16] you are contributing to this algorithm.

[19:19] And my favorite line is that the devil is in the default.

[19:23] The default of many of these services, especially in the US, less so in Europe because of regulations,

[19:32] means that all of our data is probably already opted in unless we go through the painstaking effort of opting out, of figuring out how to opt out.

[19:42] And the reason that I think a lot of people say they don't care about this or that they're fine with their data being out there is because it's exhausting for the average person to be living in just perpetual.

[19:54] Not fear, but awareness of what this means.

[19:58] And it makes life more difficult because you're constantly questioning why certain pieces of data are asked for or where it's going. And I mean, it's not the most fun thing to talk about at a dinner party conversation.

[20:12] I mean, not for me. I enjoy it.

[20:14] But my partner has requested that I dial it down so that we can make more friends. Which is fair. Which is fair.

[20:19] Debbie Reynolds: Yeah, right?

[20:22] That's hilarious. Oh, my goodness. I know, I know. I try to,

[20:26] like, if I go to Christmas party or something like that,

[20:29] I try to just have, like, maybe one thing that I can leave people with, like the text messages you get about tolls or whatever,

[20:37] those are scams. Like don't, don't click on those,

[20:40] please.

[20:41] Aparna Bhushan: I got that yesterday. Oh my gosh. My poor grandfather was convinced he was getting a brand new iPhone for $15.

[20:48] Gave all of his information, and I was like, I have failed. I have failed as a granddaughter.

[20:52] Debbie Reynolds: I know, it's just really hard. But then that goes back to the imagination thing, right? Because we just don't imagine, first of all, that someone would know enough about you to contact you and do something like this.

[21:03] So it's just hard to think about. What's happening in maybe the regulation or enforcement space, or maybe culture, that you have your eye on? Just a curiosity.

[21:18] Aparna Bhushan: So in the regulation space,

[21:20] earlier this year we had the Paris AI Summit.

[21:25] And when big names and big organizations were attending, the expectation or hope was very much that some sort of agreement, not a formal agreement, but at least the parameters of an agreement, would come out around AI regulation and what this meant.

[21:42] And this touches on the earlier point we made about how being a company in today's day and age is hard because there's different rules and regulations and if you're making one country happy, you're making another one unhappy.

[21:54] So at least my hope when people were going into this conference was that the world leaders were going to come out of it and provide companies and users with some sort of framework so that they could keep innovating and do so within the constraints of what is in the best interest of users,

[22:11] some AI regulations, and I hesitate to say the word regulation; I didn't think of it as being anything that firm, but at least the makings of it, at least an agreement of sorts, a handshake deal.

[22:20] And so I was hopeful that that would happen.

[22:23] It did not happen. Long story short, TL;DR, that did not happen; in fact, the most opposite thing happened. Coming out of that Paris AI Summit, we see that the EU and the US, two major players in this game, are seemingly at polar opposite ends of the innovation versus data protection and AI regulation spectrum.

[22:42] And if you talk to me and ask my opinion, I don't think these things belong on opposite ends of the spectrum. I believe that innovation will be moved forward at lightning speed if you provide those constraints.

[22:54] And so the space I'm watching right now is the EU AI Act, which came into effect earlier this year, but it might be revitalized, to use that term.

[23:05] I would love to see something come out of the U.S. You and I both know that's not happening. Maybe at the state level we'll see something happening, but at the federal level,

[23:12] there's no way. We've been waiting on that data protection regulation for years.

[23:15] So anything that has evidence of an AI act, be it in Asia, be it in Europe,

[23:22] be it in Oceania, even, I'm in Brazil right now.

[23:26] Anything that provides some evidence of what the eventual global perspective on AI will be is something that I'm gravitating to. It's kind of like how GDPR set the stage for data protection way back in the day.

[23:39] Debbie Reynolds: That's a fascinating point. I'm glad you brought that up. So what GDPR did, and maybe some of my European counterparts may disagree with this, because I know a lot of my European friends, although they were happy with the GDPR, they were also frustrated that they felt like the enforcement was very slow or lacking.

[23:58] Right,

[23:59] but what GDPR did,

[24:01] as you said, it did set the stage, but it was,

[24:04] you know, in absence of anything else, it was kind of the only thing we could all hold on to. And so what happened after that is that a lot of laws and regulations, even though they didn't wholesale take on GDPR, they took bits and pieces of it.

[24:19] So you cannot even look at almost any law around data protection now and not see it have some lineage of GDPR.

[24:28] So I think that's what's going to happen with the AI Act, you know, regardless of what tweaks they make. Because I think, you know, in my view,

[24:36] it's hard to kind of dance to the tune of the jurisdiction that has the most rules,

[24:45] basically. So for me in the US, and I want your thoughts, you know, people will say, oh, we want to lead in AI and stuff like that.

[24:55] I said, well, you know,

[24:57] just because we may not ever have regulation of AI in the US doesn't mean that there won't be regulation elsewhere. And we're going to have to dance to that tune, right?

[25:09] And then from a business perspective,

[25:11] is it necessarily in your best interest to have

[25:16] different requirements or different things for different jurisdictions? Like, for example,

[25:21] so PayPal decided that for the CCPA in California,

[25:28] we're going to make sure we give these people these rights where they can opt out and do different things, and they decided as a company that they were going to also extend those types of rights to everyone in the US. And so that's what I think we're seeing more of.

[25:43] So we're going to see more of that, as opposed to, you know, you're in this jurisdiction and you have to comply with these laws, with companies saying, okay, if you want to do business with me, I need you to align with these regulations because this is how we run our business.

[25:58] But I want your thoughts.

[26:00] Aparna Bhushan: Yeah. And CCPA is a fantastic example because, at the company level, there's no way that companies have

[26:07] the bandwidth, the resources, or frankly just the interest in catering to each country, let alone each state, separately.

[26:16] It's just not feasible. You look at the GPC, the Global Privacy Control; companies are struggling with this. First they have to implement it and figure out how to do that, which is difficult.

[26:26] You have to figure out your entire backend ecosystem to read this signal, and then to make it so that only some narrow jurisdictions, going off of longitude and latitude or whatever it is, determine whether that control should be enacted.
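To make concrete what reading the GPC signal server-side can involve, here is a minimal sketch, assuming an Express-style Node backend; the lookupJurisdiction and optOutOfSale helpers are hypothetical stand-ins for real geo-IP and consent-management pieces, not any particular vendor's API.

```typescript
// A minimal sketch of honoring the Global Privacy Control (GPC) signal server-side.
// Assumes an Express-style Node app; lookupJurisdiction and optOutOfSale are
// hypothetical stand-ins for real geo-IP and consent-management integrations.
import express, { NextFunction, Request, Response } from "express";

const app = express();

// Simpler policy: honor the signal everywhere rather than branching per jurisdiction.
const HONOR_GPC_GLOBALLY = true;

// Hypothetical geo-IP lookup; a real implementation would call a geolocation service.
function lookupJurisdiction(ip: string): string {
  return "CA";
}

// Hypothetical opt-out: record a do-not-sell/share preference for this session.
function optOutOfSale(_req: Request, res: Response): void {
  res.setHeader("Set-Cookie", "sale_opt_out=1; Path=/; HttpOnly; SameSite=Lax");
}

function gpcMiddleware(req: Request, res: Response, next: NextFunction): void {
  // Browsers with GPC enabled send the "Sec-GPC: 1" request header.
  const gpcEnabled = req.get("Sec-GPC") === "1";
  const jurisdiction = lookupJurisdiction(req.ip ?? "");

  // Some laws (e.g. CCPA/CPRA in California) require honoring the signal;
  // honoring it everywhere avoids the per-jurisdiction branching described above.
  if (gpcEnabled && (HONOR_GPC_GLOBALLY || jurisdiction === "CA")) {
    optOutOfSale(req, res);
  }
  next();
}

app.use(gpcMiddleware);
```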

[26:42] It's just not reasonable, it's not feasible. So by that token, looking at it from a global scale, I would agree that the EU having the AI Act, and going off of what we've seen with GDPR, will set that baseline.

[26:56] I mentioned that I consult for IGOs, intergovernmental organizations like the UN, and those organizations are not held to legal standards the way the rest of us are. They have their own set of requirements.

[27:10] They're called policies. And I mentioned this because even for them, when I am advising them, I use GDPR as my base.

[27:19] It is what's being done worldwide, even though they're not held to the same requirements. GDPR is what the privacy world now sees, for better or worse, and I know there is some worse, as providing that standard baseline.

[27:32] And then as much as you choose to deviate, at least you understand that that's the baseline and you're making the decision to deviate. It gives you a starting frame of reference. With the EU putting the AI Act out first,

[27:45] it's fascinating because arguably they don't have the technology that would give them the right to make those regulations. And that's where the US is pushing back. They're like, hey, this is our product, we should get to decide the regulations around it.

[28:00] But by being first to bat,

[28:03] they might have just decided the game rules for the entire AI space. And I think that's where we're going to see a lot of friction,

[28:10] especially with the current administration. For them, it's like, all of the innovation, all the technology is coming from here.

[28:16] So why do we have to play by your rules?

[28:18] And we didn't see that fight with GDPR, which is why I'm curious to see how this is going to net out.

[28:26] Debbie Reynolds: Yeah,

[28:27] that's true. That's true. I agree with you wholeheartedly. So we'll definitely see. You know, maybe there'll be some middle ground there or some tweaking.

[28:36] But I still, to me, it's like,

[28:39] you can't replace something with nothing.

[28:44] Aparna Bhushan: Exactly.

[28:45] Debbie Reynolds: So I feel like you have to have something.

[28:47] So I think what they're saying is like, we want nothing.

[28:50] You have something and we want nothing. And it just doesn't work that way. So you have to have something.

[28:56] Right,

[28:57] exactly.

[28:58] Aparna Bhushan: I agree. I agree. And short of having that, we're going to default to what we have, which is the EU AI Act.

[29:06] Debbie Reynolds: Exactly.

[29:08] I want your thoughts on.

[29:10] I feel like you're the perfect person to ask this because you understand the tech, you understand the policy parts and also the geopolitical stuff. So I want your thoughts on age verification laws.

[29:24] There's kind of a lot of shenanigans that are going to be happening here. But I've been telling people, I feel like I'm Paul Revere, like the British are coming.

[29:31] I'm trying to warn people about what this means. Right. Because I feel like, I think globally, people think, oh, it's a great idea to protect kids. Right. But then we see countries try to take the different ways to do it.

[29:46] Where we have, for example, maybe the UK, they're like, well,

[29:49] we can protect children by weakening encryption for everybody,

[29:55] or we can protect children like Australia, which is like, well, we keep kids under 16 off social media. Or in somewhere like, let's say, Louisiana, I believe, a lot of them,

[30:06] some of the southern states in the US or red states, are saying, well, why don't we just have, for certain sites, everyone give their ID regardless of whether you're a kid or not.

[30:15] And so that's bonkers as well. And so what I'm telling people is that as you're creating these regulations,

[30:24] people don't understand what it will take to actually implement them. So, like I tell a parent,

[30:31] think about your kid being on social media. Like, would you give your own ID as a parent to like 5 or 6 or 7 or 10 different social media platforms for your kid to be on?

[30:41] I mean, you just have to think through what this means in terms of, like, real life. But I want your thoughts.

[30:46] Aparna Bhushan: Yeah, I mean,

[30:48] it's so commendable that governments are recognizing that this is a problem. Because yes, age verification and children being on social media is an issue if it is not properly provided with the appropriate parameters.

[31:03] And that's where the crux of the issue is, those parameters. No one is saying that children should have unfettered access to social media.

[31:11] They do right now, for what it's worth.

[31:14] And from that we've seen a huge uptick in suicide rates,

[31:19] a huge decline in social behavior learning. And that's a whole other topic that we can have a discussion about, which is really, really sad. But when it comes to how you actually enforce, and what is the appropriate way to enforce, age verification standards,

[31:37] unfortunately we have more examples of what not to do than what to do.

[31:42] At the end of the day,

[31:44] I would argue that ID verification is terrifying.

[31:48] I don't want to give my driver's license,

[31:51] oh, God knows what, my facial recognition system or something to an unknown third party to then just store.

[31:58] Because likely these social media platforms, which I also frankly do not trust, are either

[32:04] building it in house or giving it to a third party; it's terrifying. You just talked about the 23andMe example. What if these companies go bankrupt?

[32:13] How do you know how secure these companies are?

[32:15] This is not the solution in its current form. In terms of banning access to social media,

[32:22] as fantastic as that thought might be, we all know how the black market works. As soon as you take alcohol off of the market,

[32:31] that's when all the speakeasies pop up and children will find a way to get to it.

[32:37] So unfortunately, I don't think a straight up ban is going to be possible. I'm very curious as to what that technology would look like without then also requiring ID verification.

[32:46] And weakening encryption is never a good idea. No, no, no. We should be strengthening encryption. So unfortunately, I think I just answered your question by telling you all the things that people should not be doing.

[32:58] Yeah,

[32:59] yeah,

[33:00] it's difficult. I'm curious to hear how you are actually guiding the companies that you're working with when it comes to solutions. Because the ideas that I've come up with have all been around quickly deleting the photo ID after verification, so that you're not actually storing anything, but then that requires them to build out a process in house.

[33:23] And we've evaluated some third parties, but tangibly there's nothing that I've been super excited about that's scalable from startups to big companies.
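As a rough illustration of the verify-then-delete idea described above, here is a minimal sketch in TypeScript; extractDateOfBirth is a hypothetical stand-in for an OCR or ID-verification service, and the only point is that the raw ID image is never retained.

```typescript
// A minimal sketch of the verify-then-delete flow described above: check the age,
// keep only a boolean and a timestamp, and never retain the ID image itself.
// extractDateOfBirth is a hypothetical stand-in for an OCR/ID-verification service.

interface AgeCheckResult {
  isOverThreshold: boolean;
  checkedAt: string; // when the check ran, not the birth date itself
}

async function extractDateOfBirth(_image: Buffer): Promise<Date> {
  // Hypothetical stub so the sketch is self-contained.
  return new Date("2000-01-01");
}

async function verifyAgeAndDiscard(idImage: Buffer, minAge = 16): Promise<AgeCheckResult> {
  try {
    const dob = await extractDateOfBirth(idImage);
    const ageYears = (Date.now() - dob.getTime()) / (1000 * 60 * 60 * 24 * 365.25);
    return { isOverThreshold: ageYears >= minAge, checkedAt: new Date().toISOString() };
  } finally {
    // Zero the in-memory image so no copy of the document lingers; a real system
    // would also avoid ever writing it to disk or logs in the first place.
    idImage.fill(0);
  }
}
```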

[33:32] Debbie Reynolds: It's funny because I did an address for TikTok, and the topic was technology and social media for you. And someone asked me that question, and I was like, well yeah, that's like the billion-dollar question everybody wants to answer, right?

[33:46] How do you actually do age verification for children? And so one of the things, and I agree with you, if you have to collect it at all to verify the information, you should, you know, delete it as soon as possible.

[33:59] But one thing that I think companies need to think about, and this is, you know, something termed actual knowledge, is that we're seeing some cases around this where, for example, I think Weight Watchers got into trouble with the FTC for some data that they were collecting on children.

[34:15] And like, you know how all the apps say, oh, click this box if you're over 13, right? And then they assume that you're over the age. But what the FTC did with their investigation, they found that kids, as they're chatting or doing different things,

[34:28] are saying things like,

[34:29] today's my eighth birthday, I'm in kindergarten, I'm in elementary school. So they were like, as the person was using the app, they were providing additional information that would have told you that this person was not of age.

[34:43] And so I think, rather than thinking about age verification as a gate where you just say, hey, I'm over this age, I get in, I do whatever, companies are going to have to develop more,

[34:55] more technology. Because I guess to me this is an example of why technology is a double-edged sword: you're collecting so much information.

[35:06] And then on the flip side, it's like the fact that you have all that information that you can do all these other wacky things with it, why can't you use it to tell someone is of a certain age?

[35:17] Aparna Bhushan: So yeah, yeah, no, that's a fantastic point, because it's always these companies that are full of the most brilliant minds, and when we ask them questions like this they're like, I don't know, it's really hard.

[35:27] Like how do we figure that out? It's like you figured out how to get us addicted to your algorithm so that I am now mindlessly scrolling for hours. I know you are smart, you can figure this out.

[35:37] But to your point, like maybe this is a good use case for AI. Maybe this is a way that we get these AI agents or bots to work in the background and look for things like birthdays and numbers, and that's what you could then trigger for your content moderation team to review, or whatever it may be. There are solutions out there.

[35:58] I refuse to believe any company that says it's too hard, because that's you telling me that you're not that intelligent, and I know that's not the case.
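As a rough sketch of the "actual knowledge" scanning idea discussed here, the snippet below flags self-disclosed age signals in user text for human review; the phrase patterns and function names are illustrative assumptions, not a real moderation ruleset.

```typescript
// A rough sketch of the "actual knowledge" idea: scan user-generated text for
// self-disclosed age signals and route matches to human review. The patterns are
// illustrative only, not a real moderation ruleset.

const AGE_SIGNAL_PATTERNS: RegExp[] = [
  /\btoday('s| is) my \d{1,2}(st|nd|rd|th) birthday\b/i,
  /\bi('m| am) \d{1,2} years? old\b/i,
  /\bi('m| am) in (kindergarten|elementary school|middle school)\b/i,
];

interface ModerationFlag {
  text: string;
  matchedPattern: string;
}

// Returns a flag for the content-moderation queue, or null if nothing matched.
function detectAgeSignals(message: string): ModerationFlag | null {
  for (const pattern of AGE_SIGNAL_PATTERNS) {
    if (pattern.test(message)) {
      return { text: message, matchedPattern: pattern.source };
    }
  }
  return null;
}

// Example: the kind of message from the Weight Watchers discussion would be flagged.
console.log(detectAgeSignals("today is my 8th birthday and I'm in kindergarten"));
```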

[36:05] Debbie Reynolds: Yeah. Even I, as I said, went to a mobility conference, and someone was saying that they know how much water is in the wheel well of your car at any given moment.

[36:18] I'm like, surely you should be able to figure out how old someone is. I think companies have not been incentivized to use their innovation in that way. And so I'm hoping that they spend time to do that because I think it's a very important thing to do.

[36:33] And then I think.

[36:34] And I want your thoughts on this as well. I think one of the big problems we have with age verification is that we're trying to take something that happens in a physical world and try to recreate it in a digital world.

[36:46] And I don't think it works, right? So I think, you know, digital is different, so we have to think about our digital space differently.

[36:54] Aparna Bhushan: Yeah, for sure. And unfortunately, there is no silver bullet solution. I think you actually addressed this in the Weight Watchers example, which is that it will be very specific to each organization.

[37:06] Does that mean that no organization should go through it? No, it just means that every organization needs to figure it out, the same way they have a unique product or service or offering.

[37:15] This is the exact same: how do you make this make sense for you?

[37:20] Debbie Reynolds: I think some companies haven't invested there because they may not see that return on investment, but maybe the return on investment is that you're not losing people. Right. Or that you're providing more of a safe space where parents feel comfortable sending their kids these things.

[37:36] So,

[37:36] yeah, a lot to think about,

[37:38] for sure.

[37:39] Aparna Bhushan: For sure. It's a fascinating space. I think we'll see a lot of development. I hope to see a lot of positive development in the coming year.

[37:47] Debbie Reynolds: Definitely.

[37:48] So if it were the world according to you and we did everything that you said, what would be your wish for privacy anywhere in the world? Whether that be regulation,

[37:57] human behavior, or technology?

[38:00] Aparna Bhushan: Oh, good question.

[38:01] I'm gonna cheat. I'm gonna answer them two ways.

[38:05] So in terms of human behavior and technology.

[38:08] So human behavior, I would love for people to just be more conscious when they're making the decisions they're making. Not just to give data because a blank form or someone's Google Doc is asking for it, but to really question the purpose of it.

[38:22] And then if they decide that they're fine to give their Social Security number, which I really hope they're not giving to a random person,

[38:28] you know what, at least they're aware. They're making that conscious decision.

[38:32] So many times we're unconsciously on autopilot that we're not actually thinking about the repercussions and the real world implications. So that would be my one beg,

[38:41] my one desire, my one ask of the audience and of the world.

[38:45] And then the second, on the technology piece, we were just talking about what really interesting and really intelligent technologists can do.

[38:53] Is there any way for us to now build, as an innovation, AI auditing software that can audit AI? So that when these models are being developed, when these models are going out there and collecting data, we ensure that they're not biased, ensure that they're not discriminatory, ensure that everyone is equally represented and that personal information is not misused. That would be a dream come true.

[39:17] I'm hopeful that there are some really intelligent people out there using it. But using AI to audit AI,

[39:24] great technology.

[39:27] Debbie Reynolds: I agree. I think we need to be more creative here and spend some more time and money on this problem because it is a hard problem. And yeah, I love the fact that you cheated with your two answers and I agree with them both.

[39:39] So it has to be like, should I give my biometrics for a $10 coupon to Amazon? Probably not. It's probably not worth it. Right? But as long as people have that transparency, then they can make, like, an informed choice.

[39:54] Aparna Bhushan: Exactly.

[39:55] Debbie Reynolds: Excellent. Well, thank you so much. This has been so much fun, oh, my gosh. To be able to talk with you today. And I look forward to us being able to collaborate in the future.

[40:05] Aparna Bhushan: Thank you so much for having me. I really enjoyed thinking about these things from your perspective and my perspective, and I can't wait to see what everyone else thinks.

[40:12] Debbie Reynolds: Yeah, me too. Me too. We'll talk to you soon. Thank you.

[40:15] Aparna Bhushan: Thank you. Thanks.
