E155 - Beatrice Botti, Vice President, Chief Privacy Officer, DoubleVerify
57:53
SUMMARY KEYWORDS
privacy, data, companies, work, credit score, people, internet, law, transparency, read, issue, understand, real, technology, conversation, transfers, patent litigation, beatrice, decisions, localization
SPEAKERS
Debbie Reynolds, Beatrice Botti
Debbie Reynolds 00:00
Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations. Hello, my name is Debbie Reynolds; they call me "The Data Diva". This is "The Data Diva" Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world with information that businesses need to know now. I have a special guest on the show all the way from Boston, Massachusetts, Beatrice Botti. She is the Vice President and Chief Privacy Officer at DoubleVerify. Welcome.
Beatrice Botti 00:39
Hi, thank you.
Debbie Reynolds 00:42
Well, I'm happy to have you on the show. You and I have been chatting for quite some time while trying to get you on the show. And then I ran into Jeff Jokisch in Washington, DC, at IAPP, and he told me he was going to meet you, so I said, tell Beatrice I want her on the show. So that's how we ended up here.
Beatrice Botti 00:59
Well, it's an honor to be here.
Debbie Reynolds 01:04
Well, it's great. I follow you on LinkedIn. I'm glad we're getting to know each other now; I sort of LinkedIn-know you. But I'm always fascinated by you, the things that you say, and the types of commentary that you make. Because I feel like you're a truth-teller, like me; you sort of tell it like it is, and that's what I really like about you.
Beatrice Botti 01:26
I appreciate that. I certainly try. I don't like a lot of the posturing. It reflects in my negotiation style when I do contracting sometimes, and obviously, I do privacy and security documentation. And there'll be several times where you'll get on a call with one of, you know, the type of lawyer I'm talking about, and they'll be like, well, in my 30 years of experience, this has never happened to me. And it's like, uncapped liability? I'm like, yes, it has. People ask you to cap liability; it's totally fine. You don't need to do that. We can make it through or not; maybe this will be the first time you agree to a liability cap, and we'll explore this experience together. All good.
Debbie Reynolds 02:06
Yeah, I think it's hard to tell a data person about your gut feelings, right? It's like, yeah, I don't think so.
Beatrice Botti 02:12
Yeah, I know. It's just, I think there's also a lot of hedging in the industry, not out of malice. I mean, there are some people out there who hedge just because they always want to hedge their bets. But there are some people out there who are like, yeah, I don't know what's going to happen. Which is a very legitimate thing to say when 90% of what you do is undecided, uncharted territory, like, what's going to happen with GDPR with this decision? Unclear. Well.
Debbie Reynolds 02:44
I think it's hard because so many people, maybe in the way that they were trained, it's all about looking at the rearview mirror and thinking about what precedent was set in the past and not really about the future.
Beatrice Botti 02:58
Yeah, I mean, it's a byproduct of a lot of laws being new, right? It's funny, it was one of the things that made it possible for me and a lot of other people to get into the industry. The lack of an established contingent of people working in the industry, of established patterns, established decisions, and really well-settled law, created an opening for an entirely new market of lawyers and privacy managers and privacy engineers. And nowadays, it's such an asset, especially for the technical people. I think we're looking at a phase where a lot of the technical engineering or architecture experts are getting into privacy, not just at the big companies. I mean, places like Google have had architecture engineers with a specialization in privacy for a while, but a lot of smaller places are dedicating resources to those roles. And you know, I use it as a hook for my engineering team and my product team to become involved in pushing forward our plans. I'm always like, you know, this is a really valuable learning experience. It's going to convert into a real asset for you in your career, because having experience working on applied privacy projects is something that few people have, especially outside of those big tech corporations. So every time I need people to join a project, I'm like, think of the possibility for your own career.
Debbie Reynolds 04:50
Yeah. Well, tell me what sparked your interest in privacy.
Beatrice Botti 04:56
I will admit that I did not have a profound calling. My husband always jokes about the fact that I was born to do privacy because I myself am a very private person. Sometimes we'll be driving, and he'll be on the phone, he'll get so mad at me for saying this, and he'll be on the phone with his mom. And he'll be like, oh, yeah, Beatrice did this, and I will, like, hit him, like, stop telling everyone my business; your mother doesn't need to know where I was on Friday night. Which is kind of a funny thing. My mom was always like, people don't need to know your business, kind of thing. I'm also European, which brings you up with a different perception of privacy in general. But it wasn't the business that I tried to go into. I went to law school in Europe, and then I went to law school in the States. And my goal was to get into international patent work, probably patent litigation. That was my big calling. When I was a kid, I saw Perry Mason, and I thought I was going to be a lawyer just like that, in courtrooms calling people out, you know, or, to make a more recent quote, a you-can't-handle-the-truth kind of lawyer. But as it turns out, one, I couldn't join JAG because I wasn't a citizen. And two, not a lot of people scream you can't handle the truth in patent litigation. Also, when I moved here, I found out that the vast majority of people that work in patent litigation, and in patent work in general, have technical backgrounds, which I didn't have. That's not the case in Italy, for example, where I'm from. First of all, in Italy, you go to law school right after high school; you don't have an undergraduate degree. So you kind of develop the skills over time to the extent you need them. We also don't have this dichotomy between patent prosecution and patent litigation; everybody does a little bit of everything. We obviously have engineers that write patents, but it doesn't work like it does here.
But my goal was ultimately to live here and try to leverage this, you know, legal training from civil law in a common law country. And it didn't pan out the way I thought it would, because when I tried to get into IP, it didn't really work for me. So when I finally ended up getting a job out of law school, I had a choice at the time between joining a health and wellness company that was part of the Virgin Group and a kind of unknown tech company, which, as it later turned out, was working on early blockchain technology. In hindsight, I probably should have gone there. But if I had, I probably wouldn't have gotten into privacy. So it's kind of funny how that worked out. Probably would have been much wealthier if I had, though. But I ended up joining Virgin Pulse, which is this wellness and wellbeing company in the employee benefits space. And when I got there, it was 2017. And they relied on outside counsel to do privacy work; I got in to do contracting work. So my full-time life was focused on commercial contracting, which, as anyone who's worked on commercial contracting knows, goes way beyond your work hours; it follows you home, because of the constant rush and all of that good stuff. So as we got closer to 2018, we saw a massive increase in privacy documentation, and the cost of getting outside counsel involved every time was getting out of control. And I was kind of like, we probably have to do some work around the technology. And there was an end-user experience, an app and a website where users put in information, including sensitive information; we probably have to work a little bit on the way we do consent, we have to do this, we have to do that. And it got to the point where I was given the option to do it myself. They were like, oh, you seem to have an interest in this. Why don't you do it? And I was very worried about it.
Because I had no experience working with products, working with engineering; my experience with privacy was very tangential. I had picked up a bit from the outside counsel about negotiating DPAs, but not enough; I'd never read the GDPR at that point. I had no reason to actually get into anything other than Article 28. And I remember calling my dad, me being an only child and a girl, of course I called my dad, and I asked him, you know, do you think I should shift my career in this direction and do this compliance project that I have no experience with? And he was like, well, how long has GDPR been enforceable? And I was like, well, it's not yet; it's going to be enforceable in May 2018. He was like, then you have the same experience enforcing it as everyone else. Kind of a wild thing to say, but it inspired me enough to say, okay, well, I'll try it. And if it goes wrong, it went wrong, and we'll figure it out from there. But so far, no one has called me out. My imposter syndrome lives with me permanently, but I have somehow glided through stuff and figured it out on my own. Mostly, I was lucky enough, even though I worked in really small companies where no one else did privacy, to make a lot of friends in the community. Andy Dale was a big influence and mentor, even in my career decisions to move into ad tech, you know.
Debbie Reynolds 10:39
I love Andy.
Beatrice Botti 10:40
Yeah, Andy's great. And he's so generous with his time. And, you know, he's always willing to have a chat and help you out. And he's the guy I called when I needed to figure out if I wanted to stay in the job that I had or if I wanted to move into ad tech. So it kind of just happened; I didn't have this, you know, I-want-to-work-in-privacy thing the way kids do nowadays, because, you know, seven or eight years ago, nobody even talked about it in law school. Like, I went to a fairly competitive law school, and there were hundreds of elective classes, and nothing was related to privacy or security; it was just a not-talked-about topic. And I don't come from a family of lawyers or technologists, so I wouldn't have even known that it was a career option at that point in my life. So that's how it all happened.
Debbie Reynolds 11:35
Well, you’ve done very well navigating that for yourself.
Beatrice Botti 11:39
Thank you. I try. I always say it's a lot of hard work and definitely a lot of luck. Because in this business, you end up having to make a lot of decisions based on very little information, which is the opposite of what you're taught to do in law school. Law school, kind of like what you were saying earlier when we were chatting, really exists in this universe of decades, sometimes hundreds of years of precedent. And privacy is not like that; you can make some inferences, and you can make some deductions based on how similar issues have been settled. But, you know, a lot of the time, you're like, okay, well, the most history we have about privacy is like 50, 60 years of principles in some legislation. You know, America's perception of privacy before 25 or 30 years ago was rights against search and seizure, which in the context of an online app is of questionable relevance. So it's very hard to take those concepts and be like, okay, well, based on this, I think this is where the court is going to go, or where the rules are going to go; there's just no bearing whatsoever on the final outcome. So it's complicated.
Debbie Reynolds 13:01
I agree. And I think trying to look at it through that lens just doesn't really work. And I think that's another reason why we need more regulation around privacy, because as these cases come up, different courts are ruling on them in different ways. There isn't a consistent trend in one direction or another; it just depends on the court and on the understanding the lawyers and the judge have of those technology issues at that point.
Beatrice Botti 13:30
It's especially interesting in a place like the United States because of the very nature of the Federal government and states' rights and all of the amazing aspects of the experiment in democracy that's been going on for the last few hundred years. Lots of it is good. Some of it is a work in progress; that's how I like to think about it. It is a young nation if you really think about how long it's been around. So I think with privacy, and especially information privacy, it's so difficult to operate in this broken-down way when the Internet and technology do not operate in this broken-down way. At least when you think about it in Europe, you had the failsafe of the language issue, even before GDPR. I mean, Europe figured out that you needed more consistent data practices across the union because of the border issue. But even then, at least, you know, you could say, well, there are language barriers; if you're creating an app in German, it's less likely that a French user might use it. There were some limitations that kind of helped build barriers that otherwise don't normally exist in the Internet world. But in the US, you don't have that at all. Everybody kind of speaks the same English across the country, with some slang differentiation, but nothing is going to keep you apart. The very nature of technology and the Internet means that having a California or Utah or Massachusetts regulation is just untenable. Like, my phone, we have a home in New Hampshire that is very close to the border with Massachusetts, and my phone can't tell where I am sometimes. It can't figure out if I'm in Massachusetts or in New Hampshire. And that's a problem.
And it's funny because some of these local regulations, as well-intentioned as they are, are going to lead us to collecting more and more information about people; you're going to have to track people that much more closely, that much more specifically, to even know where they are, to make sure that you're not running afoul of certain rules. So it's really a difficult thing to balance. And you see it with children's protections as well. On the one hand, all of the policy intent behind it is fantastic. But a lot of the result of it is that we have to build up much more, well, not us, because we don't work with children's data, but social media platforms and other websites are going to have to build up much more in-depth profiles about children. To know that they're children, you're going to have to ask for more government documentation. I was just reading an article in The New York Times last night about this fascinating law in Louisiana that I'd never heard of, that apparently went into effect last year, that affects websites and content providers that may display content that is considered harmful to children. In order to make that content available, they need to verify people's age, and they do it by using an online driver's license system. And, you know, they were explaining how there is no data actually exchanged; it's just a verification process. And those systems can work. But it is a lot more data that you're putting into other systems and a lot more faith that you're putting into these products. And we've seen the biggest tech companies have security failures. So it does beg the question: is it good that there are all these companies with that much information solely for the purpose of verifying whether or not you can access a website? A lot of people make comparisons with getting into a bar.
But it's a little different when you have a bouncer, and you show him your driver's license for two seconds, and he's forgotten it two seconds later, versus inputting all this information into an app. It's definitely more and more information going out there. It seems counterintuitive, but I don't know how else you could do it.
Debbie Reynolds 17:54
Yeah. That's really difficult, right? So I always tell people, more data, more problems, right?
Beatrice Botti 18:00
Yeah.
Debbie Reynolds 18:01
But so the more data you collect, the more problems. Yeah, yeah.
Beatrice Botti 18:05
If the law tells you you've got to verify a, b, and c, yeah, that's more data. It's actually one of the things that I thought was really funny about CCPA and the data subject requests. When they first came out, they had this authentication language, like, you have to authenticate the user. And when you have very little data as a provider, as a company, how do you authenticate a user? You're going to collect more information about them solely for the purpose of authenticating them? It seems counterintuitive. So it's rather interesting how our intent to create legislation, which presumably has minimization as one of its major principles, because that is one of the fundamental principles of privacy, is inevitably going to lead us in the opposite direction.
Debbie Reynolds 19:03
Yeah, I agree. It's very confusing. And it's hard to read the tea leaves and figure out what's the right approach or the right balance. I want to talk a little bit about transparency. I guess we have the same tension with transparency, where we're saying, minimize data, don't collect too much, but make sure that you're being transparent about what you collect. And I think one of the challenges with transparency is that companies traditionally have never had to be as transparent as these privacy laws or regulations are asking them to be now. What are your thoughts?
Beatrice Botti 19:39
I mean, transparency is the other, you know, paramount principle of privacy, right? Like, minimize, make sure that you have purpose limitation and all that good stuff, and then provide transparency to the users whose data is being used. I think the biggest challenge is what happens when the recipient of the information, like the person a privacy notice or any attempt at providing transparency is directed at, is not prepared to receive that information; they don't have the necessary underlying knowledge. The burden is still on the data controller, the collector, the processor, whoever it is, to provide that transparency. But how do you bridge that gap, that lack of knowledge? I think that's become painfully obvious now with the Internet, but it was a problem that was always there. I mean, if you could find me one person on the street in America who could concisely and accurately explain to me how credit scores are created, that would be amazing; unless they work in the industry, I can't imagine that they would actually be able to verbalize it for you. You know it in theory, right? Oh, if you don't pay your bills, your credit score is going to be affected. And if you check your credit score, your credit score is going to be affected, which, by the way, still doesn't make sense to me. No one's explained to me why that is the case. But in reality, what actually calculates your credit score? What factors are relevant? I guess we could ask Equifax or a company like that; maybe they'll provide information, maybe in their privacy notice. I've read it a couple of times. And even though I'm a privacy person, I didn't really understand what was happening. Why? Because there's more to the privacy notice than the privacy content; there is a lot of jargon, and there are assumptions that the recipient understands the business or that the recipient understands the industry.
And to an extent, the company obviously has to make an effort to compensate for that. But how far can you go before your privacy notice turns into a 100-page compendium on how the credit score industry works? So it's just becoming unattainable to some degree. I think we could do better to fix it, right? When we figured out in the 80s and the 90s that the future was computers, we realized that we needed to teach children how to use computers so they wouldn't be permanently disadvantaged in their lives, right? Otherwise they wouldn't be able to compete at the college level; they would not be able to get the best jobs, etc. So we decided collectively that we need to teach kids how to type on a computer, how to print things, how Word works, and so on. Maybe it's not consistent enough; maybe it's not advanced enough. Some schools do it better than others. But we did sort of agree that was the way to go. We haven't gotten to that point when it comes to privacy, when it comes to online security, and when it comes to the Internet in general; we just don't teach it enough. And that creates a pretty, I don't want to say insurmountable, because I think some people do a better job than others, but it does create a really challenging hurdle, right? Like, if the average person doesn't understand what an IP address is, what transparency are they going to get from a notice that tells them how their IP address has been used? It's a struggle, right? And I think it's kind of unfair to the companies that have to deal with this, because, as usual, there are bad actors out there; let's assume that the conversation we're having at any given time is not about the people that are intentionally trying to do something wrong. The average company out there may not be going 100 miles above and beyond, but the average company wants to comply with the law. I don't think that companies are actively trying not to comply.
It's just that the bar is getting very high on transparency because people just don't understand what you're talking about. And there's no amount of background that you could give in a concise way that will really compensate for that when you're in an industry like ad tech, or an industry like health tech or, you know, AI. I mean, look at the frenzy that's happened around AI, a lot of which is manufactured anxiety that I feel genuinely bad about, because, guess what? AI has been used consistently for the last decade-plus in a lot of different industries. Once again, credit scores, mortgage decisions, and parole decisions are made in part by using machine learning. I don't think the average person realizes that. So am I scared of ChatGPT, scared of the way other AIs are used? ChatGPT is a thing online; if you don't use it, it doesn't really touch you. It's affected some people's lives in a negative way, but I don't see a Terminator sort of ChatGPT happening anytime soon. But if you're that concerned about AI, there is AI out there that we should be thinking about, right, that is being used to impact people's lives in a very real way, that I 100% think we should be vetting, like, how does the credit score industry work?
Debbie Reynolds 25:42
Yeah, I think you hit the nail on the head around AI. That's my concern as well. Just like you said, some people don't understand what an IP address is. And now we're moving into areas where computing is that much more complicated and less transparent, right? We don't know how these AIs work. We don't know how they're being used. We don't know what facet of them is being used in decisions. Like you mentioned parole; that's very scary, right?
Beatrice Botti 26:12
I mean, some of this stuff is already out there. I worry sometimes about the frenzy over ChatGPT, which in some instances reflects a very real concern; the continued amplification of nonfactual information is a very real concern. These articles that we read are terrifying, about people whose lives have been impacted by incorrect claims made by ChatGPT; I read about someone, I think an educator, who ChatGPT claimed had real issues with the law, and those claims have the potential to impact individuals greatly. But those are kind of non-legitimate outcomes of AI; they're, I presume, unintended. I doubt ChatGPT was created with the intention of doing this. Meanwhile, there's technology out there that's used to make determinations about whether or not you're getting a mortgage, and presumably, that's working as intended. And it's broadly used. Same for credit scores; there's technology there that is broadly used and affects all of us consistently. So I would focus my attention on the very real instances of automated decision-making that are happening, that affect everyone, as opposed to having another segment on the news about ChatGPT claiming that, you know, John X is dead when he's alive. That's an outlier outcome. But there's technology out there that's embedded that we should be very concerned about. I mean, cars nowadays heavily rely on technology. There is a lot out there that we should be watching more closely. And, you know, GDPR, with all of its flaws, did realize that automated decision-making can be a real problem, especially in highly sensitive areas like credit scores. There is no way, I mean, I guess I shouldn't say no way, because I don't know enough about credit scores, but it would be hard for me to believe that the way credit scores operate here would be deemed acceptable under GDPR. It's hard to believe because I've seen so many issues even with my own.
And I've experienced that closely, because I received a Social Security number and then started building my credit score as an adult when I moved here. I have had conversations about my credit score dropping, trying to understand why, and the reasons are baffling. A background check was conducted on me, and I was like, yeah, that was the US government; I was getting a visa. Why does that affect me? It's very odd. And God knows background checks get run for a lot of reasons. So it has such a real impact on people's lives that, you know, I would worry less about ChatGPT. I'm not saying it's negligible, but I would worry less about that. I would tackle first the types of technologies and machine learning that are already embedded in highly sensitive areas of our lives. And then I would worry about ChatGPT telling me whether or not, you know, a piece of news is accurate. Why are you getting your news from ChatGPT anyway?
Debbie Reynolds 30:00
Right. Yeah, right. I tell people I don't use it for that. I mean, you could search the Internet yourself. You don't need it to help you do that.
Beatrice Botti 30:08
Yeah, I get it. That's another example of, like, how people, we grew up, even I grew up, I'm in my 30s, never having to question whether or not the documents or things that I was reading were accurate. Like, I didn't have to verify information elsewhere. That's a pretty recent problem. And, you know, I think we're all painfully aware of that now, and I think a lot of us do verify information. But, you know, when you talk about ChatGPT, the people that I've spoken to are like, oh my God, it gave me incorrect information. I'm like, have you ever done an online search of any sort and just taken it at face value? Absolutely, right? I don't know, check a second source just to see if it's completely out of whack. It's just funny to me that that is really the problem we've identified with ChatGPT. It's just a shiny thing that we're using to distract people from the real problem.
Debbie Reynolds 31:29
Yeah. Yeah, I agree with it. I agree with that. What are your thoughts on cross-border data transfers? I think you're the perfect person to ask about this.
Beatrice Botti 31:41
Oh, what a fun topic!
Debbie Reynolds 31:43
Because it is a fun topic. A lot of my friends in Europe, and me as well, like, I've been doing these transfers for as long as they've existed, right? So once the GDPR came out, once a lot of these new regulations started coming out, people started looking at cross-border data transfers like it was a new thing. And people were like, hair on fire about it. But it is part and parcel of business; it has been for many years, and a lot of companies have been very mature in how they do it. But how do you think it's changing now, if at all, as a result of some of the new, more emerging technologies and regulations?
Beatrice Botti 32:28
I think cross-border data transfers are one of the most sensitive and most pressing issues of, I mean, honestly, of my career; there hasn't been a day in my somewhat short-lived privacy career when it has not been a pressing concern, for a number of reasons. I worry that we are moving in the opposite direction. We see more and more countries coming up with data transfer limitations. And again, sort of to the conversation we were having before, the nature of the Internet does not support, conceptually, the idea of geographical segregation. That's just not how the Internet works. That's not what we built it to do. We built the Internet to bridge distance, not to make it greater among us. So it is, I think, intellectually concerning. There's also so much in the way the data transfers conversations have evolved that is, to me, frustrating, because the concerns voiced by organizations are very valid. Government monitoring, the invasive nature of some of the things that have happened in the last 20 years; I mean, there's no one who's not concerned about that. And, you know, I always joke, well, why would you be concerned about government monitoring if you have nothing to hide? That joke comes from a place of privilege, of having always lived in countries that value democracy. So I'm painfully aware that there are real concerns with government monitoring of citizens. It's just not only happening here. And I say this as a European: European countries do citizen monitoring; everybody does. So there's this idea that the real bar to data transfers to the US is the monitoring that occurs in America. Yes, it's a little different. Yes, there are specific concerns. We need to address them; I agree. I mean, America went through a history-altering event with 9/11, which led to some highly concerning legal decisions being made, which, I think we all agree now, we understand why they happened.
But I think we're all probably in agreement, at least most of us, that it's time to move forward from some of them. And they're still creating challenges today. What worries me is, as we talked about, the Internet is not subject to geographical borders, and neither is the economy. It's just wishful thinking to imagine that the US economy or the European economy or the Chinese economy could thrive if we suddenly shut down borders, and we saw that firsthand with COVID. Whether or not some people want to live in a global economy is a different issue; that is the reality we are in today. And we're moving at very high speed towards a more and more technological society, because that's where we want to go; that is collectively what we want. I saw my own life experience positively impacted by the advancement of technology. I moved here when FaceTime was not a thing, and I could not talk to my parents and see them unless I was home on my desktop or laptop and turned on some other chatting mechanism like Skype. And then it became an app, and then FaceTime became available, and then iMessage became available. And suddenly, we weren't paying to talk anymore; I wasn't paying to send a message to my dad to tell him that I got to the hotel safely. It closed geographical gaps for a lot of us. I can't imagine anyone seriously arguing that we were better off before. As always happens, some people took advantage of it, and some people paid a price for it. I am a big believer, and this kind of ties into everything that we've talked about before, that if you can have transparency, if people understand what is happening, you can get to a place of accepting that your data is needed to do X, Y, and Z, and that's what it's being used for.
You'd still use the service; you'd still want it. And I think the instability being driven by the open issue of data transfers is concerning. It creates instability and uncertainty for companies big and small. I think it's a common mistake to think that the only ones who really suffer from the instability in data transfers are big tech. The majority of successful tech companies, the top 100 tech companies, are from the United States or from APAC; there are very few from Europe. These are global companies, as big or small as they may be; someone like Reddit, someone like ourselves, like DoubleVerify, these are all companies that are not big tech but work globally. They need stability in data transfers. And it takes us back to the conversation about Federal legislation; it is sorely needed because it could fix this problem. Federal legislation could possibly, hopefully, ideally, put the issue of adequacy with the EU behind us. I don't think it's an unattainable goal; I think it's very much within reach. And I think you can see, from the change of heart that a lot of companies have had about Federal legislation, that it really does matter. You know, part of that is because of a fragmented system. I find it funny when people are like, oh, there are all these state laws about privacy. There are hundreds of privacy laws in America; there are cities that have their own privacy laws. Absolutely. Chicago has a biometrics law. So we've always had the issue of fragmentation, and I think part of that is what's driving the change of heart that a lot of companies have had about Federal legislation. But the issue of data transfers cannot be underestimated. For a lot of these companies, it means living with a risk that we don't like and don't have to live with. And if we don't find a solution for data transfers, both from a European perspective and a US perspective, it could be one of the most consequential self-inflicted wounds that we have ever experienced as a society. Like, I don't know why we would do this to ourselves. Can we really not just get along and figure it out? How does it benefit Europe if Meta shuts down services there? What is that doing for them? They have employees there; they have offices there. And the idea of localizing services is so outlandishly unattainable, they can't even fathom the conversation.
Debbie Reynolds 40:47
Yeah, localization. We see that a lot in APAC, laws around localization. I think people imagine that localization solves more problems than it actually does, right? You still have to secure the data. To me, it doesn't really solve the problem they think it does.
Beatrice Botti 41:09
But also, at least from my perspective, I think we have an understanding of localization that is faulty. People are like, I'm just gonna get a European data center. That is not localization. No, you would have to spin up a completely separate environment and make sure no one outside of the EU has access to it. Any vendors that you've been using up until now for your primary environment, kiss them goodbye, because they're not in the EU; all of your vendors have to be localized, and all of your resources have to be localized. If something breaks down and the one engineer who can fix it happens to be in San Diego, you'd better be prepared to put him on a plane, because he's not going to be able to help you remotely; even if he could stop a massive data breach from happening, that would be a data transfer. Not to mention that even if you solve all of those problems, I still haven't heard one reasonable explanation as to how you avoid liability under the CLOUD Act. Every time I have a conversation with European counterparts, it's, localization of your services would solve all these issues. And I'm like, what about the CLOUD Act? We would still own the environment, so we're still subject to US law. So what am I doing now, spinning off a subsidiary that I have less than a majority stake in, so it's independent? I just can't understand. Maybe I'm just not smart enough, and these people are fully outsmarting me and have it all figured out. But.
Debbie Reynolds 42:49
I don't know; I agree with you 1,000% on this. And I always wondered about the way people write these laws. It's almost as though they think data is like a widget that fits in a box, so they can move it from place to place. And we know that it's not that way.
Beatrice Botti 43:05
And I mean, it's understandable, right? What's the average age of a legislator in any one country? How often do they use the Internet? What do they know about it? I will say, there are some people in Congress, for example, who are trying really hard to understand the tech. They're engaging with people; they're bringing on resources who have an understanding of tech; they're trying. It does not give me a particularly warm and fuzzy feeling when, in a Supreme Court oral argument, someone says, we're not really the experts here. You're making the decision, though, right? I'm concerned that you would be comfortable with that. I mean, on the one hand, you're like, oh, thank God, they realize that they're talking about things that maybe elude their grasp a little bit. But on the other hand, I'm like, man, you make the law of the land. So, wow. But you know, I don't think that's a solvable issue. That's a byproduct of democracy, right? No one's ever going to be all-knowing, so I can probably get comfortable with that. What makes me less comfortable is, once again, the lack of understanding of what it would take, for example, for localization, because it gives space to some players out in the industry to say things that are not true. And it's a very uncomfortable place as a business vendor to have a conversation with a potential client who's like, well, this other vendor says they can do this, and you're sitting there like, I'd be fascinated to learn how.
Debbie Reynolds 44:54
Right, yeah.
Beatrice Botti 44:58
It's not happening the way you think it's happening, sir. But you know, you never want to be out there calling people liars. It's like, hey, we know we need to think about these things. It takes a lot of eloquence and touch. To take us back to the beginning, where you said I speak the truth, maybe I'm not the most delicate person. You know, I've been known to say stuff like, we'll be in a conversation, and they're like, well, this other company does this, and I'm like, yeah, that cannot be true. Maybe they misstated it; maybe it was misunderstood. I'm not calling anyone a liar. But let's level set on the fact that this is impossible, and move on from there, because I think that's the most valuable use of our time. Some people respond positively to that, some people not so much. So it's always interesting, but it does worry me that it leaves space for a lot of, let's call them, embellished claims. And some of us are just like, this is how it is, right? I'm telling you the truth. Do I have a solution? I say I don't know all the time, which makes a lot of people uncomfortable when it comes from a lawyer. I'm a big proponent of saying I don't know. How are we going to solve this problem? I don't know; we're gonna have to think about it. Where do you think the law is gonna be in the next two years? I don't know; I would not be here if I knew where the law was going to be in the next hour, right? I would be doing other things. So it's just a very interesting, ever-changing environment. And you probably feel the same way that I do; I feel like we're constantly bombarded with information. It's like, oh, Fiji just passed its privacy law. And I'm like, I do not have the emotional capacity, right? I'm happy for them.
Debbie Reynolds 47:24
Yeah, I have a very long reading list. I just flag it and then I have to go back to it later. It's just a lot.
Beatrice Botti 47:31
I mean, honestly, sometimes it feels like by the time you're done with all the reading, a new law will have passed. I feel like in the last couple of years, Canada has churned through so many different versions of updates to PIPEDA that every time I bookmark something to read and don't make it to it, a few weeks later there's a new version of it. And, right, I guess it's a good thing I didn't read the other one.
Debbie Reynolds 47:59
Right. Yeah. Well, my thing now is, I hate when people say, oh, someone's proposing this or proposing that. I just want to know the end result, because I can't keep up with all of that.
Beatrice Botti 48:11
The frenzy about all the bills being proposed in different State houses and State Senates, and they're like, this bill would require, I don't know, pink walls in every office. And I'm like, well, let's assume that's not gonna make it through committee, or it's gonna get edited out; it isn't worth retaining. And I discover new laws that I hadn't heard of all the time, like this Louisiana law. I don't get involved with content, so it makes sense that I wouldn't know about it; it has a tangential impact on privacy, right? It's not a privacy law, so it makes sense that I wouldn't have gotten an alert for it. But I read this article in The New York Times that was all about how laws like that have the potential to change the Internet. And it just gives you an idea of how much you have to pay attention to when you work in our industry. It's not even just the laws about privacy, not just the laws about security, not just the laws about AI; you've got Europe putting out the Digital Services Act and the Digital Markets Act, which are tangentially related to privacy. And now you have all these other content issues that, by virtue of existing on the Internet, do have privacy-related concerns. So, I mean, I could use like two more brains. My husband and I are big fans of The Avengers and the Marvel movies. I don't know if you're a fan, but there's a scene in Dr. Strange, where he's a magician, and you can see him sleeping, and while he's sleeping, he's using his astral projection to read books. The first time I saw it, I was like, that's what I want to do.
Debbie Reynolds 50:00
Yeah, right.
Beatrice Botti 50:01
I want to be able to do that. How do I learn how to do that? It would be helpful.
Debbie Reynolds 50:09
Oh, absolutely. Yeah. I mean, you could get caught up on that reading list. Absolutely. I agree. Well, if it were the world, according to you, Beatrice, and we did everything that you said, what would be your wish for privacy or data protection anywhere in the world? Whether that be technology, regulation, or human behavior.
Beatrice Botti 50:32
Any, any wish? For privacy?
Debbie Reynolds 50:36
Yep. Any wish.
Beatrice Botti 50:37
I will say this. I think as privacy professionals, we complain a lot about the complication of the industry, but to some extent, that keeps us all employed, so, you know, I wouldn't want it to get too easy. I think it would connect back to finding a way to bridge the gap with end users, with the general public: if we could find a way to bridge the gap between how technology works and what role their data plays in it, I think that would enable us to have more transparency, and it would enable us to have probably even more effective minimization techniques, because you would find yourself able to limit things that you are now using to make sure people can actually use the services, things that maybe you wouldn't need if they knew how to use the services better. The funny thing about it is that it would benefit people above benefiting companies. There's obviously a flow-down benefit to the companies, which would be better able to communicate complex concepts and get across the information they need to get across, and largely want to get across. But it would just greatly benefit people: make them more comfortable in their lives and enable them to use services without being worried. I mean, sometimes even I worry about stuff. I boarded a plane last Christmas to go home for the holidays, and I found out that British Airways uses facial recognition to board you. And I was like, how does this work? I was kind of putting two and two together: obviously, they have a copy of my passport, so they know sort of what I look like; they're probably mapping my face. I'm like, do I want them to be able to do this? Because, you know, I have Clear; I voluntarily engaged in that. I made a decision, maybe not a smart one; some of my friends are like, I will never use Clear. I find it helpful; I make it through the airport much faster.
You could use it for sporting events during COVID; there was a lot of value in it for me. So I made that trade-off knowingly. It didn't feel that way as I boarded that plane; it didn't feel like I knowingly engaged in that facial recognition exercise. And it was one of the times where I kind of felt like the average nontech, nonprivacy person who's like, I wonder where this data goes and what they do with it. It's not a good feeling. And if I could snap my fingers and bridge that gap, I think it would greatly help a lot of people: people, businesses, governments. So that is my privacy wish. I'll write it on my Santa list and see if he can bring it to me.
Debbie Reynolds 51:05
That's a great list. I agree with that wholeheartedly. There definitely needs to be more transparency. And a lot of times, thinking about, is this even necessary?
Beatrice Botti 54:02
Yeah.
Debbie Reynolds 54:04
You know what I mean? I think if people felt that it was necessary, they would be more comfortable as well.
Beatrice Botti 54:20
It's actually like some of the things that we're required to write in privacy notices for transparency purposes. I mean, truthfully, we all know privacy notices for a lot of companies are CYA exercises; I'm checking the boxes of everything I have to write. You're going through, and it's like, you've got to have a section on security, and there are a lot of companies out there that are like, our systems are the most secure. I'm always the one that, if you look at my privacy notices, they all have this in common: I'm always like, we do everything we can, but no system is 100% secure. And people hate it. And I'm like, I'm telling you the truth. I'm sorry that's uncomfortable and does not conform to your expectations, but the reality is, if I told you there's nothing that could possibly go wrong, I would be lying to you. And maybe if they understood how technology works a little better, they would understand why. Things sometimes go wrong due to human error or technical errors, bugs; there are a number of reasons why things might go wrong. It's not intentional; it's not malicious; and the last thing I want is for it to go wrong. I'd rather tell you the truth, but it doesn't yield the result that we think it does. I think if we could bridge that knowledge gap, not entirely, like I don't expect everyone to suddenly become an ad tech expert, but just understand a little bit more, I think it would help people. And I am a true believer, and this is my unpopular opinion, as I told you before, I'm the unpopular opinion person: I'm a true believer that if people understood a little better, on average, people like the Internet the way it works, and people like getting some degree of relevant advertising. I'm not saying like, I saw you in the shower two hours ago, and I know you're out of shampoo. Weird. More like, you know, I know you like sports.
Beatrice, did you know, and this is a cool reference for you, you might be aware of it, did you know that the Savannah Bananas are coming to visit Massachusetts, and you could see them? Have you ever heard of them? Oh, God, you've gotta look them up. They're this baseball team that does dancing acts; it's kind of like an event. It's not a real league; it's like a, what would you call it, a show baseball team. There are a couple of teams that do this, and they all play each other and tour the country, and you go see them, and sometimes the batter will come out on stilts, and sometimes all of the outfielders will do a choreography. It's super funny. I had no idea they existed; I had no reason to know they existed. The only reason I found out is that Instagram, based probably on the fact that I follow every single baseball team in America, every single football team, every single basketball team, every tennis player that breathes, was like, you know what, this woman would like this. And I once saw a video that was sponsored, and I loved it. And I started following them, and I go look at their new videos every day, and they bring me true joy, and I can't wait until they get to Massachusetts and I can go see them live, because I'm gonna love it; it's gonna be a great night. I would have never had any idea they existed. So I really, truly believe that the average person would not hate everything that happens on the Internet if they had a little bit more understanding, a little bit more control of it. I really don't think we need to alter the Internet that much. It's a pretty amazing invention, and it's brought a lot of joy and a lot of solutions for people. It gives you access to information you never thought you would have. I mean, I know I'm supposed to be part of the Internet generation.
But I really still am amazed by it, probably because I moved here before it was as pervasive as it is now, with smartphones. So I still feel grateful when I get to talk to my friends and I don't have to worry about paying forever, all of that. I really think people don't hate advertising on the Internet as much as we make it out to be; I just don't think they understand the crucial role advertising plays in keeping the Internet free the way we know it today. And if they did, maybe they would have a different take.
Debbie Reynolds 58:59
Yeah, I agree with that. Very cool. Well, thank you.
Beatrice Botti 59:04
Check it out. The Savannah Bananas; you've got to check them out.
Debbie Reynolds 59:07
The Savannah Bananas.
Beatrice Botti 59:09
They're amazing. I'm telling you. It is such fun. If you're just having a bad day, you go on Instagram and you look at the videos. It's impossible not to smile.
Debbie Reynolds 59:20
Oh, my goodness.
Beatrice Botti 59:22
Oh, it's silly. And it's like this family event, and there's all these children. It's just, it looks amazing. I can't wait.
Debbie Reynolds 59:31
Oh, I'm going to check it out. Well, thank you for that tip. Thank you so much. Well, it's been great. It's been fantastic to have you on the show. And I'm sure everyone will love the episode as much as I do. This is great. This is great.
Beatrice Botti 59:44
I hope we can always catch up again in the future. I'll be happy to whenever you want me.
Debbie Reynolds 59:51
Definitely, definitely. Thanks so much, and I'm happy to chat with you further in the future. Absolutely. I'll talk to you soon. Bye.