E275 - Toin Berry, Personal Data Privacy Consultant

[00:00] Debbie Reynolds: The personal views expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.

[00:13] Hello, my name is Debbie Reynolds. They call me the Data Diva. This is the Data Diva Talks Privacy podcast, where we discuss data privacy issues with industry leaders around the world with information that businesses need to know.

[00:26] Now I have a very special guest on the show, Toin Berry. He is a personal data privacy consultant. Welcome.

[00:35] Toin Berry: Well, hey, hello and thank you.

[00:39] Debbie Reynolds: Well, this has been a very interesting journey. It was hilarious because you had called me on the phone and you and I had such a great,

[00:50] just impromptu conversation about privacy and data.

[00:54] we were geeking out about privacy and things like that. And I thought, you just have such a unique, different perspective about privacy. Not only do you have a professional interest in privacy, but you have a very compelling personal interest as well.

[01:13] So tell us about your journey. Why is privacy in your wheelhouse? Why is it important?

[01:21] Give us your background.

[01:23] Toin Berry: Yeah. And just part and parcel to our call,

[01:26] I had lived experiences of what happens when personal data stewardship doesn't end up being handled responsibly.

[01:37] And along with that,

[01:39] you know, unbeknownst to myself, I became more compromised than I had initially thought. Just like everybody, using Gmail since it came out, and then Hotmail, and just using messengers, and reading privacy guidelines on the majors (Google, Facebook, Meta and its corollaries, WhatsApp) and those interests that own them, and thinking,

[01:59] okay, well,

[02:00] it's kind of like that old slogan, if it's free, you're the product.

[02:04] And that other slogan that I don't really subscribe to much, which is, I forget kind of how it goes in the industry.

[02:11] Like, I've got nothing to hide, so you've got nothing to detect, and it doesn't matter. So those are the two slogans that I became very much like, okay,

[02:21] well,

[02:22] the feeling of. And when it was made clear to me.

[02:26] There's also another story, real similar, of a gentleman that's actually on ProtonMail's YouTube channel. If you scroll back about a year and two months ago now, there was a gentleman who was saying why he's advocating Pretty Good Privacy (PGP) products.

[02:43] And he goes, well, he had the unfortunate experience of being swatted,

[02:47] meaning somebody. That's a term in law enforcement where anybody, for any reason, can call in anything. And it could be for any number of reasons, who knows why. It could be just a person wanting to be harassing.

[02:59] It could be somebody who's got a bone to pick. It could also be, and legitimate questions arise here, something suspicious about a targeted operation to evoke some sort of unrest in a particular instance.

[03:12] However, that is the testimony he gave, and he did it publicly. I don't give mine too publicly because I've learned to be very private.

[03:23] But that's essentially what brought me into a forced awareness of data and software like Palantir today,

[03:31] how the fusion centers work, the information distribution centers that are behind a lot of law enforcement and/or tie into Google and Facebook corroborated data.

[03:41] So it's not just for advertisements that people get this data.

[03:44] It's not just, oh, I don't want, you know, Amazon, Walmart or some target ad coming on my phone. That word targeted ad can be targeted anything.

[03:54] So information about your purchasing habits, you know, race, gender, religious creed, anything can then be open to social stratification, engineering and then targeting. And then that doesn't mean targeted advertising.

[04:09] It can just mean data amalgamation. And, you know, not to overplay the word weaponize, but these things can, and at times may, be used against you outside of a court of law, just for the sake of other interests.

[04:24] You know, that was a long minute. But that's essentially how I got brought in: through an unfortunate series of circumstances over the course of a year that led into a misappropriation of my personal and private data to create narratives and fictionalize issues and perceived issues that were nonexistent, to the point of great risk for my personal data privacy and my own liberty and civil liberties.

[04:48] Debbie Reynolds: Yeah,

[04:50] I think,

[04:51] first of all, thank you for sharing that. You are not the first person that I've talked to who has had a lived experience around their data being abused and them being targeted in some way, whether that be stalking, inferences made about them, or choices being made about them without their knowledge and consent.

[05:12] I want your thoughts about.

[05:14] Let's dig a bit deeper here. And this is about inference.

[05:19] You made a very good point here. And so a lot of times when you hear people talk about privacy in the media, they're talking about advertising and they're talking about marketing.

[05:29] What they're not talking about is the things that are inferred about someone and how that data can be used against them. But I want your thoughts.

[05:39] Toin Berry: Thank you. Inference, it's, it's an interesting word too because it comes up in a lot of circles. It comes up in psychology circles. It's actually a form of cognitive behavior. Right.

[05:48] That's a word that's used to infer and construe.

[05:53] It's kind of like that other saying I like: there are lies, damned lies, and statistics.

[05:58] That's a great,

[05:59] it's a great little, you know, hey, you know,

[06:02] meaning, depending on who's reading the data and the beliefs and values backing that, almost anything can be inferred, whether it's purchasing habits, credit card habits,

[06:13] or with predictive behavioral algorithms, when somebody, let's say, is not stewarding their YouTube viewing history.

[06:20] You can have various interactions with various agencies and targeted advertisement as well as other agencies that are inferring that by watching, let's say History of the World that you have a long history.

[06:35] This is just one extrapolation of watching a lot of world history, World War I and World War II. Well,

[06:41] if you have, you know, 20 hours of that and those documentaries have been predictively, behaviorally algorithmized by YouTube, well then it can be inferred that why is this individual watching so much?

[06:52] And then they can correlate that if you're using Gmail to your emails and then to your text messages and if they want to and they meaning any particular adversarial interest that wants to pry into metadata, text analysis, phone voice recognition,

[07:08] cell tower triangulation,

[07:10] and amalgamate that to form a social picture of that individual. And then let's say that's an operation that's going on in intelligence to quantify the next predictive move of a particular person.

[07:22] Well, that can be used for anything, not just purchasing habits at Walmart or Target.

[07:27] It can be used to stratify and/or isolate and/or neutralize in a variety of circumstances. And it's actually par for the course for operational security.
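
The amalgamation described above, where many individually mundane data streams are joined on a shared identifier into one profile, can be sketched in a few lines. This is a purely illustrative toy; the email address, records, and tower names below are all hypothetical:

```python
# Toy sketch of data amalgamation: separate, individually innocuous
# streams become a single profile once they share a common key.
# (All identifiers and records below are hypothetical.)
viewing_history = {"user@example.com": ["WWI documentary", "WWII documentary"]}
purchases = {"user@example.com": ["camping gear", "prepaid SIM card"]}
cell_towers = {"user@example.com": ["tower-12 (downtown)", "tower-47 (airport)"]}

def amalgamate(*sources):
    """Merge per-user records from many sources into one profile per user."""
    profiles = {}
    for source in sources:
        for user, records in source.items():
            profiles.setdefault(user, []).extend(records)
    return profiles

profiles = amalgamate(viewing_history, purchases, cell_towers)
print(profiles["user@example.com"])
```

Each stream alone says very little; it is the merged profile that invites the kind of open-ended inference the conversation warns about.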

[07:37] But when it's taken to the context of who's the real puppet master behind predictive behavioral algorithms, whose interests are really being brokered, whether it's Walmart, Target, Amazon, whether it's.

[07:51] Then it gets into polity and it gets into social stratification, economic barriers. And then it enters into the kind of thing a lot of people are more and more talking about.

[08:00] Such as Rod Dreher wrote a book called Live Not by Lies. There's a lot of other recent discussions around this. I don't want to get too carried away with the Orwellian or the inverted totalitarian era, or what Sheldon Wolin wrote about.

[08:13] He wrote a book that had to do with inverted totalitarianism. He was a World War II pilot. He died just,

[08:19] maybe about 2010-ish; you can Wikipedia the guy. But these are all contributors that saw these types of information-based tools, how they move from tools to weapons in a real different kind of context.

[08:33] And I use that word not as in the traditional sense of meaning threat, but meaning can be used to infer about anything based on the values and the interest backing that desired outcome to see what the numbers are wanting to quote, have read into them.

[08:50] Debbie Reynolds: Yeah, totally. So the example I like to use: I had read the testimony of one of the Cambridge Analytica whistleblowers, and they had a project they were running that they called the KitKat project.

[09:05] I don't know if you heard about this, but the KitKat project, basically what they found is that there was a correlation in their data set between people who like Kit Kat bars and anti-Semitism.

[09:19] So for them it was like, does that now mean, in their data set, that if you like a Kit Kat bar, you're an anti-Semite? Right, so this is the problem that you're describing.
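
The Kit Kat anecdote is a textbook spurious correlation: scan enough unrelated attributes and some pair will correlate by chance alone. A small self-contained simulation (random synthetic data, nothing to do with the actual Cambridge Analytica set) makes the point:

```python
import random
import statistics

# With many independent yes/no attributes, SOME pair will look
# correlated purely by chance -- the "Kit Kat => anti-Semitism" trap.
random.seed(42)

people = 200       # sample size
attributes = 50    # unrelated coin-flip attributes per person

# Every attribute is an independent coin flip: no real relationship exists.
data = [[random.randint(0, 1) for _ in range(people)] for _ in range(attributes)]

def correlation(xs, ys):
    """Pearson correlation of two equal-length 0/1 sequences."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Strongest correlation found across all attribute pairs.
best = max(
    abs(correlation(data[i], data[j]))
    for i in range(attributes)
    for j in range(i + 1, attributes)
)
print(f"strongest chance correlation: {best:.2f}")
```

Even though every attribute here is pure noise, the strongest pair typically shows a noticeable correlation, which is exactly why a pattern found by trawling a big data set is not, by itself, evidence of a real relationship.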

[09:32] Toin Berry: Yeah. And I really appreciate your diligence and that's actually, you know, Debbie, it's really, it shares a very. And I'm, I'm really excited about your work and I'll just take a moment to this because I mean, I'm sure like there's other movers and shakers I brought up who put out some really well communicated,

[09:47] concise, succinct information about data privacy. But what I love about your particular DNA is you just don't miss a beat. And what I mean by that is, like, for instance, even Brax, that's another YouTuber; the guy's got like 700,000 followers.

[10:04] Naomi Brockwell is a big privacy buff. Henry Fisher over at Techlore, The Hated One's got a lot of following,

[10:11] but he tends to be very, very, very overly concerned. Yeah, I mean, like, in other words,

[10:19] just because surveillance is under about every other rock doesn't mean it's under every rock.

[10:26] But the point I'm making is that this message that you're carrying, I think it's now reached a critical mass in a lot of other sectors. I mean, I'm also involved in philanthropy and I am watching certain generations grab hold of it with a significant amount of gravity,

[10:45] but not an overly imprudent, reactionary sense of gravity. So there's some composure behind this data privacy. In other words, nobody's overreacting, but people are actually systemically aware that this Kit Kat bar correlation to, as you said, anti-Semitism is there, and that correlation goes into anything, and the real concern becomes who the puppet masters are, so to speak.

[11:10] And then when you look at Palantir, that's the recent big one that has a lot of software that is capable of and used in intelligence agency and automated government surveillance. It's traded on Wall Street.

[11:20] Peter Thiel is behind that. He's connected to Zuckerberg, Sergey Brin, Larry Page, Google and Facebook, all three. And you can get into really nuanced, careful areas,

[11:29] where, I find, it's best to exercise some discretion to not be blameworthy. But what you are bringing up is a very quasi-Orwellian question about to what degree.

[11:42] It's not that I don't have anything to hide, but one has something to protect, and more than something to protect. It's almost like now one has something to prevent, which is these sorts of moral social responsibility issues, by having, studying, and using Session, using SimpleLogin, using Proton, using Tutanota, understanding free open source, which is part of my introduction here. I came to you about these as maybe self-education resources that people

[12:10] can give. That's why I mentioned Henry Fisher at Techlore. He's a wonderful public service advocate. But these people have great risks too. And I even have concerns. That's why, when I interact with you as Toin Berry: people in this data space, speaking various anomalies and suggestions into personal data privacy, can themselves become targeted.

[12:30] Because that in and of itself is a chilling effect that can be presented when you hear about Edward Snowden, how he went away, or the young person, the young lady; Reality, I think, was her name.

[12:41] She did some time. And then "Freeway" Ricky Ross, he got it, but the other guy that did Silk Road, he just got exonerated in 2025. And there's people who are trying to broker freedom for the sake of good intent.

[12:55] When Julian Assange finally was able to go back home after all that drama. Like, there's people, and I'm just saying this, I know I'm taking some rabbit holes. Your episodes are beautiful for this opportunity to get our thoughts out there in a stream of consciousness that is actually succinct when you look at it.

[13:11] I do think there is an era now where the integration of the age of information has now come into full breadth of custody, to the degree that Web 2.0 and AI is kind of here, to where the sense of moral and social responsibility around the individual agency to handle themselves is now starting to get regularly homogenized.

[13:33] And now hopefully what I consider a moral conscience is going to come into play. And people have to realize, like anything during the industrial revolutions, and then you have mechanized warfare in World Wars I and II.

[13:45] And now all of a sudden with this age of information,

[13:48] it can be hopefully a cessation of, okay, let's not use this for its negative intent. Because industry brings air conditioning, heating, refrigeration, the automobile. It doesn't have to bring tanks and weapons, but if misused, it does.

[14:02] Debbie Reynolds: That's true. That's true. As you were talking, I guess I was thinking, for me, a lot of times when I talk about data from a privacy perspective, we talk about data that's collected and retained.

[14:18] But I think the conversation that we're getting into is about what's created after that.

[14:25] And that is the problem, because that's where laws or regulations fall short. First of all, laws don't stop things from happening. Right. And so a lot of what we're talking about is prevention.

[14:34] But then also,

[14:35] and this is the reason why the data brokering industry exists at all,

[14:40] is that there really aren't laws created or mechanisms created to prevent that data from being turned into something else.

[14:54] Right. And so even if you go to a service and say, okay, someone has data about me and I want to delete this data,

[15:01] there is certain data that is being created about you that you don't really have any control over. But I want your thoughts.

[15:10] Toin Berry: Thank you. No, and that does get into, like, even what's a trend that I'm seeing happening. And also Naomi Brockwell on the Dad Saves America podcast hit this beautifully. And I can't wait to advocate in more places for you, Debbie, for sure,

[15:22] your message and your insightful questions,

[15:25] what I'm starting to observe now as a practical measure, because I investigate things and see which shell companies are doing what with data. And Naomi Brockwell brought that up too, that a lot of the companies are just fronts that actually broker back to the government,

[15:40] so to speak, using Google and Facebook data. They have to have several layers of separation. The old-school mafia used to do this when they'd have layers of corporations, when they were really the fish canning company in New York City.

[15:51] And during that era, they were doing a lot of other things besides canning fish, you know, but they would have layers of various interests.

[15:59] And the data deletion privacy movement, to me, now has clear signs that the data brokers are in cahoots with the data deletion services.

[16:11] And I won't say names, just for the sake of respect, but those companies that are supposed movers and shakers in data deletion, it's now just a recycling machine.

[16:21] It's just taking data, but then the brokers are going to broker it so far and then reinvent it. And that keeps business good. And so people are going to pay for a subscription service to have their data deleted and monitored.

[16:32] And so that's what's been going on.

[16:36] The revolving door is here. I won't say any names. I could prove it all out. If I do it on your podcast, it might burn some bridges,

[16:41] but that's another effective model for keeping businesses good.

[16:45] But look at the nature of how fast data brokers themselves, supposedly, quote unquote, are changing.

[16:52] And who knows if they're backed by Brin, Page, and Zuckerberg financed companies, tied to Peter Thiel; who knows if those ownership bodies and the interests they serve with their software are not also, corollarily, doing that?

[17:04] I mean,

[17:05] just things to consider.

[17:07] Debbie Reynolds: Yeah. And I've actually seen that in the cookie space where first of all, the whole cookie thing has birthed a multi billion dollar business.

[17:18] So those companies that create tools, I won't name names, at least one that has a tool about cookies,

[17:28] they're literally advocating in,

[17:31] you know, in regulatory spaces that cookies not be outlawed.

[17:39] They're like, hey, this is our bread and butter, so we want cookies. But then on the other side they're saying, hey, use our tool because we can help protect your privacy.

[17:47] So I think there is some of that going on for sure.

[17:51] Yeah. I want your thoughts about.

[17:55] So we've been talking about consumer privacy. You're saying that you feel like, well, I just want your thoughts.

[18:02] Because the reason why I started this podcast is because people weren't really having deep conversations about privacy.

[18:09] A lot of people were like waiting for a law to come out and then they give their opinion. But I'm like, this is a really deep personal issue. Like your data is you.

[18:17] Right? Your data is a representation of you. And so being able to talk about it with people around the world in different areas of data has really fascinated me.

[18:30] But what do you think the awareness is now and where do you think it's going in terms of people understanding how important privacy is? Or maybe, maybe not. Cause I don't know.

[18:43] I had Dr. Ann Cavoukian on the show.

[18:46] She's a privacy regulator in Canada.

[18:48] And one thing she said really stuck with me. And she said, privacy is not a religion. So like, if people want to give their data away, then they can. Right?

[18:57] And that's not what we're saying. We're saying for me, I'm all about advocacy, but I'm more about education.

[19:03] I'm like, look, this is what's happening with your data.

[19:06] Here's your choices. And then you have to make a decision about what you want to do. But what are your thoughts?

[19:11] Toin Berry: I like that distinction between advocacy and education. And I share that sense; it's like ecumenical, right? Privacy is not a religion. I like that. It's like the context of.

[19:20] But part and parcel, I'm saying things that are in the public domain. When people put something on YouTube,

[19:26] it's fair game to just comment on. And I get that and I'm respectful for various ways in how I respond.

[19:31] But like, for instance,

[19:32] there's a couple of encrypted email services that I find, you know, beneficial if one wants to use them. That's Tutanota over there in Hanover, Germany. And there's also Proton.

[19:40] The two main ones that are free and open source. But part and parcel of the interesting perspective that the German company Tutanota gave, which was in one of her talks, was wonderful.

[19:51] She says the thing about, and I'm going to correlate this back to your earlier comment, once that data is collected, what can be done with it is up to really anybody.

[20:01] And that can mean a regime.

[20:03] And when you look back at history and you look at the 1940-1945 era in that area,

[20:10] how did that much data get used to find people and do things like,

[20:15] I mean, data in the hands of somebody is everything depending on what they want to do with it.

[20:22] If somebody cares that you bought Charmin toilet paper at Walmart or Amazon, who cares? You're right, it doesn't make a big difference until every piece of data,

[20:33] whether it's used and acquired through tracking,

[20:36] is then amalgamated and has AI run on it to create patterns.

[20:41] I think the sense of urgency is. And I appreciate your take on education and you walk in that line.

[20:48] It's like a well educated, so to speak.

[20:51] I don't use that word fool. But like,

[20:54] there's a way that I think when people have an awareness,

[21:00] it's like, and you've heard this one, in an earlier podcast, I listened to several of them, but one was like, the roads have rules so you don't hit each other. That's why you've got stop signs, that's why you've got paint, double lines you don't cross; they're there to keep you from hitting the other person.

[21:16] So laws maybe don't change things. People still have car accidents,

[21:19] but very rarely will somebody do that because of the self harm inflicted. Okay. Because you're at risk. I think when people realize with their education the self harm that's at risk with information,

[21:30] then the advocacy just takes care of itself in terms of what that individual will do. Also,

[21:35] as you mentioned about getting stuck in rabbit holes. Now, I've read a lot of history post World Wars I and II, and that was a big industrial revolution, mind you; mechanization was big.

[21:44] So there's that. In the age of information,

[21:46] we have a blisteringly fast way to transmit notes and messages and information and media at a level that's unprecedented for consumption.

[21:57] The context being that there's just the loss of human personality.

[22:04] And what happens when the individual personality is lost is exactly how the clinging to any sort of directional force begins. And that can include regimes that have dictators,

[22:18] that can include corporatocracies that have deep sway and influence in social media. And so what happens when personality is a casualty, however perceived or influentially rooted out? That's a whole other debate about the ethics of a population.

[22:34] And nobody has personality.

[22:36] Well, that's the beginning of unrest and that's the beginning of division. And unfortunately that can lead to catastrophic outcomes, such as Germany in the 1945 era.

[22:47] Debbie Reynolds: Yeah, let's talk about that. It's very interesting. So to me I've been very fascinated with the cultural differences, especially between Europe and the US in the way that they think about privacy or data protection.

[23:04] And so because of their experiences with Nazi Germany,

[23:08] I feel that Europe has for decades been more like, let's collect less and protect what's out there more. Whereas we in the US are like, let's collect more and then try to protect whatever there is.

[23:25] Right. So it's a very different point of view. I'm all about prevention. And so for me, a lot of times when I work on projects,

[23:37] a lot of people are talking about, well, let's gather or create these goo-gobs of data and then try to protect it. Where I'm like, well, let's just collect less data.

[23:49] But what are your thoughts about those two different ideas around data protection or privacy?

[23:56] Toin Berry: Well, you know, there's the saying, an ounce of prevention is worth a pound of cure. But also, you said laws don't stop people from breaking them.

[24:02] Right.

[24:03] So there's kind of two pieces there, and I think part and parcel to that is, while prevention through data security and privacy is something that I particularly am biased towards, it doesn't mean, back to your earlier point about education versus advocacy.

[24:20] Like, I just keyed in on the point of how people give rise to Orwellian law; what ends up happening is the very hand that feeds you is the one that takes away the very convenience one had sought out with credit card payments and instant messaging.

[24:35] And instant-this ends up becoming not only dangerously depersonalizing, but when human agency in and of itself throughout a population is usurped and the personality is devoid.

[24:49] That will be. That's from whence come wars. It teeters into the ethics now of what happens; it's almost like the rebellions throughout all of mankind and all of history. And this becomes a very values-based situation, and it can open up a bigger can of worms about why certain groups seem to systemically, over mankind's time, be targeted.

[25:09] What is it that happens in these scenarios throughout human history, and then to the degree that history becomes curated with data? A typical example was the book burnings in Nazi Germany.

[25:19] Goebbels, the propaganda spin master. And also what can happen in the microcosms. And like you said, back to Europe: because they went through a catastrophic rise, from not just Francoist Spain and not just World War I, which was supposed to end all wars, but also the rise of the dictators,

[25:35] the rise of mechanization.

[25:36] It's interestingly corollary to the rise of corporatocracy back then, the age of the robber barons here, Carnegie and steel in the United States, when we were having our civil war, which was based on a lot of other things too.

[25:48] But when you start to really look at this it can be.

[25:51] It starts to extrapolate into the common denominator I see, which is what forced suppression happened to root out the personality of another.

[26:00] Once that happens, then there's going to be, unfortunately, ultimately other means taken that are not necessarily unforceful.

[26:09] And that's just because, in my own values, I believe the personality is a sacred event given to an individual, one that is to be revered and to be given its freedom.

[26:22] But of course we're humans, and, you know, there's rules on the highway because you can't just take a car and drive it any way you want. Right.

[26:30] Debbie Reynolds: That's true, very true, very true. What is happening in privacy or data right now that's concerning you the most, do you think?

[26:40] Toin Berry: I mean, Palantir is the latest one, right? Palantir is the latest one. And then obviously the scare has been out there post-Snowden, but now, post Patriot Act, we've got, I think, 82 fusion centers, which are the data distribution centers, you know, and a lot of people don't know.

[26:56] Well, you know, I've done an episode on those. Bruce Schneier's Schneier on Security is a great blog.

[27:02] Krebs on Security is a great blog.

[27:04] But the most concerning thing is that,

[27:06] that I'm finding is, well, really, the big one is this April coming up, the renewal of the Foreign Intelligence Surveillance Act right now. Big one.

[27:14] I hope they'll get that back to non-administrative subpoenas, because what happens, with the data centers getting built, is the information industrial complex meets post Patriot Act careerism within the FBI, DHS, intelligence officials, and then the usurped data they're getting. They're all wanting to build a house and have kids and

[27:33] build the American dream inside of intelligence, and they get careerisms. They need cases,

[27:39] they need data, so that information industrial complex expands, and then you've got mission creep. So these issues are all very prevalent: careerism mixed with mission creep, meaning that they just need to create cases as alibi for, quote, careerism.

[27:53] They need social segmentation. They need to target vulnerable populations that can't represent themselves, and that goes awry and that runs rampant.

[28:01] And so these are the same issues, if that gets reeled in. Now, if an administrative subpoena wants to be served, whether you're supposed to be busting up cells or busting up drugs or busting up trafficking or, you know, the dreaded messages.

[28:16] I don't want to even use the word here on air, because they'll use a certain underage population as a caveat to do anything they want, when it could be the Godfather I.

[28:24] I had a friend of mine who was taking his goddaughter, his god-niece, out on vacation. He got sweated and separated by the FBI in the Puerto Rico airport, he and his goddaughter, over trafficking.

[28:37] Like there you have it. Like what's up with that?

[28:42] There's a tendency for mission creep in an intelligence police-state era that will take any data to extrapolate justification for budgetary purposes, because they even train people on it, and that's been the real thing.

[28:57] If it gets reeled back in. I'm not going to say if the new departmental heads are going to do it,

[29:02] you know, movers and shakers on YouTube; there's John Kiriakou, everything he's been saying the past year, because he was an agent from the CIA that did serve time. He blew the whistle on them, and now he's on Joe Rogan for 4 million hits.

[29:14] His message is sinking in, the same message I'm telling you right now.

[29:18] So my concern, to answer that question in full, is that I hope that people are starting to see that, look, we want police, we want good police,

[29:27] sensible police. We want things with prudence. And the real button is, if FISA comes up and they say no longer can you just administratively, meaning Officer Vance and Stan Getz and Sunshine and Officer Comfort and Officer Lilliam can just now go over to Chicago, go to the Debbie Reynolds consulting website,

[29:47] get the 314 number, track and then arrest you for six months, unlawfully detained, because they didn't know what your podcast was really doing yet and you were a subject of an investigation. And under Rule 41 of the Patriot Act, yes, they can extrajudicially, capitally punish you without a court, under the guise of a non-patriotic actor that you could be construed as.

[30:08] So I'm not doing that to scare you; it's just that, if there is going to be legislative oversight in Rule 41, they'd have said, now you've got to go get a court order and say why. I'm using that as a personal example.

[30:19] I hope it doesn't cause offense, but I'm bringing gravity to it, because you're in a great space advocating education for a very apropos topic. Like we just said, if somebody wants to interpret that as a threat, good luck, because it depends on who perceives that.

[30:32] So that was kind of a provoking way of saying, I hope this April, and I send my thoughts and prayers, that they'll reel it in and say you just can't serve willy-nilly;

[30:42] you're going to have to go before a judge, and you're going to have to tell us what it is about Debbie or Toin or whoever, and why you want to go there and either extrajudicially, capitally punish them as a threat with a drone, or detain them indefinitely without a warrant, which is just as effective as going to jail for the rest of your life. And they can do it right now.

[31:02] Debbie Reynolds: I know, very disturbing. Definitely.

[31:05] Well if it were the world according to you and we did everything that you said, what would be your wish for privacy or data protection anywhere in the world,

[31:14] whether that be regulation, human behavior or technology.

[31:18] Toin Berry: First of all, I just, I love the way you put things, Debbie. It's just beautiful.

[31:21] It's kind of like old errors with new labels, as Archbishop Fulton Sheen would say. Too much of a good thing is not a good thing. It's prudence governing convenience and privacy.

[31:32] And in an ideal world,

[31:34] people can still look at content they want to look at, right? Because I myself don't want to become the very sort of police-state, Orwellian, authoritarian, dictatorial decree,

[31:47] nobody died and made me boss.

[31:49] But in part, it would be to keep these amendments moving forward, where privacy and convenience have become a catastrophe, and safety and security are encumbering freedom, and then the restoration of, you know, dare I say, common sense.

[32:03] But that's a very ecumenical way to put it, too. Because who am I to commandeer another person's preferences, either? I don't want to do that, but I don't want to cater to danger zones

[32:14] where societal targeting and stigmatization set in, and then eventually,

[32:19] like a cyber concentration camp of sorts: you can't get jobs, you can't do things, you get your trunk investigated.

[32:25] So constraints would go on some of these new eras of information, which I hope may happen.

[32:31] Debbie Reynolds: I share your concern there. I just think we,

[32:35] in the digital age,

[32:37] the idea has been that more data is better. And I don't think that that's true.

[32:43] And so we're seeing,

[32:45] and as people in Europe have seen,

[32:48] that even in the olden days, before computers, data was used in dangerous ways. That's what I'm concerned about right now in the digital age, where we have more access to data and more things can be inferred about people.

[33:06] So, yeah,

[33:07] you know, I'm glad we're having this conversation. I'm glad that people are listening to it. And because we have a global audience, I think this isn't something that is unique to our country.

[33:21] I think that we're seeing this happen all over the world. So it's just very important that we have these types of discussions.

[33:29] Toin Berry: And not only that, part and parcel to that is the movement. For instance, on the Dad Says America podcast just two days ago, there was a wonderful lady who came over from Maoist China in the 70s, and she went through it herself.

[33:43] She started to see all these same tokens of the inverted, totalitarian, communistic ways that were beginning to permeate and impede human freedom, to the point of very, very unfortunate circumstances.

[33:54] And she wrote a book. She was a mom, she had the American dream, and she ended up being a huge advocate. Just like Brax, who came over here from the Philippines: people who want to be American because they're choosing to be, for the reasons we just said. I think, like you,

[34:11] and if this podcast goes somewhere else and Brax hears about it, you get on other people's podcasts, because that's where it spreads. And then you get people like Joe Rogan.

[34:18] The new era of information can be leveraged for the good. And I do have hope for movements like that: a podcast like this spreads, and the next thing you know, a stay-at-home mom and her message are everywhere.

[34:31] And everybody who's carrying that message now, it's kind of a paradox, because the people who initially had a hard time coming over here, immigrating to pursue the American dream, and have actually attained it, have realized why they came here.

[34:45] My family has been here for generations; I was born here,

[34:48] so I was kind of used to it. Now I'm learning what it's like to have it back. And I was adopted by a Vietnam veteran, so I definitely carry a sense of patriotism in me.

[34:56] But that context, I hope, carries the gravity of episodes like this, and of you personally, your groundwork, and who you're talking to. I would love to advocate for you to some of these large YouTubers, if you'd prefer it, because of what you're saying and the way you ask questions. If you got in front of an audience with 50,

[35:14] or 80,000 views a week, and you're talking about this, and they're interviewing you, not just for the sake of the goodness of your podcast, because not everything's a good fit.

[35:21] We talked about that: not every gig's a good gig. But if meaning and purpose through personality are being ordained, and I believe that comes from the Creator, then that purpose and vitality is something I support.

[35:35] Debbie Reynolds: Oh, thank you so much.

[35:38] Thank you, thank you, thank you. Well, I really appreciate you being on the show. I love talking with you,

[35:46] getting your thoughts and ideas. And yeah, I think people will really love the episode, so I look forward to us being able to find ways that we can collaborate in the future.

[35:56] Toin Berry: Oh, I'm thrilled. I follow you, and I'm going to be doing everything I can to mention you to a couple of people, if you don't mind. I'll try to go through the people at, I think it's, Rob Braxman.

[36:05] I call him Brax. He talks data privacy; he's got 120,000 subscribers. And Naomi Brockwell's foundation, she's got the PGP guy on her board as a founding nonprofit. So I'll do what I can for you, because you paid it forward with me.

[36:19] So I really appreciate you.

[36:20] Debbie Reynolds: Ah, thank you. Thank you so much. Well, we'll talk soon.

[36:25] Toin Berry: Okay.

[36:25] Debbie Reynolds: All right. Thank you.

[36:27] Toin Berry: Thank you. Bye.

[36:28] Debbie Reynolds: Bye.
