E270 – Filipe Pinto, Researcher and Strategist, Author of Consumer-Controlled Digital Twin Architecture
[00:00] Debbie Reynolds: The personal views expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.
[00:12] Hello, my name is Debbie Reynolds. They call me the Data Diva. This is the Data Diva Talks Privacy podcast where we discuss data privacy issues with industry leaders around the world with information that businesses need to know.
[00:25] Now I have a very special guest on the show from Washington D.C., Filipe Pinto. He is a researcher and strategist, and also the author of
[00:36] Consumer-Controlled Digital Twin Architecture, and a consumer-controlled data absolutist.
[00:41] Welcome.
[00:43] Filipe Pinto: Thank you, Debbie. It's a pleasure to be here and thank you for inviting me. I really appreciate it.
[00:48] Debbie Reynolds: Well, yeah, I love data people. So you and I connected on LinkedIn and we had a chat, I think we had a meeting, and we just hit it off right away, and I thought, oh my God,
[01:00] I love conversations where I feel like we could have recorded what we talked about and that would have been the podcast. Happy to have you here. Happy to have you on.
[01:08] You just have a fascinating background in data, and the things that you're interested in and researching are tremendous. But give me an idea of your journey. How did you come to this stage and this state in life, and your work in digital twins and AI and all things data,
[01:28] right?
[01:29] Filipe Pinto: So I've been fascinated.
[01:32] Educationally, I'm an electronics engineer, okay.
[01:37] And from an early stage in my career, I've been fascinated with digital twins, okay?
[01:42] Digital twins as a mechanism for hyper-efficiency, right?
[01:48] The ability for systems to sense and learn and adapt in real time.
[01:54] And then, combined with that,
[01:57] there's this innate alignment with the Enlightenment philosophy, right? The Enlightenment rebellion,
[02:07] the conviction that freedom begins with ownership of the self. Right.
[02:15] And I wanted to carry this to the digital age.
[02:20] And so in 2008 I initially went the startup route,
[02:29] to give control of the digital twins to people.
[02:34] Not corporations, not OEMs;
[02:37] give the control of digital twins to people. But I realized that there were a lot of missing building blocks, okay.
[02:44] And so I decided to veer off to the research route, and I started a PhD.
[02:52] And this notion of the consumer-controlled digital twin architecture is the result of my PhD. So on that research route, I went deep down the rabbit hole of privacy from a legal standpoint,
[03:11] from a philosophical standpoint, and from an economic standpoint, right?
[03:16] And I tried to organize this all into one.
[03:21] And so that's how I got here. But above all,
[03:25] it's this deep-seated belief in the self, in freedom,
[03:32] in our ability to be free agents in the digital era, and in the ability for us to have control of our data not through legislative processes,
[03:43] but de facto control, meaning the data is on my side.
[03:48] Okay.
[03:49] There is no cloud.
[03:53] I believe in the personal AI paradigm, in sovereign compute,
[03:57] in the personal cloud.
[04:00] Everything at the edge, everything close to the consumer. So I think that's been my journey.
[04:06] Debbie Reynolds: That's fascinating. As you're talking, just so many wheels are turning in my head. First of all, I just want to clarify for people.
[04:13] So I work in all types of industries because I'm a data person.
[04:17] So my folks in IoT think a digital twin is something different than what we're talking about. For them,
[04:24] a digital twin is a representation of maybe a physical space or some type of system, whereas when you're talking about digital twins, you're talking about a digital self-sovereign identity and who owns and controls it.
[04:39] Correct?
[04:40] Filipe Pinto: So, very good point. And I appreciate you making that clarification, Debbie, because people may confuse the two. When I speak about digital twins, I'm talking about the digital twins of your wearables, of your iPhone,
[04:55] of your car,
[04:56] of your washing machines.
[04:58] Okay.
[04:59] It's not a coincidence that in my paper I talk about Tesla and the fact that they've been an early adopter of digital twins for their cars. There's a digital twin for every Tesla out there, and it is being used to train full self-driving, and it's giving them these huge,
[05:16] humongous valuations and all of this stuff. And then it's also a curious segue into that other idea, which is that when we aggregate all of these device digital twins,
[05:34] we can create the personal digital twin, which then becomes a replica of us.
[05:39] So looking ahead, in section five of my paper, I point to this notion of a consumer-controlled personal digital twin,
[05:49] okay,
[05:50] which then becomes our representative in the digital realm,
[05:55] a representative whose AI and data are all controlled by us, and that we can then let loose in the digital world,
[06:06] right? In other words,
[06:07] data we control and an agent we can trust.
[06:13] Debbie Reynolds: Excellent. So I want your thoughts.
[06:15] I have tons of ideas now. Two things.
[06:18] One is that I find that regulation is too narrow because not all data is regulated. Right. I feel like anything that I produce for myself should be personal to me.
[06:31] Right. And I should be able to do whatever I want to with it. But then also on the digital side of things,
[06:40] not everything about a person is in digital spaces. And so to me, that creates a situation where inferences can happen or there's kind of an overfit of identification about you.
[06:57] For instance, let's say a digital system only knew that you like peanut butter and jelly. Okay?
[07:02] They could probably say, well,
[07:04] the only thing that Filipe eats is peanut butter and jelly, because that's the only information that I have. Or he loves that, or he eats it all the time.
[07:12] And so those are both kind of the two gaps that I see in those spaces. But I want your thoughts.
[07:18] Filipe Pinto: That brings up a very interesting concept here, which is this notion of biased AI, based on the narrow sets of data that they consume,
[07:30] or because one group is overly represented over another, or one group is not even represented. Right.
[07:37] And so that is something that I think we need to research further. Because look at this,
[07:46] Debbie. There are aspects of this enchilada where I can go very deep, and there are aspects where my knowledge is kind of surface-level. Right. And one of the things that I most appreciate about my PhD journey is that it taught me to say: that issue is fascinating, but I still don't have the answers.
[08:12] And I'll tell you one example of why I say this.
[08:16] The biggest criticism people can make about my architecture, my proposal, is: Filipe, you're pushing everything to the personal cloud,
[08:24] which means that there will have to be consumers willing to invest in buying this personal cloud. There is an investment.
[08:33] What about the Native American who lives on a reservation, with very little ability to spend all of this money? What about them?
[08:44] How are they going to claim their digital self? And I will flat out tell you, Debbie,
[08:52] in the attempt to reduce the externalities of technology,
[08:57] we constantly create new ones.
[09:00] And so there's an innate disenfranchisement in this.
[09:04] Do I have solutions to close those gaps?
[09:07] Yeah, I do.
[09:10] I think that one way to do this is through subsidies for certain types of technologies.
[09:19] But again,
[09:21] one of my objectives,
[09:22] Debbie, is to bring together a coalition of people:
[09:27] economists,
[09:28] you know, legal scholars, ethicists,
[09:32] technologists, and people like you, people who can communicate with the public,
[09:38] communicators, to try to resolve this in an open-source approach.
[09:45] I don't think anyone has a solution. I don't think organizations have a solution. They have totally different incentives.
[09:53] And I think it's people like you, people that can bring people together.
[09:57] And I don't know if I answered your question, but the bottom line is, Debbie, I don't have all the solutions.
[10:03] Right?
[10:03] Debbie Reynolds: Yeah, totally.
[10:05] Filipe Pinto: And that's a very critical problem.
[10:08] Debbie Reynolds: Yeah, well, that's why we're talking, because we're trying to all figure it out. There's so much to do and so much to think about.
[10:17] So the way the data is today,
[10:19] I feel like we can't continue what we're doing.
[10:23] The way data is today, it's everywhere. It flows everywhere.
[10:28] People don't have control over it. Decisions are being made about people with data they may not know about.
[10:35] People are concerned, right? They want more control, especially to be able to take back their data or be able to like say, I don't want you to use my data anymore.
[10:45] Data systems, I tell people data systems are made to remember data, not to forget it. So it's the opposite of privacy and it's the opposite of what some of these regulations are trying to do.
[10:56] But then also, operationally, organizations really struggle with trying to do things like have people opt out of things or getting rid of their data. And so when I think about self-sovereignty,
[11:14] I think about a future where a person will be like a bank of their own data, and they can decide whether they want to share or not. And it's a very dynamic situation where, for that data,
[11:27] the company has to come to them as the source of the data, and they can shut data off or on, like a faucet.
[11:33] And so that's what I think. But I want your thoughts.
[11:36] Filipe Pinto: I mean, Debbie, this is why I think on our first conversation we enjoyed each other's thoughts so much: because we are very aligned.
[11:46] I just happen to be aligned from the engineering perspective, so to speak. Right? But it's amazing how many people think like you, and how many people are aligned with us, right?
[11:57] So let me tell you one thing. Before I do, I agree with everything you said, but let me just go a little bit deeper, right?
[12:05] So the way I see my work is as the way to evolve to an era of engineered privacy,
[12:15] not legislative privacy. Okay?
[12:19] So, as you were saying about my data bank, right,
[12:23] people should come to me to see if I will share the data or not.
[12:27] I've evolved that to a slightly different concept, right?
[12:32] So my concept is: the data stays with me.
[12:37] So I have de facto,
[12:39] not de jure,
[12:41] not legislative, control. I have de facto control.
[12:44] The data is on my end. I have physical control over my data, okay?
[12:51] Because the way we do this is through the digital twin. For instance, the digital twin of this watch, which today runs somewhere in the Apple cloud, now runs on my side, runs on what I call the edge gateway,
[13:04] okay? So the data is on my side.
[13:07] So I've evolved this notion to the post-consent era.
[13:14] There's no consent, because the data never moves.
[13:18] AI models move, AI parameters move, and everything is trained and inferred on my side,
[13:26] okay?
[13:27] So this is what I call engineered privacy, versus what we've been doing, which is legislative privacy. So what I want is: don't claim privacy,
[13:40] engineer it.
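To make Filipe's "models move, data stays" idea concrete, here is a minimal sketch of that flow. Every name in it (EdgeGateway, train_locally, the toy linear model) is a hypothetical illustration, not his published architecture:

```python
# Minimal sketch of "models move, data stays" (hypothetical names throughout):
# the vendor ships model parameters to the consumer's edge gateway, training
# happens locally, and only updated parameters (never raw data) travel back.
import numpy as np

class EdgeGateway:
    """Consumer-side gateway that holds the device's raw telemetry locally."""

    def __init__(self, local_data, local_labels):
        self.x = np.asarray(local_data, dtype=float)   # never leaves the gateway
        self.y = np.asarray(local_labels, dtype=float)

    def train_locally(self, weights, lr=0.01, epochs=50):
        """Accept vendor weights, run gradient steps on local data, and
        return only the updated weights (a toy linear model, for brevity)."""
        w = np.asarray(weights, dtype=float).copy()
        for _ in range(epochs):
            residual = self.x @ w - self.y
            w -= lr * (self.x.T @ residual) / len(self.y)
        return w  # parameters move; the data stays put

# Vendor side: sends parameters out, gets parameters back, sees no telemetry.
rng = np.random.default_rng(0)
gateway = EdgeGateway(rng.normal(size=(100, 3)), rng.normal(size=100))
vendor_weights = gateway.train_locally(np.zeros(3))
print("Vendor sees only weights:", vendor_weights)
```

In practice such local updates are usually combined with safeguards like secure aggregation or differential privacy, since, as Filipe concedes a little later in the conversation, parameter updates can still leak some information about the data they were trained on.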
[13:42] And that's what I say. Now, you may ask: Debbie, how do we get here? And how we got here is, to me, a fascinating thing,
[13:52] right?
[13:53] And I'm sure you talk to a bunch of people, and they'll give you different ideas, right?
[13:58] So here's my version of how we got here, okay?
[14:01] So Kodak releases a camera that has a hundred exposures on a roll inside.
[14:10] And the idea was: for $25, you take the hundred pictures, you send the camera to Kodak, and they send you back the prints.
[14:16] So no more cloth over your head; you just shoot the picture.
[14:22] And so what happened was a bunch of journalists, sensationalist journalists, decided to take pictures of
[14:30] the socialite parties, the elite parties,
[14:33] until they did so
[14:35] at a party of two preeminent attorneys.
[14:39] And these two attorneys then get together and write a paper about the right to be left alone,
[14:48] which to me is the event that triggers privacy.
[14:52] Right?
[14:52] Here's my point.
[14:54] Technology,
[14:55] from that Kodak camera up until today, has never been engineered to empower privacy.
[15:07] We develop the innovation, and then we think about privacy.
[15:12] You see what I'm saying? And so we did that with the computer, we did that with the Internet.
[15:18] We're doing it now with AI. God knows we're doing it with AI. We shoot first and ask questions later. And so we are always one step behind.
[15:28] And with this approach,
[15:30] it will always be impossible for us to get privacy. And so here is my claim: don't claim it,
[15:38] engineer it. And I'm sorry I went around this big loop, but to me it's important that you understand
[15:47] where I see privacy.
[15:49] Because one of the issues that I see many times,
[15:52] Debbie, is that because I'm an engineer,
[15:56] when I talk to legal scholars (and there's a huge number of legal scholars thinking about privacy, thinking about how we're going to do privacy in AI),
[16:08] they always look at me and ask: can you articulate the history of privacy from a legal standpoint?
[16:16] And I can.
[16:17] Or when I talk to ethicists, ethics scholars, they ask: can you articulate the ethics of privacy? And I think I can.
[16:28] But you see, it's this important thing, Debbie: when we are talking to these different audiences, we can say, hey, I know more or less how you frame this,
[16:40] what your perspective is about this topic,
[16:43] right? Because one of the biggest discussions is about data propertization,
[16:50] data as property. In Europe, it's blasphemy:
[16:54] data cannot be property.
[16:57] Right.
[16:57] And they've built their approach to data on that premise.
[17:02] Over here in the United States,
[17:04] we know that corporations rule the world. So property, schmoperty: for them, it's the same thing, right?
[17:10] They're not going to worry too much.
[17:12] But there are a bunch of concepts here. So, data propertization: am I in favor of it or not? And the answer is,
[17:22] if you own the data,
[17:24] does that discussion even take place?
[17:27] I don't think so. The data is on your end.
[17:30] Say there is an OEM. Because look at this, Debbie. There's a smartwatch manufacturer,
[17:37] not Apple,
[17:38] but a smartwatch manufacturer out there, that says:
[17:41] I want to engineer my smartwatch with privacy,
[17:45] with privacy in mind.
[17:47] I'm going to develop it, I'm going to manufacture it,
[17:50] and I'm going to use this architecture to allow the consumer to have the twin on their end.
[17:57] I, as a manufacturer,
[17:59] am willing to stay away from the data.
[18:03] I am willing to have my models trained on the consumer side, and to pay for the training, versus what we have today. And you and I know what today is:
[18:18] the more data,
[18:19] the more power the corporation has.
[18:22] So, I'm sorry, a long enchilada again, but I just wanted people to understand a little bit more about the perspective I have on privacy and why I say certain things the way I say them.
[18:37] Debbie Reynolds: Yeah,
[18:38] I agree with you. So I think when people think about privacy in a legal sense, the problem with it is that for me, law is more about the rearview mirror,
[18:49] not the future. And so that is the problem, because in order to really protect someone's privacy, it has to be more proactive as opposed to reactive.
[19:00] And I think what you're articulating is that you want to solve a problem before it's a problem.
[19:07] So the problem is once you share the data out, people lose control.
[19:12] So you're saying, let's handle the control part up front and then we don't have to worry about these other downstream effects, correct?
[19:20] Filipe Pinto: Absolutely. Now, in the name of intellectual honesty,
[19:25] you may say: but, Filipe, if I send you my model to train on your data,
[19:32] wouldn't there be some privacy leakage, some data that goes into the model?
[19:37] And I would say: yes, in the name of honesty. But we have moved the ball forward.
[19:44] Is it going to be perfect? No.
[19:47] Are there technologies that will eventually solve this issue? Yes. Homomorphic encryption, and other ways of encrypting data so you can train without leaking data.
[19:58] The problem is that right now homomorphic encryption is very, very slow. It's incredibly slow. Right.
[20:05] Debbie Reynolds: And expensive.
[20:06] Filipe Pinto: And expensive. Expensive.
[20:08] But I think that we need to continue to take the ball forward.
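For readers meeting the term for the first time: homomorphic encryption lets you compute on data while it stays encrypted. A toy sketch of Paillier encryption, an additively homomorphic scheme and a much simpler cousin of the fully homomorphic schemes alluded to here, shows the principle; the key sizes below are insecure demo values, and this is an illustration, not production cryptography:

```python
# Toy Paillier cryptosystem: multiplying two ciphertexts adds their plaintexts,
# so a server can sum encrypted values without ever decrypting them.
# Demo-sized primes only; real deployments use keys thousands of bits long.
import math
import random

p, q = 999983, 1000003           # small well-known primes, insecure on purpose
n = p * q
n2 = n * n
lam = math.lcm(p - 1, q - 1)     # Carmichael's lambda for n = p * q
g = n + 1                        # standard simple choice of generator
mu = pow(lam, -1, n)             # modular inverse of lambda mod n

def encrypt(m):
    r = random.randrange(2, n)   # fresh randomness for every encryption
    while math.gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    x = pow(c, lam, n2)
    return (((x - 1) // n) * mu) % n

a, b = 4200, 137
c_sum = (encrypt(a) * encrypt(b)) % n2   # homomorphic addition on ciphertexts
assert decrypt(c_sum) == a + b
print("decrypted sum:", decrypt(c_sum))
```

Fully homomorphic schemes extend this idea to arbitrary computation, including model training, which is exactly why they are currently so slow and expensive, as Debbie and Filipe note.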
[20:16] And let me just put something in parentheses over here real quick, Debbie, which is this notion:
[20:22] I mean, there are the people you talk to, Debbie, people that are your friends in the self-sovereign identity arena, right? And I'm sure there are others, but people have been fighting and preaching to the fish for the past 20 years.
[20:38] The self-sovereign identity movement started in the early 2000s,
[20:44] right?
[20:45] Federated identity, from 1999: people started saying, hey, Microsoft wants all our data. What's up with this Passport,
[20:52] right?
[20:53] And then it turns out Microsoft has a great engineer,
[20:57] Kim Cameron,
[20:58] okay? He is an amazing engineer who joins what then becomes the precursor to the self-sovereign identity movement.
[21:06] And these people have been fighting and architecting, right? Evernym, then the first Indy blockchain, right?
[21:16] So I think we stand on the shoulders of a lot of other people, and we just need to do our part.
[21:23] That's all I wanted to say.
[21:25] Debbie Reynolds: Well, I think one of the key things that I tell people about this is that first of all, I think it will happen. I think it has to happen.
[21:34] Because the way that we're going now with data is unsustainable, I feel.
[21:39] But also, the technology has,
[21:42] you know, exponentially improved in those 20 years, in those many years, so that the things that we were talking about philosophically can be done now.
[21:55] Now it's like, do we have the will to do it? Can we fight the incumbents, who still want data in a huge bucket where you have to go to them, right?
[22:06] And the technology exists now where you really don't have to do that, but it needs to be more of a groundswell, getting more companies on board with the fact that this really solves a lot of problems that we're having right now.
[22:22] But then on the flip side, I want your thoughts on this. A lot of people make money on these problems that we're talking about. So for them, in whatever realm they're in,
[22:32] they may be saying: well, let me continue on the path that I'm on, because I'm making more money this way. But I think in the future that will dry up.
[22:43] I don't know. What do you think?
[22:45] Filipe Pinto: So, very good question. And when I talk to the economists, especially the specialists in the data economy, that's something that immediately comes up. And one of the criticisms is: Filipe,
[22:56] what in God's name would ever lead Apple to get away from the data drug?
[23:05] Because to them,
[23:06] they can't even fathom their existence without the data from the consumers. I mean, it's just ingrained,
[23:12] right?
[23:13] And my answer is this.
[23:17] There is always a new era.
[23:20] And I think that the new era that is emerging is 3D printing, okay? The ability for you to build devices that are currently built by large OEMs, large manufacturers, right?
[23:36] And so my claim is to say this:
[23:39] imagine a company that right now prints cars, okay?
[23:44] Case in point:
[23:45] Divergent 3D, a company out in California that builds supercars at this time; actually, they build the machines that build the cars. There's a world in which we can go to these guys and say: hey,
[23:58] don't build the infrastructure for surveillance.
[24:02] Don't build it.
[24:03] Use this other architecture. Give the data to the consumers. Give the telematics of your cars to the consumer, right?
[24:10] And so what I'm seeing here is a possibility,
[24:14] okay. Because short of a miracle,
[24:17] I mean, short of a miracle, Apple, Tesla,
[24:21] Microsoft,
[24:22] Bosch,
[24:24] Samsung will never give up their data.
[24:27] Never,
[24:28] right?
[24:29] Their investors would just abandon them, right?
[24:33] So our hope is that these new industries that are emerging,
[24:38] okay,
[24:39] will be able to accept this new era of engineered privacy.
[24:45] I don't want to spy on my consumer.
[24:49] Even if my only objective were to get data to improve my product,
[24:56] even if that were the case, I would still be amassing this massive amount of personal data.
[25:01] I want to not do that,
[25:04] right?
[25:04] And so, again, in the name of intellectual honesty, I don't want people to think that I have some crazy idea that we are going to influence Apple. No,
[25:13] we'll never do that.
[25:14] I mean,
[25:16] Debbie,
[25:17] the data is what is really driving the valuation of their business.
[25:21] Tesla stock is a reflection of that. It's not because they build that many cars. It's because everyone knows that FSD is going to power robots; it's going to power everything that has to see.
[25:35] So that is my perspective on that, Debbie.
[25:39] Debbie Reynolds: So what is the incentive, then, for companies to move in this direction? And I'm not talking about the Apples.
[25:47] Filipe Pinto: Yeah, yeah.
[25:48] Debbie Reynolds: Well, to me, OS makers like Apple are different than other types of companies, because this is pretty much Apple's mantra:
[25:59] trust us with your data because we care about your privacy. And so people do give them data, because they're like: okay, I feel like I get something in return for what I use.
[26:13] And so to me, that's probably the best that you can ask for in that type of situation.
[26:18] But how do we get other companies to move towards this new type of thing? And so, before you answer,
[26:25] I'm going to bring up another Kodak example. So remember
[26:30] Kodak, when they were doing their regular print business.
[26:35] Well, first of all, Kodak made so much money on the development of film that they did not want to give that up.
[26:44] And then when digital came out, they didn't want to do that, because they were like: well,
[26:49] if we do digital,
[26:50] then we are going to give up all this film business. And so they're nowhere now,
[26:56] right? Digital is the thing. And so that was a really bad play for them. So I feel like we're kind of at this moment now, where if companies don't try to pivot, it's going to be,
[27:11] you know, maybe not tomorrow,
[27:13] maybe not in five years. But, you know, I think the money train is going to stop at some point, and companies need to change. But what are your thoughts?
[27:22] Filipe Pinto: Okay, so a lot of good points over there, and I just want to make sure that I address them. Okay,
[27:29] I agree 100%.
[27:32] No buts, no howevers, Debbie. Okay. Apple seems to be taking a slightly more consumer-centered approach to the way they use data.
[27:46] And one could argue that the fact that they're not on the leading edge of AI may be a consequence of that idiosyncrasy: hey, I don't want to spy on my customer.
[28:00] So I completely understand that posture. And look: this is an Apple phone,
[28:06] this is a Mac, and this is an Apple Watch. Right.
[28:11] Having said that, you bring something very interesting.
[28:15] Are we at a Kodak moment in the privacy sector?
[28:21] Are we at that Kodak moment that says: you didn't embrace privacy,
[28:26] so you're going to have the same result as Kodak, which went obsolete because no one wanted film anymore?
[28:35] My answer, with what I know right now, Debbie, is that I don't think we're there.
[28:42] And the reason I say I don't think we're there is that we are in this race with the Eastern bloc over who gets to AGI first. Right?
[28:56] And so what we've seen is that all impediments have been removed to accelerate that race. Right.
[29:04] I mean, there was a piece of legislation that said, you know,
[29:08] that you can't, over the next 10 years, sue anyone or try to stop the data flow.
[29:16] And so, no, I don't think we are there.
[29:19] I think that the European efforts, the AI Act, are also not going to do much about it.
[29:29] Because the way I see it, there are similarities between this race and the race during the Second World War to see who got to the atomic bomb first,
[29:42] right?
[29:43] We did everything we needed to do to make sure we got to that technology first. I think there's a parallel there.
[29:51] Am I hopeful about the future?
[29:53] Absolutely. Otherwise we wouldn't be doing this. Okay.
[29:57] We just need to raise our voices. And I apologize for the analogies, but I'll give you one more.
[30:06] What I call the 3D printing world, right? I call them the mammals of the Jurassic era.
[30:16] So they're still there. They're very small. They have a very small portion of the market. This is the world of the dinosaurs, right?
[30:23] The world of the big OEMs, the people that consume data.
[30:28] They're dinosaurs. They consume data, right?
[30:31] And so there will be an event where we will say: hey,
[30:36] 3D printing can now print electronic devices. We're not there yet,
[30:43] okay.
[30:44] When we have either nanobots or 3D printing or whatever manufacturing process that gives us compelling devices, maybe not as sophisticated, but compelling devices, we can say: hey, Debbie,
[31:00] do you want the watch that sends data to Apple, or the watch that sends data to your cloud?
[31:08] When we get to that parity, even if it's not fully equivalent, when we get to that stage,
[31:15] I think we'll get there.
[31:18] Now,
[31:19] to continue the analogy of the dinosaur, this is where I see the comet:
[31:26] we are about to enter the humanoid robot era.
[31:32] And the question I put back to you, Debbie, is this.
[31:37] Would you feel comfortable having a Tesla bot in your house,
[31:44] seeing how you do things, how you like your kitchen cleaned,
[31:49] when you like your kids to be put to bed?
[31:53] Do you feel comfortable with having all of those processes of your life in Tesla's cloud?
[32:01] Debbie Reynolds: No. I will probably buy a Louisville Slugger baseball bat.
[32:08] Filipe Pinto: So that is my point,
[32:10] right? There are little mammals crawling over there. They still don't know where they want to go,
[32:17] but I think we need to bring them up to speed. And I think the aha moment is going to be: wait a minute, I have a droid inside my home, and it can do this to me, right?
[32:30] This is an example that I think is one of the most provocative I've come up with so far, which is:
[32:36] when will we wake up to the fact that these droids will be able to testify against us in court?
[32:44] Debbie Reynolds: Yeah, well, smart devices already do that. Right.
[32:50] Filipe Pinto: But a droid, just because it has that much more information: it can listen, it can see, it can see your processes and say, was Debbie at home? Yes or no.
[33:02] Debbie Reynolds: Yeah, well, that's what your thermostat does. Basically.
[33:05] Filipe Pinto: It's like,
[33:06] so are we there yet? Are we at that Kodak moment, Debbie?
[33:11] We're getting to it fast.
[33:13] Let's hope that your voice and the people that you bring onto your podcast get a Joe Rogan-level audience,
[33:22] right? We need your voice to reach Joe Rogan levels to really bring these issues up.
[33:29] Debbie Reynolds: Oh, thank you so much. Yeah, I think it's really important. My view is that we need to share less.
[33:40] And so how do we do that? There are technological ways,
[33:44] and there are some low-tech ways to do that. I think a lot of people are thinking about that as well. And so I think the future is going to be really interesting, because there are so many technologies now that are really being pushed and invested in.
[34:00] And so I think that a lot of the answers we have about technology will come because there will be more advancements coming more rapidly,
[34:11] probably because of AI and
[34:13] the money that's being spent there. I think the escalation of the innovations will come fast, and we'll be able to do a lot more of the things that we're talking about today, for sure.
[34:27] Filipe Pinto: Completely agree with you, Debbie. And AI,
[34:30] this notion of the AI co-founder, right? The idea that Debbie is now Debbie plus a bunch of intelligent agents:
[34:39] one that researches, sends emails, reaches out, reviews the papers.
[34:43] Your voice will now be multiplied by AI.
[34:48] I do think that you are right. There are a bunch of contributions, and we need to recognize them.
[34:55] And one thing that is important as well, one thing I would like to ask you, Debbie: if you were to look back and see what was the one technology that was developed, that was engineered, for privacy,
[35:12] what would you say?
[35:16] Debbie Reynolds: Probably the one that could be most capable of being something that can enable privacy? Is that what you mean?
[35:24] Filipe Pinto: Yes. Looking back. Because, as I said, we've never engineered for privacy. We've never engineered technology for privacy, with the exception of one.
[35:35] Debbie Reynolds: Oh, wow. I wouldn't know. I don't know. Tell me, tell me.
[35:39] Filipe Pinto: So the only technology, and this is fascinating, the only technology that was engineered for privacy was cryptography.
[35:48] The cypherpunks, when they decided: hey, let's do PGP, right? Let's encrypt information so we can send it without everyone knowing.
[35:56] And that, in combination with digital money,
[36:00] leads to Bitcoin.
[36:03] Bitcoin initiates the blockchain revolution.
[36:07] That's how we got decentralized identifiers, and so on and so forth. So there are a lot of small contributions.
[36:16] It would be great to have another contribution such as cryptography.
[36:21] Debbie Reynolds: Okay.
[36:21] Filipe Pinto: You know,
[36:23] and I don't know what it might be. And I think we will find out, because you will find those people, bring them together, and we'll invent the next phase.
[36:33] Debbie Reynolds: That's so cool. Well, I think we will have to join two things together. So when I think about cryptography,
[36:40] the way I think of it
[36:42] is different from privacy:
[36:45] I think of cryptography as a key to a house,
[36:49] and I think of privacy as the deed to a house. Right.
[36:53] So in my mind, those two
[36:55] can work together in some ways to create what we truly want, which is: I have a right, and this is how I protect the right that I have.
[37:07] Filipe Pinto: Yes. And don't get me wrong, privacy is incredibly multidimensional.
[37:15] Debbie Reynolds: Totally.
[37:16] Filipe Pinto: One of the things I did during my PhD was a mind map.
[37:19] And the mind map tool that I use usually runs out of memory, because privacy is that multidimensional.
[37:26] And so I completely agree with you. Cryptography, as the geek in me
[37:33] will tell you, yes,
[37:35] that is something I can look back on as the trigger,
[37:39] the start, the genesis of consumer empowerment. Without it, we wouldn't have built all of the other blocks. That was my point.
[37:49] Debbie Reynolds: That's fantastic.
[37:51] So, Filipe, if it were the world according to you, and we did everything you said, what would be your wish for privacy anywhere in the world, whether that be human behavior, technology, or regulation?
[38:04] Filipe Pinto: So, at the risk of coming across as repeating myself:
[38:08] I want us to stop legislating privacy. I want us to engineer it,
[38:13] okay.
[38:14] I want us to produce,
[38:16] to create devices in which privacy is not an afterthought.
[38:21] Privacy was an afterthought for the Internet.
[38:24] Privacy is going to be an afterthought for AI.
[38:28] So I think that a privacy-engineered world
[38:33] is the only way for us to have our digital self-determination.
[38:38] Okay.
[38:39] And in my perspective, with the limitations that we have so far,
[38:44] that points towards the personal cloud, towards sovereign compute, towards edge compute. So if I can wish for privacy,
[38:57] I wish for a model in which we have de facto control. The data stays with us, in that data bank you talked about, Debbie.
[39:08] It doesn't move. My data stays there.
[39:11] I may wish to train models, or not.
[39:14] I need to be able to exercise the right to be forgotten,
[39:18] okay.
[39:19] Because if the data is here,
[39:21] I can just twin or untwin my devices. I can have my devices share information about me, or just say: untwin,
[39:29] go do your thing.
[39:30] No data is moved.
[39:32] And so that's what I wish for. I wish for an era of engineered privacy.
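As a thought experiment, the twin/untwin mechanics might look something like the sketch below on a personal gateway. Every name in it (PersonalEdgeGateway, twin, untwin, ingest) is a hypothetical illustration of the idea, not Filipe's published design:

```python
# Hypothetical twin/untwin sketch: the twin's state lives on the consumer's
# own gateway, so "forgetting" a device is a local delete, not a legal request.
class PersonalEdgeGateway:
    def __init__(self):
        self._twins = {}                  # device_id -> locally stored readings

    def twin(self, device_id):
        """Start mirroring a device's telemetry locally."""
        self._twins[device_id] = []

    def ingest(self, device_id, reading):
        if device_id in self._twins:      # untwinned devices are simply ignored
            self._twins[device_id].append(reading)

    def untwin(self, device_id):
        """Forget a device: de facto erasure, no third party involved."""
        self._twins.pop(device_id, None)

gw = PersonalEdgeGateway()
gw.twin("watch-01")
gw.ingest("watch-01", {"heart_rate": 62})
gw.untwin("watch-01")                     # the data is gone, because it was ours
print(gw._twins)                          # -> {}
```

The point of the sketch is that erasure becomes an ordinary local operation: no request to a data controller, no compliance workflow, because the data never left the consumer's side.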
[39:40] There's a role for everyone.
[39:43] We will push the boundary of privacy forward. We'll come across other problems, and for those we need the legal scholars.
[39:53] What I think we need to accept is that the legal route, and I think you and I agree on this,
[40:00] has not led to much improvement. If anything,
[40:04] I'm extremely ****** off that every time I go to a website I have to click the same thing:
[40:10] "I agree," "Don't agree."
[40:11] Debbie Reynolds: I agree.
[40:12] Filipe Pinto: I know there are people coming up with a standard that will eliminate that.
[40:17] But no, I mean that's the only real thing I see.
[40:22] And I still feel that the large OEMs, the large software oligopolies, still run with impunity.
[40:34] I mean,
[40:36] so sorry I went so long.
[40:37] Debbie Reynolds: But no,
[40:40] that was amazing. And the thing that you said, which to me now, in my mind, is the missing link: the determination.
[40:49] You said self-determination, and the determination part is what has been missing.
[40:55] Yes. So, yeah, perfect. Perfect. Well, thank you so much. This has been so much fun to talk with you. You're amazing.
[41:02] Please follow Filipe on LinkedIn. I just love
[41:07] the things that you say, and your work is fascinating. So, yeah, I look forward to talking with you soon.
[41:13] Filipe Pinto: Thanks so much, Debbie. I really appreciate your time. And we need you to get to Joe Rogan level.
[41:18] Debbie Reynolds: That's right.
[41:19] Filipe Pinto: I'll help you in anything I can.
[41:21] Debbie Reynolds: Okay, thank you. I really appreciate it. I'll talk to you soon.
[41:26] Filipe Pinto: Thanks. Debbie.