E37 - Debra Farber, Investor and Advisor on Privacy Tech


 

Debra_Farber

43:45

SUMMARY KEYWORDS

privacy, companies, data, tech, problems, people, business, regulation, technology, tool, customers, security, focusing, individuals, compliance, blockchain, question, feel, collect, build

SPEAKERS

Debra Farber, Debbie Reynolds

 

Debbie Reynolds  00:00

Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations. Hello, my name is Debbie Reynolds, and this is "The Data Diva Talks Privacy" podcast, where we discuss Data Privacy issues with industry leaders around the world, with information that businesses need to know right now. So today, I have a very, very special guest on the show, Debra Farber. She is the CEO of Principled LLC, and she is a strategist and a lawyer, but her focus is technology. So hello, Debra.

 

Debra Farber  00:45

Hi, it's great to be here. I'm actually a lawyer by training, but I've never practiced. So I want to make that clear. And, yeah, it's really great to be here to have this conversation.

 

Debbie Reynolds  00:59

So we met many, many years ago; we ended up on a panel at a conference in California, and that was several jobs back, right. I think you were at BigID at that time. But you've had a lot of really big jobs in privacy; you've been at AWS, Prime Video, just a lot of different places. But I feel like your skill set is very unique. Obviously, you're a smart cookie, you're an executive, and you focus a lot on the technology side of privacy. And I think this is really a cool time for you to be in the business that we're in right now. So tell me a little bit about your trajectory into this privacy tech arena and what you're doing with your business right now.

 

Debra Farber  02:00

Sure. So I've been in privacy and directly working to operationalize privacy since 2005, so it's been about 16 years now. And I have seen the industry go from super small to incredibly large, to breaking out into different, I guess, subsections, right. So I am now focusing on privacy tech, and when I say privacy tech, I mean any tech that will solve a privacy problem that an organization has. You mentioned reg tech, and I think that has been where we see the beginnings of privacy being addressed, because there's a market need created by regulation. Companies needed to have a story that they were complying with this new regulation, and so you had to map to regulation. What I'm committing myself to now is really helping to broaden the perspective of the technology that can solve privacy problems, and really help that ecosystem thrive. I'm an advisor for The Rise of Privacy Tech, which brings together investors, privacy tech founders, and privacy experts who could be advisors, to help the privacy tech ecosystem. And that has not really been addressed before; the problems have been centered around how we can help your Chief Privacy Officer achieve compliance and then demonstrate compliance and accountability. Whereas now there are tools being developed that will help developers embed privacy into their tech stack, right, tools for what we might call dev-priv-ops, or privacy operations. There are people developing tools for the sharing of data outside of organizations. And instead of just saying, let's put the hammer down with encryption, which is a great security capability that enforces the privacy rules.
But we're now saying, hey, maybe you don't need to use a hammer; you want to use a scalpel, so that you could use differential privacy to actually derive insights from the data set that you have collected while preserving privacy, or maybe you'd want to use homomorphic encryption, which is deriving insights from encrypted data. Right. So there's all of this new technology that's helping us meet privacy problems. And that's really where I'm focusing my career now with Principled LLC: to help privacy tech companies understand their market, understand who the buyer is for their tech so that they can appropriately sell to those C-level executives, and then also understand what's needed when there are a thousand of these tools in the market already. How do you differentiate your privacy tech company from another so that they don't all sound the same, right? Because you go to the RSA conference for security, you could go to Black Hat, and you see the same buzzwords on all the booths: a regulation you're trying to comply with, GDPR, AI, just different buzzwords of the moment, accountability, compliance, whatever. But there's not one company that can address all of the privacy problems. So I help them with the clarity of their messaging, so that it really resonates with the stakeholders. Yeah.
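The hammer-versus-scalpel distinction above can be made concrete. Here is a minimal sketch of the differential-privacy idea, not any particular vendor's implementation: a counting query changes by at most 1 when any one person's record is added or removed (sensitivity 1), so adding Laplace noise with scale 1/ε gives an ε-differentially-private answer. The data set, predicate, and ε value below are all illustrative assumptions.

```python
import random

def laplace_noise(scale: float) -> float:
    """Sample Laplace(0, scale): the difference of two exponentials
    with mean `scale` is Laplace-distributed with that scale."""
    lam = 1.0 / scale
    return random.expovariate(lam) - random.expovariate(lam)

def dp_count(records, predicate, epsilon: float) -> float:
    """Epsilon-DP count: a counting query has sensitivity 1,
    so Laplace noise with scale 1/epsilon suffices."""
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

ages = [34, 51, 29, 62, 47, 38]
noisy = dp_count(ages, lambda a: a > 40, epsilon=0.5)
print(noisy)  # a noisy answer near the true count of 3
```

The insight survives (roughly three people are over 40) while no single record can be confidently inferred from the answer; smaller ε means more noise and stronger privacy.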

 

Debbie Reynolds  05:50

So I love that, and I want to talk about that. But here are the things I don't want to talk about, so let's throw these out of the privacy-enhancing tech bucket. Let's throw out, right now, people that have some other tech and just added data subject access requests to it; I'm throwing them out of the bucket right now. And then I'm also throwing out companies that strictly focus on what I call paper promises, the kind of "what we do" story, like you're talking about; let's cut out the story people. I'm really interested in what's happening in the area that you're talking about, the companies that touch data. So what is going on in this bucket of companies that touch data? What's happening right now?

 

Debra Farber  06:43

A lot, a lot is happening. And so I'm really glad to have this conversation, because, even focusing myself on this area of innovation, I didn't realize the breadth of companies that are actually trying to solve privacy problems right now. I'll tell the story in a second about how I discovered that. But first, to answer your question, a lot is happening, because what we're seeing is that privacy is moving beyond regulation. We're used to saying, oh, the regulations say that you need to make changes to your business, right, you now have obligations, because the granting of data protection rights to individuals in the United States and the EU and all around the world is proliferating through GDPR-like laws, and these grants of rights become obligations on companies to facilitate those rights. So the past few years have seen technology that's just allowing companies, and you don't really want to talk about this, I get it, allowing companies to just demonstrate that they have processes that meet these compliance requirements, right. We are now seeing that the market is changing itself, and there are market drivers beyond the regulation. Apple's firm stance on privacy and its decision to eliminate third-party cookies is a real game-changer, right? I mean, for the entire advertising ecosystem. Privacy advocates like myself have, for many years, been talking about how the ad industry is almost a black box, and we don't necessarily have a lot of fidelity as to what data is being collected, what is known about individuals, and whether the ad companies are using that data in ways that serve humanity versus serving their financials. Right. So that was a hard fight to fight at the beginning, right?
When Google was new, when Web 2.0 was kind of stretching its legs and trying to build new services for the web, right. But the web was built to facilitate sharing, not to facilitate security, and not to facilitate a nuanced control over how data about individuals is used. So that's proliferated a lot of the privacy problems we see today. What I'm seeing is that, because of the regulation, because of the consistent calls for privacy by design, because of the calls for privacy engineering, Big Tech has said, well, we need some engineers to address the privacy problems in our companies. Many of those engineers answered the call, and they learned. Maybe they came in from the security space, maybe they're engineers from the data analysis space, but they've gotten curious about what it means to help protect privacy and how we do that. And what I've seen is a lot of them leave Big Tech, or wherever they were working on their PhDs or whatnot, and they see there's an opportunity, and they're starting their own privacy tech companies. They might only know their own area well, and not the whole privacy ecosystem, right, or even who all the buyers are, but they understand the market need for certain tech. And so I had no idea how many were truly working on these problems until I posted publicly, on Twitter, on LinkedIn, on all the social media that I could, about the perils of Clubhouse, the social media app that rose to wide acclaim for many but did not have its privacy and security house in order when it launched, during the pandemic times, right. And so, for those who don't know, Clubhouse is a voice, an audio-based app where you can have conversations, almost like the old party lines of the past.
And it could be great; I think now you can start to pay creators and such. But I was trying to warn people that it's not safe yet to use Clubhouse, because of the way they deployed their marketing, and because, in my opinion, they built a great feature without a platform underneath it that was safe, with privacy and security and accessibility built in. And so when I went on my screed, and maybe it was because it was the end of the pandemic and I had real feelings that I put out there, they might have seemed harsh, but I truly believed in them, I had just an outpouring of tech founders saying thank you, this is what we've been saying; we want to build privacy by design. So how come Andreessen Horowitz is giving $100 million to this company that did not build a minimum viable product, right? They didn't build for the regulations in a global marketplace, and yet they put this into the stream of commerce. So once I realized that, I was like, wow. And I got a lot of requests: can you help me with my marketing? Will you be my advisor? And I'm like, this is an opportunity for me to really get in on the ground floor of the emergence of privacy tech, for kind of the planning of Web 3.0, even when you start introducing a lot of the blockchain, hashgraph, distributed ledger technology stuff that I'm also getting into as well. So there's a lot going on, to answer your question, and I think there's a lot more movement than just regulation alone.

 

Debbie Reynolds  12:37

Absolutely. And I actually like what Apple is doing with some of their privacy features. They're doing a lot of things to impact industry all over, but the thing that they're showing, or demonstrating, is that privacy can be profitable. If you respect someone's privacy, you can get more customers; or, the flip side, if you disrespect someone's privacy, you can lose customers. So I think that has more of a bottom-line impact, and it probably resonates more than, oh my God, we have this regulation. People think of privacy as a tax, when it can actually help you gain business. So what are your thoughts about that?

 

Debra Farber  13:30

In fact, I've been following Cisco's privacy report; I think they put it out on Data Privacy Day every January, based on polling a lot of CEOs and such about how privacy is helping your organization, right. And they have all these great stats about how your sales cycle is shortened astronomically if you have good privacy positioning, and that even if you move your privacy maturity from completely ad hoc just one notch over and slightly improve your privacy posture, you can shorten the sales cycle significantly. So it makes sense to put your dollars towards these privacy problems; it will actually help you with profitability and getting more revenue and such. So the value exchange of data is really important. And now I hear Lourdes Turrecha in my ear, going, it's important that consumers have control over how data about them is used. She focuses privacy around one's ability to control their data, as many do. These are privacy experts in my brain, constantly there, not like little devils and angels, but a chorus of privacy expert supporters in my head, and she's going to talk about control.

 

Debbie Reynolds  13:31

I love her, by the way.

 

Debra Farber  15:02

Oh, yeah, they're both wonderful. I mean, they're great.

 

Debbie Reynolds  15:05

One thing I would love your perspective on, and I don't know, maybe this is inside baseball: when I'm working with companies in this arena, one of the early questions I ask is who their competitors are. And I don't know if you find it this way too, but I'm often like, these companies you say are your competitors are actually not your competitors; you don't actually do the same thing. So I'm wondering, when you're working with companies on market positioning in this area, do you feel like there's confusion there?

 

Debra Farber  15:43

I absolutely do feel there's confusion for companies that are trying to compete in the whole marketplace. In some ways, if you're trying to compete for investment dollars, you might view any privacy tech company as a competitor, because maybe you're trying to reach the same investors, whether it's angels or VCs or whatnot, and they only have a limited amount of money that they're planning to spend in privacy tech, potentially. In that way, maybe you feel like they're all competitors. In the other respect, I honestly don't think that's the best way to view it. I feel like we should come together as an ecosystem and understand each other's nuances and what we're building. For instance, a good example: I'm an advisor for a company that is working on, well, I guess this is not deep tech, but DSARs and data mapping, right; there are lots of different data mapping companies. And I'm only using this because there are so many of them. I know this was not the area of tech we wanted to talk about, but because there are so many companies endeavoring to help in this space, it is hard for them to differentiate from one another. So maybe the tech is very similar, but the market that they're trying to reach is different, right? Enterprise versus startups have completely different needs for DSARs. And you mentioned I worked at BigID in the past; well, they're an enterprise solution, they have been an enterprise solution. Now they're kind of getting into some other markets with some products, but when I was there, they were an enterprise solution, and they were overkill for anybody who was even a medium-sized business, right? It would have been too expensive; it would have been a lot of bells and whistles that wouldn't be used by a medium-sized company.
So they knew their lane; they knew enterprise was their thing, and that's what they focused on. And then we had tons of small and medium-sized businesses going, help, who do I go to to help automate my data mapping and DSARs? And people heard the call, so a lot of different companies are attacking that challenge in different ways. I think you could be a competitor and have completely different tech, you could be a competitor and have very similar tech, or you could even be not a competitor and have similar tech that just aims to do different things. When I say that, I mean, I think of a bunch of eDiscovery firms thinking, how can we take our technology and just repurpose it for discovering personal data, instead of just legal documents for litigation? How can we repurpose our eDiscovery tool for privacy? And I'm not entirely sure about the successes there, but it is not outlandish to think that you would repurpose technology that can be used for something else. So I just think it comes down to transparency, and not just transparency for your customers from a privacy perspective. If you want your buyers to really get why you as a company are differentiated, and maybe better suited to them, your marketing is really going to be the best place to do that. Maybe it's through use cases, or maybe it's through getting privacy experts, an advisor or consultant, to help you map the market outcomes you want, and then try to work backward from that. When I worked at Amazon, everything was about working backward from the customer, right? You get from, this is where we want to be and what the customer needs, and then kind of work backward. I think the marketing teams need to be really working with experts in privacy.
Even if they came from an adjacent field like security, you really need people who know the nuances and can speak the terms of art from law, risk management, or data governance. They all have very specific terms of art, and if you speak them incorrectly in your marketing, and I know I want to hear your thoughts on this, how does that make you feel when you read it? Because, for me, I just have to dismiss it: they don't know what they're talking about; they can't articulate the problem or the solution well, right?

 

Debbie Reynolds  20:00

Oh, yeah, definitely, I work with people a lot on this. A lot of times, especially when you're working with companies that are developing technology, they're thinking inside out instead of outside in. They're thinking, oh, I'm developing this cool thing, and it can do all this XYZ, and that may sound like gibberish to a customer who wants to buy; they don't really understand how they can connect their problem to what your tool can do. So that comes up a lot, I think, and it comes up in a lot of different tech areas with software; it's kind of a traditional problem. But I think it's especially the case in privacy, where these are problems that almost any company can have, and companies trying to wade through all the different tech solutions to figure out what's best for them, I think that's extremely hard to do. Yet the ecosystem is just growing right now, right? We don't have that breadth of analysts and insight into the entire ecosystem. So that's one of the things we're working on at The Rise of Privacy Tech: kind of defining what that looks like. What's the value in the market that each of these technologies is trying to achieve? How do you position that, and what are some of the elements of the tech stack that you would need to achieve adoption of your technology? And I'm focusing a lot on B2B; I find it a lot more fun, mostly because that's where my expertise is. But there's so much on the consumer side, too, so I don't want to make it seem like it's all business all the time. There's a lot of consumer-related privacy tech giving transparency, through APIs and whatnot, into how data is being used by third parties. There's a lot out there in the B2C space as well.
Yeah, right. What I find when I'm working with companies that want to bring on some type of privacy-enhancing tech, well, two things: a lot of times they don't know the question to ask, right? So they're asking the wrong question, and then maybe they're dazzled by some marketing or whatever. So they get a tool, and it doesn't really do what they thought it would do, or it doesn't solve their problem, because they really didn't define the question. And then what I see is that these companies end up boomeranging back into the market, trying to find something else, because organizations have a vast array of very different privacy issues, right. As you said earlier, sometimes you don't need a hammer, you need a scalpel. So can they find the scalpel, the tool that offers the sliver of thing that they need right then, or that they can grow with, and different things like that? What are your thoughts about that?

 

Debra Farber  23:29

It's a really great question, and a complex question, because there's a lot to say, so I'm going to try to keep it all straight while I talk about it. First, you're absolutely right: privacy is a business problem first, right? You can't solve anything with tech if you don't know what the problem is and define it. At first, it was being defined by, there's a regulation, someone needs to address this, we need to comply. And there were some platforms that emerged at the beginning through social proofing: this big company is using the platform, so I'm going to get the platform, so that everybody can say the same thing. It almost feels like a safe harbor: hey, you can't say I'm not using the right compliance tool, because all of my CPO friends at these other large companies are using the same tool. So they kind of all rose together, and those platforms have evolved in a certain way to try to address everything privacy. And I think it would be really, really difficult for any company to address all things privacy at once. I just had this conversation this morning on a working group call, and Michelle Dennedy said the exact same thing. I'm really quoting her now, so I want to give her credit: she was at Cisco and McAfee and Sun, and, like, security companies did this; they bought up other security companies to become a behemoth of a security company, right, Symantec, whatever. And it didn't improve, well, maybe it improved their profile and some financials, but it didn't improve the value for their customers. At times, it felt siloed. And I worked at IBM, which was a giant, giant company made of other companies that were purchased, and it felt siloed like that, right. So I honestly don't think that one company can rule them all.
And I don't think they should. Actually, if you're endeavoring to answer all privacy problems, it kind of shows you're not deep enough into any of those privacy problems. So what I think is that, obviously, companies need privacy experts; your CPOs and their teams need tools, they can't just do everything by hand, and they don't have an unlimited budget, we get that. It's even hard to get a budget for privacy at times. So I think what's important is to not only research the privacy problem, but also research things you could try to fix without tech. Like, maybe we should just stop sharing the data with this or that party, because it's causing the problems. Or maybe if we shift left and, I don't know, classify our data differently when we take it in, that'll prevent a whole bunch of problems through the data lifecycle that we have to manage, right. We wouldn't have all these compliance problems on the back end for data deletion if we had certain rules at the beginning that said that once data is ingested, it is deleted after a certain date, or in a certain way, or whatnot, right? So I'm combining a lot of solutions here, but you don't always need tech, right? Sometimes you just stop doing that thing. Sometimes it's trade-offs: it's going to cost us X to bring in some tech to, I don't know, de-risk the data in some way, obfuscate it, delete it, de-risk it in some way.
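The "rule at the beginning" idea, deleting ingested data after a certain date, can be pictured as stamping every record with a delete-by timestamp at ingestion, so downstream cleanup is automatic rather than a compliance scramble. This is a hypothetical sketch: the categories, field names, and retention periods are illustrative assumptions, not anyone's actual policy.

```python
from datetime import datetime, timedelta, timezone

# Illustrative retention periods per data category (assumptions, not policy)
RETENTION = {
    "telemetry": timedelta(days=30),
    "billing": timedelta(days=365 * 7),
    "marketing": timedelta(days=90),
}

def ingest(record: dict, category: str) -> dict:
    """Stamp a record with its category and a delete-by date at ingestion,
    so deletion is decided up front instead of litigated later."""
    record["_category"] = category
    record["_delete_after"] = datetime.now(timezone.utc) + RETENTION[category]
    return record

def purge(store: list) -> list:
    """Keep only records whose delete-by date has not yet passed."""
    now = datetime.now(timezone.utc)
    return [r for r in store if r["_delete_after"] > now]

store = [ingest({"user": "u1", "page": "/home"}, "telemetry")]
print(len(purge(store)))  # prints 1: still within its 30-day window
```

The point of shifting left like this is that the retention decision is made once, at intake, and every later system can rely on the stamp instead of guessing.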

 

Debbie Reynolds  27:02

No, no, I know exactly what you're trying to say.

 

Debra Farber  27:05

It's just, it's not always about the tech; you really want to understand, because there are going to be trade-offs. Before you buy the tech, you're going to have to go to somebody who's going to give you that money, your CFO, whomever, and say, here's what I'd like to do, here's what we need to do; this is what we're doing today that can't keep happening, because this is the financial burden if we get in trouble for it. Or, we need to do right by our customers, and that should be the first one: we need to do right by the humans that trust us, or we'll lose that trust.

 

Debbie Reynolds  27:37

That's right. I was thinking about this earlier today, and I'm so happy you brought it up; it's so funny that you did. Let's talk about the low-tech stuff in privacy, like, we collect less stuff. How about that? Put less data into the hopper to begin with, and that will reduce your privacy risk, right?

 

Debra Farber  28:00

Yeah, I think we're coming to the end of "collect everything, we might have a use for it later," right? I mean, we're starting to realize that having to put privacy around big data and big data collection is a huge hurdle for a lot of companies. So maybe it's, don't do that, or have a good rationale for why you're collecting the data in the first place, so you only have to manage, and put guardrails around, the data that's absolutely necessary for what you're trying to achieve in your business.

 

Debbie Reynolds  28:33

Yeah, I think one of the interesting things about privacy, and I've actually been happy to see it developing this way, is how it talks about data retention. A lot of times, when businesses think about data retention, they're thinking about some statutory thing, like you keep tax records a certain number of years, or whatever. But the cool thing that I like about privacy is that a lot of these laws say, keep the data only as long as it is needed for a business process. So it's creating a situation where businesses need to have a trigger, right, for when that stuff is no longer needed. The problem in the past has been that there have been no particular rules saying you have to delete that stuff, so a lot of companies kept it. And I've always been concerned about the legacy data that companies keep: data that isn't utilized as it may have been before, that sits in a back room somewhere, on a server, in the cloud, that no one knows about, especially with a lot of the breaches that are happening right now. That data tends to be lower value than some other data, but it's some of the highest risk for companies, because a lot of times, maybe only Jasper down the hall knows what's on that server and no one else. But what are your thoughts about data retention, and the privacy risks with it?

 

Debra Farber  30:11

Yeah, no, you're absolutely right. And I guess I have two thoughts on this. One, it's a challenge, but you can become a leader in the field by putting your stake in the ground and saying, this is what I think is an appropriate amount of time for this activity. A good example of that is back when Google was kind of new on the scene and search was its main product, and privacy was being discussed, they decided: how long should you keep search records? How long is enough time before you delete them for search purposes? I don't exactly know the inner workings of how they came up with this number, but they said six months. And so that became the de facto industry standard; people followed Google. And if anyone deviates from that, anyone does it longer, it's kind of like, why are you doing that? That's not industry best practice. So best practices are definitely a thing that can be helpful for company governance. It's not the law, right, but it is something you can point to and be like, this is the industry standard, because it was created by a company that was really in this space, they did the calculation, and then everyone followed. Now, I don't really see that type of data retention best practice across all use cases, right, so there's definitely a challenge. The second thing I want to bring up is that I have seen something that does help teams, at small individual levels, rethink: do I really need to keep this data, or collect it in the first place, or persist it for longer than 30 days or something like that? That's basically when you want to onboard a team to a data deletion system, which may be a homegrown system within a company, right, or tech that you've purchased, and you want them to onboard to the tech to allow automated deletion over time.
You can tell them, hey, as a privacy practitioner, if you store this data longer than X number of days, then you're going to have to onboard to the system; if you only persist it for, like, a day or something like that, you don't. It's amazing how quickly they re-architect their business process around how long they need that data. Oh, okay, we were going to keep it indefinitely, forever, but now that you say that, we're just going to keep it for one day. But that's going to be different based on your use case, right? If your use case is, I want the most relevant content shown to the individual signing into my company's service, one day might be enough. If it's something that needs more longitudinal data, where over time you build some sort of personalized thing, maybe it's not. But you've got to create some tension within the processes to get people to rethink that. And I think that's the hardest part, right? Changing corporate behavior, I have found as a privacy practitioner, is the hardest thing to do from inside a company; it's changing culture.
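The threshold rule described above, "persist longer than X days and you must onboard to the deletion system," is essentially a simple policy gate that creates that healthy tension. A sketch under assumed names and a hypothetical 30-day threshold:

```python
from dataclasses import dataclass

# Hypothetical threshold: persisting data beyond this many days
# triggers mandatory onboarding to the automated deletion system.
ONBOARDING_THRESHOLD_DAYS = 30

@dataclass
class DataStore:
    name: str
    retention_days: int

def must_onboard(store: DataStore) -> bool:
    """A store that keeps data longer than the threshold must wire up
    automated deletion; short-lived stores are exempt."""
    return store.retention_days > ONBOARDING_THRESHOLD_DAYS

stores = [
    DataStore("session-cache", retention_days=1),
    DataStore("clickstream-archive", retention_days=3650),
]
for s in stores:
    print(s.name, "-> onboard" if must_onboard(s) else "-> exempt")
```

Because onboarding is work, a rule like this gives teams a concrete incentive to justify, or shorten, how long they actually need the data.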

 

Debbie Reynolds  33:20

Oh, absolutely. Absolutely. Obviously, younger and newer companies don't have the same data retention problems, I feel. But some of these older legacy companies, and this is audio, so you can't see me, but I wish you could hear my eyes rolling about this whole thing, it's extremely, extraordinarily hard to get people to delete stuff or decommission things, because the older it gets, the less corporate knowledge there is within the organization about what that data is, and a lot of people are afraid to delete stuff. So it really is incumbent upon the business, I think, on a going-forward basis, to really look at that data retention thing. A lot of times, I always say, people treat it like an abandoned amusement park, right? They use the data for their business purpose, and then it just stays there, or you throw it into another system that may be less expensive, or you throw it in the cloud or whatever, instead of being able to tie the bow at the end of that business process. I think it's a problem that's existed for a really long time, and I'm hoping that companies can get more of a handle on it as a result of privacy, and also to lower their cyber risk.

 

Debra Farber  34:55

Yeah. And to that point, it's a huge, huge problem for really old companies, right? They've got some random person somewhere who is able to code in that ancient language, and they're paying through the nose for that person to keep coding in it. The older your tech gets, the more risk there's going to be, for exactly that reason: no one knows who's maintaining the datastore the data is sitting in. In one respect, it might actually be more secure, in that maybe it's not connected to all the other networks and is harder for a criminal hacker to get to. But what I'm seeing there is that with some solutions, like data discovery tools, you can create a connector to a lot of applications, but some of them require an API to connect to, which might not exist in some of these legacy systems.

 

Debbie Reynolds  35:55

So, I don't know more than that; I can't quantify how big the legacy system problem is, but it is a big challenge. And I don't know what the answer is: do you migrate it over to newer tech and then take action on the data somehow, or do you just leave it there and persist it forever with that risk, because you don't know what else to do? That's what I'm seeing a lot more of. It's kind of like, don't look at the elephant in the room. We know it's a problem, but we don't know how to fix it. If you have ideas, let us know; we'll fix it. So yeah, it's a problem. So, if it were the world according to Debra, and we would do whatever you said, what would be your wish for privacy in tech and law anywhere in the world?

 

Debra Farber  36:49

Well, that is a broad stroke. I don't want to be super prescriptive, but I guess we need to remember that privacy is about protecting people, right? Even privacy professionals can get away from this when their mandates are coming from companies, and I would implore them to constantly re-communicate that this is not just about de-risking for companies so that the company has less risk; this is about de-risking and preventing the infringement of the privacy of individuals. And if you lose those individuals' and your customers' trust, it'll be very difficult to get it back, if you're lucky enough to get it back at all. So the more you can get away from talking about compliance with regulations with your customers, and the more you talk about how you're addressing privacy and trust, and transparency, and things customers actually understand, things my mom can understand, who has no real knowledge of tech, then I think we could start to really have the conversation with the larger community, and not just amongst ourselves and with big businesses about how we can reduce our liability. So yeah, that's basically it: addressing privacy should not just be addressing corporate liability. Yes, that is important, and that's why companies will throw money and resources at the problem. But in the end, you're only going to be rewarded for your privacy posture and strategy, and for actually protecting individuals. So that would be my overall big wish. I'm sure I have plenty of others, like the country of Debrastan.

 

Debbie Reynolds  38:56

That's right. As I think things should be. Well, that's a great one. I think that's a foundational one, right? I tell people that the data the individual gives to you is not your data; it belongs to that person. It's almost like putting money in the bank: if you called your bank and said, I want to look at my balance, and they said, well, we can't show it to you because we don't know where your money is, you'd go ballistic. Yet people think that if a person gives them data, they can do whatever they want with it. So being able to have the idea that this data is not yours, and that you're a steward of it, I think is really important.

 

Debra Farber  39:39

I agree. And it's funny, because, well, technically, the conversation around rights to data is not as defined as what you just said. I mean, maybe it should be that way. I'm not saying it should be a property right, although I came into privacy by taking privacy law with Professor Paul Schwartz, who's at UC Berkeley now; before that, I took copyright law from him, and I really enjoyed it. So privacy to me seemed like another intangible property right; those areas overlap. But there are reasons we have not gone the property route with privacy that I won't get into right now; there are just some challenges where it kind of breaks down. But I do want to address two things around the banking concept you just brought up. One: should there be some sort of fiduciary responsibility for collecting data? That's been brought up, and it's a great question. The jury's still out; it's something we should explore, right? And the other is that the very use case you're explaining, where it's mine, I'm putting it in the bank, you're a steward of it, and you shouldn't do anything with it without my permission, well, a lot of what Web 3.0 technology, blockchains, and distributed ledger technology is bringing is aligned with that very thought process. And I just dived into that space. There are a lot of subcultures there, from the anarchists who think everything should be decentralized and nobody should be in charge of anything, to the tech businesses saying, here are ways you can have an architecture that allows for incredibly quick, fair finality in transactions, where maybe the party you're transacting with is completely unknown, but you're solving real-world company privacy and security problems.
That's kind of where I am in that space, as opposed to the crypto-anarchists, the ones who don't want anyone in control of anything.

 

Debbie Reynolds  41:58

Yeah, well, this is fascinating. Thank you so much. I loved this conversation; it was a lot of fun. And I know the audience would really love to hear about this. We'll have to talk some other time about blockchain and all the wackiness going on with some of the other technologies that are out there.

 

Debra Farber  42:17

Absolutely. It is, in many ways, a wild west. And it's also kind of difficult to understand what's going on there, because you have to read all the white papers from the different blockchain companies to understand what they're working on. And then you have to have enough knowledge of all the various spaces, from finance to tokenomics, to really weed through what is a good use case versus what is just trying to raise money to have a product that isn't going to be very useful.

 

Debbie Reynolds  42:51

Yeah, exactly. Well, thank you so much. This was so much fun. I'm so happy that you agreed to be on the show.

 

Debra Farber  43:01

It was my pleasure. Thanks for inviting me. Nice to have two Debs on the show together. Double Debs! That's right. All right, I'll talk to you soon. All right. Thank you, Debbie.
