E287 - Vibeke Specht, Author and Co-Founder at Peak Privacy
[00:00] Debbie Reynolds: The personal views expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.
[00:11] Hello, my name is Debbie Reynolds. They call me the Data Diva. This is the Data Diva Talks Privacy podcast where we discuss data privacy issues with industry leaders around the world with information that businesses need to know.
[00:24] Now I have a very special guest all the way from Copenhagen, Vibeke Specht.
She is the co-founder of Peak Privacy and the author of From GDPR Confusion to Privacy First Marketing. Welcome.
[00:43] Vibeke Specht: Thank you, Debbie. Thank you so much. Thanks for having me.
[00:47] Debbie Reynolds: Yeah. Well, this is exciting. This is our second collaboration indeed.
[00:53] Vibeke Specht: Yeah. How long has it been? Was it two years ago now? Three years?
[00:57] Debbie Reynolds: Maybe two or three years that we collaborated together. So you all had asked me to come to do a speaking thing with you all, which was incredibly fun.
[01:07] I love to talk about marketing and I love to talk about the way companies use data in marketing because for me, especially when GDPR came out,
[01:19] the marketing folks, they were like the vanguard, right? So they got hit with all the GDPR.
[01:26] Vibeke Specht: Yeah.
[01:26] Debbie Reynolds: But the cookie madness started with the marketing department and I felt like,
[01:31] unfortunately, the marketing folks felt the brunt of GDPR before a lot of other people did.
[01:38] But yeah, I would love for you to tell me more about your background. I know that you're incredibly passionate about data protection and having people be protected when they're using services and products.
[01:53] And you're also very well steeped in how companies can do marketing and do it in a way that helps their brand, but then also doesn't harm people. Tell me your background.
[02:07] Vibeke Specht: First of all, thank you for having me. It's super, super fun to be here. It's a great honor.
[02:13] So my background: when I started, it was a little bit of an accident that I fell into the data protection rabbit hole.
[02:24] Way, way back, more than 25 years ago, I studied political science and history, and I've been a journalist and editorial writer.
[02:36] But then I ventured into marketing, and I helped companies with communication, marketing and branding.
[02:45] And I profiled myself as helping companies that were good with their CSR work, corporate social responsibility.
[02:54] Then I started working at this logistics company, and we were working on this really exciting project regarding the Internet of logistics. So that was the first time I started learning about how the Internet worked.
[03:09] And to think about the Internet,
[03:12] and IoT devices, you could say.
[03:15] What we were trying to do there was: how do you create a system where everything is an address and everything that moves is a carrier?
[03:24] I was not aware of data protection then.
[03:27] So it was this whole question of: how do you do things as seamlessly and efficiently as possible? And how can you get anything you want merely by thinking it, ultimately?
[03:38] But we were talking a lot about circular logistics back then, so that goods don't just arrive as products; you also want them to be recycled and given a new life.
[03:54] We were also trying to figure out how to leverage this kind of circular logistics to fight food waste,
[04:04] because that's a huge environmental problem, and financial waste too.
[04:09] So that's where I really started to think about the Internet, the Web, and what's the difference between the World Wide Web and the Internet.
[04:19] And packets, right? Because you talk about packets in zeros and ones too. And after that,
[04:26] I headed out to work at this company.
[04:29] That's where I met you. It was a consent management platform, right? So that's when I started thinking about cookies.
[04:37] What's a cookie?
[04:39] And what's the deal with the GDPR, actually? Because before that there was the ePrivacy Directive here in Europe. And I noticed that a lot of the people that were our target group,
[04:52] the marketers that needed to understand these issues,
[04:57] their angle was: how do you have your cake and eat it too? While the lawyers were like, how does the Internet even work?
[05:06] So I became this kind of API,
[05:10] trying to bridge these gaps, and simultaneously trying to explain things to myself.
[05:17] And then while I was doing that, creating all this content and digging deeper and deeper into what was what.
[05:24] I ended up writing a book, which you so kindly introduced. And what I found fascinating, where I felt like I actually went full circle, was that I came back to my roots: the social science,
[05:39] the care for democracy, the importance of institutions and the rule of law, and how everyone is part of that. And a fundamental understanding of how the General Data Protection Regulation is anchored in the fundamental human rights that are codified in Europe in the Charter.
[06:01] Privacy is one very important right,
[06:03] because we had the war.
[06:06] We knew what happened when you were able to track people.
[06:10] And so we had these systems in Europe where every government had these registers of people, right? And the better those registers were, the easier it was for the **** regime
[06:23] to basically find people and track them down.
[06:27] And so for me it was like, oh, wow,
[06:31] why are we not talking more about that? Because to me, it seemed like
[06:36] the GDPR was discussed only in a very practical, tedious way, and it was seen as something that only gives you more work and bureaucracy.
[06:47] And today there's also this discourse about how it prevents innovation.
[06:53] So I was really fascinated: why hadn't I thought about this before? And also, this is extremely important, because the tools we have today, the Internet and the web, are so much more powerful in how they can enable the tracking and profiling of people.
[07:14] And this in turn taps into the whole discussion of what kind of a marketplace we want the Web to be.
[07:23] Is it a healthy marketplace today?
[07:26] Do we have good market diversity and competition, or are we in a bit of a gnarly place where we have very dominant actors that are actually setting the rules for how the web works today?
[07:43] So these are
[07:44] deeply fascinating subjects. And now, when we have a lot of turbulence here in the EU with the Omnibus discourse,
[07:53] all these fundamental topics are again on the table,
[07:58] even though there are a lot of interests in this discussion that don't want to lift it to that perspective. They want to trivialize it and
[08:07] say, oh, it's just these cookies and cookie banners, and that's so tedious, and who cares? It's ridiculous.
[08:14] But that's such a superficial discourse, and I think we really deserve to be able to discuss this in a much more profound way.
[08:25] And yeah, so that's a very long answer on my background, where I come from.
[08:31] Debbie Reynolds: Yeah,
[08:32] that's a great background. And thank you for sharing your thinking there.
[08:38] I think it's true. So when
[08:42] GDPR first came into enforcement, I was on a television show, actually.
[08:48] I was asked to be there to talk about GDPR and why it was important.
[08:53] And I only had like eight minutes, so I had to say, hey, you know, think about fundamental rights in **** Germany and how the European Union thinks differently around data,
[09:04] personal freedoms, and data. It's very different than in the US. And so at that point I started doing a juxtaposition.
[09:13] People really couldn't tell the difference between what was happening in Europe versus what's happening in the US, and it's very different. It's diametrically opposed, frankly, in my view. So I think in Europe,
[09:27] because our histories are different and this is very important,
[09:31] people think about privacy and data protection differently in different places based on their culture. So in Europe, it was more like: too much tracking diminishes someone's fundamental rights.
[09:47] Where in the US we're like, well, we need to gather more information.
[09:51] So it's very, very different. And then also, to me, there are two layers,
[09:58] not technical issues, but two layers to what happens with data.
[10:03] And the problem, I think, and I want your thoughts since you're the expert in this.
[10:08] The problem, I think is that there are two avenues that are trying to run parallel with data.
[10:14] One is that obviously people who have products or services want to sell those products and services to people, right? That's one thing. And then there's this underbelly, and this is the problem: the data that we give, or the data that we emit in the digital space, can be used and abused for different purposes
[10:38] that have nothing to do with selling a product.
[10:41] But I want your thoughts.
[10:42] Vibeke Specht: If I understand you correctly, you're talking about how
[10:46] companies want to be able to sell their products, and collecting data, first party data for example, is one thing. And then you have the whole ad tech ecosystem, where you are able to target the people who are more likely to buy your products with ads, right?
[11:06] And that, that's like,
[11:09] that's an understandable business interest to be able to do that.
[11:15] The trick is to do that with transparency and respect, and to have an ad tech market that is
[11:24] healthy and works, that respects
[11:29] the individual's rights to their own data.
[11:33] You know, so you could go to any vendor or any company and ask: what did you do with my data? Can I get it back?
[11:39] Can you delete it?
[11:42] So
[11:43] I'm not saying ad tech is fundamentally bad, because there are different types of ad tech. But this business model grew in parallel with how the Internet came to be. All these Netscape guys back in the early 90s,
[12:03] they wanted the web to like become everything it could possibly become,
[12:08] right?
[12:09] And it was interesting, because they were really aware of how dangerous it could be if you were able to track people on the net. So they were pushing back against tracking.
[12:19] They invented the session cookie, right?
[12:23] So the site could remember you, to make things a little bit less Alzheimer's. But then the third party cookie came, right? So you could track people from site to site.
[12:36] And what the Netscape guys were thinking then, at least what I've heard in interviews afterwards, was: either we can't stop this, so we just let things be and let the third party cookie roam freely; or we make it impossible for it to work in our browser; or we give the user a choice.
[12:59] And so they went with the choice. So that's how I understand why we still have this on a browser level: you can go in and do some settings. It's not easy, it's not intuitive.
[13:09] People don't do it. In a lot of ways, we're still discussing this, right? How can we get rid of the cookie banner and just put it on a browser level?
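As an aside, the session-versus-tracking cookie distinction described here can be sketched with Python's standard library. This is a generic illustration, not tied to any platform mentioned in the conversation: a session cookie carries no expiry and dies with the browser session, while a third-party-style tracking cookie is long-lived and marked so browsers will send it in cross-site contexts.

```python
from http.cookies import SimpleCookie

# A session cookie: no Max-Age or Expires, so the browser discards it
# when the session ends. This is the "site remembers you" cookie the
# Netscape engineers introduced.
session = SimpleCookie()
session["sid"] = "abc123"
session["sid"]["path"] = "/"

# A third-party-style tracking cookie: long-lived, with SameSite=None
# (which modern browsers require to be paired with Secure) so it is
# sent in cross-site contexts, enabling site-to-site tracking.
tracker = SimpleCookie()
tracker["uid"] = "user-42"
tracker["uid"]["max-age"] = 60 * 60 * 24 * 365  # one year
tracker["uid"]["samesite"] = "None"
tracker["uid"]["secure"] = True

print(session.output())  # no expiry attributes at all
print(tracker.output())
```

The browser-level controls the conversation mentions work by refusing to store or send cookies like the second one when they arrive from a third-party context.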
[13:17] But all the ad tech players are really afraid of that. They don't want that.
[13:21] And then we have this whole long saga where Google tried to deprecate the third party cookie by inventing all sorts of different types of cakes.
[13:30] And it's a really hard conundrum, because at the core of this is their fundamental business model: to collect data and enable tracking and personalized advertising.
[13:44] That is Google's core business model; 75 to 80% of their income is ad tech. For Meta, it's 97, 98%.
[13:54] And so while we saw the Internet grow, these businesses, Google and Meta and all these other companies,
[14:04] they're all based on this, right? Because there were no rules against it; there was
[14:11] no federal privacy regulation in the US, and none on the state level. So it was free harvesting of data.
[14:19] And so the friction
[14:21] and the discussions have come later,
[14:23] and not just today; it started like 10 years ago.
[14:28] We've had these stories, you know,
[14:30] like the Target one,
[14:32] when the dad found out that his teenage daughter was pregnant; Target knew before he did,
[14:39] kind of. And people were like, oh dear, this is not good. Target had a backlash because of this, and there was a good discourse around that.
[14:47] And then we had Cambridge Analytica and the Snowden revelations. So we've had this debate for a very long time.
[14:58] But in the EU we met this earlier. We had the ePrivacy Directive already in 2002, so we could have done so much more by leaning into it. We updated it to incorporate the cookie part. But there's also the whole part about how you can't put anything on someone's device,
[15:14] which basically comes from way back, when you had physical phones and someone would put a bug in them.
[15:20] We still have that rule.
[15:22] But I think lawmakers were not technically comfortable with, and didn't understand, what was happening. So I think that's a real, practical reason why they didn't do more. But then we got the GDPR, and there was a lot of hope.
[15:38] And then the ePrivacy Regulation didn't come. We wanted it to replace the directive with a stronger one,
[15:45] but it didn't come, because the ad tech companies were lobbying so hard against it.
[15:51] So it got delayed, and now it's basically tanked.
[15:55] So this is the history and the present: we're still very much struggling. This is very much the core of where
[16:06] we are still today. It's the same discussion. But I think,
[16:11] especially with what's happening today in the US, people can see in a much clearer way what's at stake.
[16:21] You can see that history is about to repeat itself. Because when you get access to this much data,
[16:28] it is not a coincidence that they're knocking on the ad tech door and asking for this kind of data to be used in swaying citizens, right? They know the value of this.
[16:38] And that's really, really dangerous. So on the one hand we have this civil society, democracy discussion, and then we have the financial, marketing side: how do we
[16:50] sell our products and have a healthy marketplace?
[16:54] And I think the key is
[16:58] to have a really well regulated,
[17:01] transparent market where these companies are regulated and of course respect certain things. Because, as many have shown,
[17:09] for example
[17:11] the advertising watchdog,
[17:13] the Check My Ads Institute,
[17:15] they're doing a tremendously good job at explaining this conundrum, and also putting the spotlight on what you're actually getting for your advertising dollars. Are you actually getting the bang for your buck that you think?
[17:30] And how do you know that?
[17:32] So they're really putting the spotlight on how opaque the system is, and how a lot of the time the system self-attributes its success, so you will put more money into it.
[17:49] And so basically it's really, really hard for companies to do the right thing because you are smack in the middle here and you have to sell and you have to Market.
[17:58] So what do you do?
[18:00] Right. And I'm not saying it's hopeless, because there are options, and there are definitely a lot of good things that come out of taking a deeper look into how you're working with data, being careful with it, and doing your due diligence.
[18:14] Because data is a resource, right? And you want to be careful with resources. You also want to be careful with what kind of data you share with others,
[18:22] because if you have first party data, why would you ever want to share that with external stakeholders who do not take care of this data?
[18:30] Isn't it better that you keep it, and maybe build your own and work more with first party data? That's one way of looking at it.
[18:38] Another way: we have an amazing contextual platform here in Scandinavia, from Norway, called Kobler.
[18:48] They're zero tracking, zero personalization, and they're an ad tech platform.
[18:54] But they are in direct link with, you know, established real magazines and publishers, serious ones.
[19:04] And so what they see and what their customers are very happy about is that it's a little more expensive,
[19:11] but you're really being shown exactly in the right moment, in the right context for your product.
[19:17] And you can trust that you're not on a made for advertising site or some really weird, I don't know, **** site or whatever. Right?
[19:26] So they're like really talking about the value of quality and what if you could control where you're actually shown? And apparently this is like a no brainer, right?
[19:36] If you're shown where your customers actually are, you don't need to know who they are,
[19:41] you know, what kind of car they're driving or whatever. You just need to know that they're interested in the subject. This is your subject, so your ad is there.
[19:50] Yeah. So I think there's a lot of hope and potential in a new type of ad tech. What we need to do is enable more innovation in this kind of area.
[20:00] And in order to do that, we need proper enforcement. We need politicians and lawmakers to wake up, really look at this, see it for what it is, and not buy into
[20:14] the argument that these regulations are stifling innovation.
[20:19] And so,
[20:20] while privacy regulations are incredibly important to have, and to keep fighting for and get in place, you guys now have like 20
[20:28] states that have them, right?
[20:31] At the same time,
[20:33] we also need to talk about product safety, because that's also an area that's been lacking.
[20:40] It's like, the Internet is not the real world,
[20:42] so a platform is not a product.
[20:45] But it is. These are some of the most dangerous products out there in terms of mental health, how kids are feeling, et cetera.
[20:53] I know you guys have really exciting people,
[20:57] two of whom have been on your podcast previously: Monique Priestley from Vermont, and also Lisa Levasseur from Internet Safety Labs. She just released a really interesting paper at Harvard regarding product safety.
[21:13] So I think that's also a really interesting aspect. Today we have a discourse in Europe and in Australia where we are talking about age verification for social media platforms.
[21:23] And of course it's good that we're discussing this, but you're attacking the symptom and not the core problem.
[21:29] Still, at least you're aware and talking about it.
[21:33] Debbie Reynolds: Thank you for the shout out for my past guests. And actually, we were talking about Check My Ads: Arielle Garcia has been on the show as well.
[21:43] Vibeke Specht: Yeah, I interviewed her for my book too. Because the thing is, I was really unaware.
[21:53] I was living in this cognitive dissonance between doing Facebook ads and thinking, oh my God, Edward Snowden, this is horrible, everyone's tracking. But you're part of this equation.
[22:03] So I couldn't figure it out, and I was asking so many questions. And I was really at that point of: I don't get it, am I insane?
[22:11] And then Arielle Garcia resigned from UM Worldwide. Her resignation article went viral, where she was just like,
[22:20] okay, I've had it.
[22:22] This industry is like, they just want to walk backwards into the future.
[22:27] And I contacted her, like, please, can you talk to me? I need to understand what's going on. And I really find there are so many good people, like you, Arielle,
[22:38] Lisa,
[22:39] so many good people out there who are really looking into things, exposing things, problematizing things, talking about it, lifting the debate to the level it should be at, and focusing on the right things.
[22:53] And I find a lot of hope in that.
[22:56] Debbie Reynolds: Aw,
[22:57] that's amazing. Thank you for sharing that.
[23:01] One thing I would love to talk about that you touched on. First of all, it's gonna be hard for me to keep this to an hour. It's really hard because all the topics you bring up are so interesting.
[23:11] But I want to talk about one of my favorite topics, and people rarely ever talk about it: the connection between privacy and antitrust.
[23:20] So I had done a piece in Bloomberg many years ago,
[23:25] before the privacy debate got super hot in the US, and I was saying antitrust and privacy are tied together. And I had someone tell me that they didn't think
[23:38] it was connected, that the two didn't intersect. I'm like, absolutely, they do. But I want your thoughts there, especially as it relates to how it impacts the scale of enforcement.
[23:49] Vibeke Specht: Yeah,
[23:50] that's a huge topic too, right? That's also one of those rabbit holes I went down. In the EU, we had a really strong commissioner called Margrethe Vestager.
[24:01] She's Danish, she had a really high profile, and she was very unapologetic toward the big tech companies.
[24:09] So she was part of the force that made sure we got the Digital Markets Act and the Digital Services Act, and primarily the DMA is, I guess, considered a sort of antitrust regulation, because it goes after the so-called big gatekeepers.
[24:24] The DSA is also going after them, in a different way; I relate that more to a product safety kind of framework. And the other one is more like: you guys are too big, you're stifling innovation,
[24:37] you're destroying the market. So we can't have the kind of diversity and healthy market.
[24:43] So you guys need to calm down a little more. You can't do anything just because you're big.
[24:49] At the end of the day, what are we trying to accomplish? What do we want to have? We want to have a healthy marketplace both physically and digitally,
[24:59] online and offline,
[25:01] where everybody plays by the same rules,
[25:09] where the fundamental human right to privacy is respected,
[25:09] where you respect people as citizens and as consumers.
[25:14] And in order to get all that, we have to have good privacy regulations that frame this.
[25:20] We need to have anti surveillance laws that are up to date with how technology works.
[25:26] And we need to have a marketplace where, as you get bigger, you can't just buy up other companies because you want to get rid of the competition, and you can't become even bigger by merging with another huge company.
[25:40] So there needs to be balance. Antitrust is pretty fundamental; most human beings understand the value of it. It was a long time ago that Standard Oil in the US was
[25:51] on the table, and we've been hoping for something like that to repeat itself with, you know, the antitrust court cases against Google and Meta. But it seems very, very difficult to get to that point.
[26:06] It's almost like the judges agree that this is a problem, but then they don't take the needed actions to actually split them up. Because, I guess, the easiest way to explain the link,
[26:19] if I'm thinking out loud, is that there were no privacy laws, so they have been able to collect as much data as possible and grow these businesses this way for many, many years.
[26:30] And they grew because of this. So their whole business model is based on there not being data protection regulations stifling them or holding them back.
[26:42] And so they've become too big. And as an advertiser, as a company,
[26:47] you don't have many options today when you want to find your customers online.
[26:53] Debbie Reynolds: I think this again is a fascinating topic. I guess the thing that has always frustrated me, especially in the digital age, is that a lot of these companies, in my view,
[27:06] are not becoming like the example you gave, Standard Oil, right? So when you think about that, let's say back in the olden days, the thinking was:
[27:16] there are a hundred oil fields, and we don't want one company to have a hundred oil fields, because if they do, they push other people out of the marketplace.
[27:26] They control the price. There isn't competition between them and other people, and that harms the consumer. That's the basic thought. But in the digital age,
[27:36] what I see these companies amassing, in my view, are data monopolies, right? I feel like a lot of the antitrust people are saying, okay,
[27:48] Meta can't buy another company that's like them in what they do. But then they can go to a whole different industry,
[27:56] buy a different company,
[27:58] and then merge the data that they have now with something else. And what they're getting is a fuller picture of an individual, and that
[28:09] powers this ad ecosystem, where that data is also sold to data brokers. So the data that a data broker has is not about how many shoes you bought on Amazon.
[28:23] It's about
[28:25] who this person is, what they do, what their behaviors are. And
[28:30] I guess we're seeing two things. We're seeing companies that want to target people from these very detailed dossiers, what I call a data dossier of someone.
[28:39] But we're also seeing it being used by law enforcement, without warrants, to track people. So those are some of the problems. What are your thoughts?
[28:49] Vibeke Specht: I really like how you talk about the data. Because what you're saying is, back when Standard Oil was a thing, oil was oil. Now data is the new oil, right?
[29:00] And that's an old saying, but it's still very,
[29:03] very much the truth.
[29:04] Also with the AI companies:
[29:06] they need so much oil, slash data, in order to build. And that's also interesting, right? Is this really a sustainable and innovative way of building AIs? Are we really solving
[29:18] the right problems? Because it ties into so much. We want to have a healthy, thriving, innovative marketplace; we need a situation where it's not a few actors that have so much money and power and control, which they do have now, right?
[29:37] And so they can control who we vote for,
[29:40] what we think.
[29:42] And a lot of these,
[29:43] both Meta and Google, have platforms: YouTube, Facebook, Instagram, whatever. We are so polarized as a society, in the US and also here;
[29:55] people have their own news feeds, right? We don't see the same things.
[30:00] And so this accumulation of power is so incredibly dangerous. That's the core problem we see now. And today, with the LLM bonanza,
[30:13] we've got so much money going into startups that are leaning into this AI concept. And at the same time, it's interesting, because we've been saying, yeah, this can help us solve world hunger and climate change and cure cancer.
[30:31] But at the same time, we now have reports of really important cancer research that is not getting the funding it needs, because all the money is going into this new thing.
[30:43] So it's really affecting us in such a fundamental way everywhere in our society.
[30:49] So I don't know if it's at the top of the list or the core of the thing, but I think antitrust is definitely extremely important to get right now, and to see this clearly.
[31:00] But then we see the effect of the extreme lobbyism we also have here in the EU, where we have more lobbyists in Parliament than we have lawmakers. We now see the effects of that with the Omnibus,
[31:13] where they're actually trying to get at the core of how you define personal data.
[31:19] And it's framed like it's in the interest of the small companies; that's always what they say, right? It's for the small companies. But this is their playbook.
[31:28] This is what's good for the big players, more than anything: keeping the status quo.
[31:35] And the argument is always: yeah, but we need to do this because we also need to have really good,
[31:41] strong, growing companies, just as the US has had. Which is also like, wait a minute, how good is that, actually? It's not that the US isn't amazing; you guys have so many inventions and innovations, et cetera. But are these big tech companies
[31:56] really what we need in terms of innovation? Is that the goal? Is that what we want to have here?
[32:03] Would that make our society better?
[32:06] I don't think so.
[32:07] Debbie Reynolds: I want to talk a little bit about apps and SDKs and technical and financial dependencies of those. What are your thoughts?
[32:19] Vibeke Specht: Yeah, so this is what I do every day, right? We run a startup called Peak Privacy, and we're basically focusing on mobile applications.
[32:28] So we're a software company, and we enable app publishers, or anyone who's interested, to scan their mobile applications to see what they're actually doing,
[32:39] because the devil is in the details. Everything we've been talking about also applies to apps. Usually you think about the Internet through websites, right? That's how you access it.
[32:49] But today, more than 60% of all web traffic is mobile-app derived.
[32:55] So that's a lot of traffic. And a lot of companies you wouldn't think would have applications, in the B2B sector, have a lot of applications for all sorts of practical things.
[33:07] But the structural problem is that you usually don't build applications from the bottom up, coded line by line; kind of like you don't build houses brick by brick and make the bricks yourself.
[33:17] You get ready-made building blocks, because that makes things a lot faster.
[33:23] But what we know is that these ready-made and often free building blocks come with dependencies, because the companies supplying them are the usual suspects in the tech sector.
[33:37] And it's also really hard because, as an app owner, you don't know what they're doing until you actually test them, right? When a user has the phone in their hand and is testing your app, that's when you see what kind of network calls it's actually making.
[33:50] And what we see is that companies are surprised, because they didn't think they were making these calls. For the purpose of their apps, this is not something they were expecting.
[34:03] So it's really hard for app publishers to do the right thing, because they don't know. That's kind of what we help to solve: we test the apps and enable publishers to see what's going on.
[34:15] And then we also explain what the data means from a GDPR and a privacy perspective, but also tap into the NIS2 directive, because it's about vendor due diligence. You have to know your whole supply chain, also when it comes to what your apps are doing.
[34:29] And then we also help companies check the ISO 27001 boxes and the SOC 2 boxes, and not keep this part of their company in a blind spot.
[34:41] We have a library of SDKs, of course, and that's growing, we see. So we're building that.
[34:46] But then at the same time, it's not always the SDKs. You can build in trackers in other ways when you're building the app.
[34:52] And there can be network calls derived from other components in your app, but the trick is to become aware of them and know what to do. And then you need to do that repeatedly because every time you do a new release or an update, something can change.
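The scanning workflow Vibeke describes, observing an app's network calls at runtime and matching the hosts against a library of known SDK domains, can be sketched roughly like this. The domain-to-SDK mapping and the captured hostnames below are illustrative placeholders, not Peak Privacy's actual data or method:

```python
# Rough sketch: flag observed network-call hosts that match known tracker/SDK
# domains. The mapping here is illustrative only, not a real vetted tracker list.
TRACKER_DOMAINS = {
    "graph.facebook.com": "Meta SDK",
    "app-measurement.com": "Google Analytics for Firebase",
    "api.adjust.com": "Adjust attribution SDK",
}

def flag_trackers(observed_hosts):
    """Return (host, sdk_name) pairs for hosts matching a known tracker domain."""
    findings = []
    for host in observed_hosts:
        for domain, sdk in TRACKER_DOMAINS.items():
            # Match the domain itself or any subdomain of it.
            if host == domain or host.endswith("." + domain):
                findings.append((host, sdk))
    return findings

# Hostnames as they might appear in a capture from a test session (made up).
observed = ["cdn.example-bank.example", "graph.facebook.com", "eu.api.adjust.com"]
for host, sdk in flag_trackers(observed):
    print(f"{host} -> {sdk}")
```

Because, as she notes, the calls can change with every release, a report like this would have to be regenerated on each update and diffed against the previous scan.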
[35:08] You always need to do your due diligence, rescan, and have the audit reports ready. So there's the technical part, where we're like, oh my God, why don't people know that?
[35:18] And of course they don't, because this is complicated.
[35:21] Then there's like, okay, but what does it mean?
[35:23] How do we fix it? Because, as you probably know
[35:26] from working with companies,
[35:30] organizations are like the Tower of Babel. Everyone is working in silos, and then there's a Bermuda Triangle between security, privacy and development,
[35:39] and the information just doesn't flow. And the privacy person who actually cares about the issue, or has this on his or her lap,
[35:45] has a really hard time asking the right questions, because they don't know what the right questions are. So what we basically do here is break the silos and the triangle, and make it really transparent for everyone, every stakeholder,
[35:58] to know what's going on and fix the issues.
[36:01] Debbie Reynolds: I'm glad that you're working in this space.
[36:03] I advise a lot of companies who have mobile apps, and a lot of times the people who are developing the apps are, like you say, taking the building blocks and putting them together.
[36:15] They're all thinking about, oh, these are the cool features we can do. They're not really looking at,
[36:21] you know, what is the app trying to look at?
[36:24] What are they plugged into?
[36:26] What information are they pulling back?
[36:29] Like,
[36:30] a good example:
[36:33] let's say an app maker created a flashlight app, and all it's supposed to do is show you a flashlight, right? But if you look in the guts of the app, it may be recording information,
[36:45] it may be tracking your location, it may be doing a lot of things that a person who
[36:52] wanted just a flashlight didn't want. They didn't want surveillance,
[36:56] they didn't want their location tracked. Or, you know, something as simple as push notifications. I did a video a while ago about push notifications.
[37:05] Push notifications sound very innocent, right? Okay, I get a notification.
[37:09] Actually, it's trying to figure out where you are,
[37:12] right? So part of that push notification is finding out your location. And so, yeah, thinking about that,
[37:19] all those different angles: apps and mobile apps are very different from the Internet when you go to your computer. And so a lot of people don't know the difference between those things.
[37:30] Actually, companies like mobile apps because they can get a lot more granular information about a person, their device, and how they use it than they could get in the same way if someone were just on a website on a computer.
[37:44] What are your thoughts?
[37:46] Vibeke Specht: It's interesting that you see this too. And I'm not surprised, because you are at the intersection of law and tech, right, and really knowledgeable in both areas.
[37:55] So.
[37:56] And it's just as Ariel says,
[37:59] because she talks about complexity by design,
[38:02] that things are opaque.
[38:04] And she's like, when you're asking and asking and you still don't get it, then you have to realize that you're not supposed to get it.
[38:11] Right. That's why the situation is so dark, or really confusing.
[38:18] And it seems like, how are we ever going to fix this?
[38:23] That's the feeling. I don't want to be a conspiracy theorist, but with this business model, if you want to keep getting as much data as possible, of course it's not in your interest for people to look into these issues, right?
[38:37] And it's so hard for consumers to do the right thing, you know, because we don't all have the time to become tech experts,
[38:45] et cetera, right? But we shouldn't have to. There should be certain product safety regulations that keep us protected in this marketplace.
[38:56] There should be a much higher baseline for what is okay and not okay, and not just for consumers, but also for the companies, because they don't have an interest in sharing their customers' data
[39:11] with other parties. What are they getting in return?
[39:15] Right.
[39:16] Of course there are app publishers out there who are hustlers with really weird game apps, et cetera, and they do not care. That's their business model. Right?
[39:25] But most grown-up companies don't have an interest in leaking this kind of data unknowingly and jeopardizing people's safety and
[39:34] privacy,
[39:35] but also their own safety and their own privacy. Right now we're publishing a report where we scanned apps in Scandinavia, with a focus on Sweden, in the finance sector, the health sector, and five other sectors.
[39:51] You know, where we're looking at these like applications that people can't be without.
[39:55] You have to do your banking, you have to do certain things. Right.
[39:58] And so there are millions of users. And what we see is that companies are like, oh,
[40:04] am I on this list?
[40:06] Because they do not want to be on the list, and they don't want to do the wrong thing. But how do you then do the right thing?
[40:13] Debbie Reynolds: Right.
[40:15] Very cool. Very cool. Excellent work.
[40:18] So, Vibeke, if it were the world according to you and we did everything you said, what would be your wish for privacy anywhere in the world,
[40:29] whether that be human behavior, regulation or technology?
[40:34] Vibeke Specht: That is such a good question. It's such a hard question. Very hard question. Oh, that's such a hard question. If I could wish for anything.
[40:47] Debbie Reynolds: You can have more than one. Some people have more than one wish.
[40:51] Vibeke Specht: What is your wish?
[40:52] Debbie Reynolds: Oh, look at you.
[40:54] Turn it back on me.
[40:56] I wish. Well, I've always wished this for the US: I wish that privacy were a fundamental human right here, which it isn't right now.
[41:06] That is codified in the EU Charter of Fundamental Rights, right? And so we don't have that here in the US. Also, I always thought, when the privacy discussions came up, that instead of Europe or South America or different jurisdictions having their own laws, there would be almost like an international treaty about privacy, at just a higher level.
[41:35] We as countries agree.
[41:37] You know, I feel like a lot of regulations are about how we disagree about stuff. Like, okay, can we just agree that personal data is this,
[41:46] or can we just agree that data brokers are bad? I don't know. So I wish there were kind of a higher, global something that people could agree to.
[41:59] Yeah, not to say that we're trying to be the same, but that there are some base foundational principles we can all adhere to. Say we agree fundamentally on this, even though we may come at it from a different place.
[42:14] I guess that was my wish.
[42:16] Vibeke Specht: But that's beautiful. I really like that a lot.
[42:19] And it also highlights that where we have democracy,
[42:24] you know, democracy is the least worst way of governing people. There's no perfect society. There's no perfect system.
[42:31] We should always be careful about driving toward perfection, right? There's no such thing.
[42:37] But, you know, you could definitely understand that some countries in the world would never sign something like that because for them, that is inherently not part of what they wish for, at least not from the ruling class.
[42:49] And if I could wish for anything, like, quickly right now,
[42:53] because I think a lot about how polarized we are as human beings, and how unhealthy it is for our kids
[43:02] to enter the Internet and be stuck in algorithmic flows.
[43:07] So if there's anything I could wish for quickly, it is a ban on those kinds of flows and business models.
[43:17] You're not allowed
[43:19] to addict people that way. You're not allowed to build algorithms that make people sick and make society sick.
[43:28] Because it's not just like cigarettes; it's worse. It's much worse than cigarettes. It would be better if everybody smoked than if they were on
[43:38] these platforms and flows.
[43:40] I think if we do that,
[43:42] then I think we could start having a much healthier conversation in our societies in general.
[43:50] That's a much better foundation. Yeah. To protect our democracy and have good debates and good conversations. Healthy conversations.
[44:00] Debbie Reynolds: I like that. I like that. Right,
[44:02] so. Right. So instead of having someone put something in your feed that tries to direct or change your action. That's the issue, right? The issue for me is: okay, you want to change my behavior by showing me something that would make me make a decision that I may not have otherwise made if I had better information.
[44:30] Vibeke Specht: Yeah. Right.
[44:31] I don't want the platform to influence who I vote for, or to influence me to stay home and not go and vote at all,
[44:39] or I don't want to talk about, like, misinformation per se, but it's the whole, you know,
[44:45] engagement equals enragement aspect that I feel is quite troublesome.
[44:53] Debbie Reynolds: I agree. I agree with that. I agree with that. Well, thank you so much. It's such a pleasure.
[44:59] Vibeke Specht: Thank you, Debbie.
[45:00] Debbie Reynolds: Talking to you today.
[45:01] Vibeke Specht: Pleasure is mine.
[45:04] Debbie Reynolds: I know the audience will find it as enlightening as I have. They should definitely connect with you on LinkedIn and check out your book, for sure.
[45:11] Vibeke Specht: Oh, absolutely. I love to make new friends and talk about these issues.
[45:17] Debbie Reynolds: So.
[45:17] Vibeke Specht: Yeah.
[45:18] Debbie Reynolds: Yeah. And I just want to throw out the name of the book: From GDPR Confusion to Privacy First Marketing.
[45:26] Vibeke Specht: Yes. It's a very long title. So.
[45:31] Debbie Reynolds: Very good. Very good. All right. Well, I'm sure we'll talk soon. Thank you so much for being on the show.
[45:37] Vibeke Specht: Thank you so much. Thanks. Bye.
[45:39] Debbie Reynolds: You're welcome.