E259 - Andreea Lisievici Nevin, Privacy and Tech Lawyer, owner at PrivacyCraft (Sweden)

[00:00] Debbie Reynolds: The personal views expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.

[00:12] Hello, my name is Debbie Reynolds. They call me the Data Diva. This is the Data Diva Talks Privacy podcast, where we discuss data privacy issues with industry leaders around the world, with information that businesses need to know.

[00:26] Now, I have a very special guest all the way from Sweden,

[00:30] Andreea Nevin. She is a privacy, digital, and AI lawyer at PrivacyCraft.

[00:37] Welcome.

[00:38] Andreea Lisievici Nevin: Hi, Debbie. Very lovely to meet you again.

[00:42] Debbie Reynolds: Yes, it's excellent to have you on the show. I love to have people, obviously from different countries, because I think you have so many things that we could talk about, even though we're in different jurisdictions.

[00:54] You know, there are so many commonalities. And so I would love for you to tell us your background and how you became the owner, or the principal, of your own company,

[01:04] PrivacyCraft.

[01:06] Andreea Lisievici Nevin: All right, well, the story begins quite some time ago, and it starts in Romania, which is where I'm originally from and where I graduated law school. Or maybe I can start even earlier, because before law school, I did the unthinkable of going to a programming college, an informatics college.

[01:25] So it was a pretty weird combination. At that time,

[01:29] everyone was asking me what I was thinking, doing these two very different things together. However, of course, today, looking back, it's perfect, really,

[01:40] because I don't think today, especially in a field like privacy,

[01:44] you can be very good at understanding what's happening with technology without having some level of technological background, which of course you can also get today. But for me, it was a really unexpected benefit to do it so early.

[01:59] So I graduated law school, I worked in a very big law firm in Bucharest for many years,

[02:07] more than 10, I would say.

[02:09] And then I started my own boutique law firm specializing in data protection. I was already working with data protection, but this was happening in 2016,

[02:19] which is when the GDPR had been approved. And I knew that things would get rather big.

[02:26] So I built this boutique law firm to offer services only geared toward data protection,

[02:33] GDPR compliance, actually.

[02:36] And in 2019,

[02:38] I started working at Volvo Cars, which is when I moved to Sweden. I was the head of data protection compliance for three years. And all of a sudden, I was not an outside counsel anymore.

[02:50] I was working in-house. I had to lead a global privacy program, with different laws applying and different requirements.

[02:58] And it was quite a different type of challenge than I was used to before.

[03:05] And then I started working at the Boeing Company, but still in Sweden, focusing on digital aviation services.

[03:15] So all of a sudden, I was seeing the world through the eyes of a processor for companies around the world.

[03:20] Again, very different than what I had been doing before.

[03:23] And last year, I actually decided to go back into consulting and build my own consulting company. I registered as a lawyer in Sweden, and here we are.

[03:37] So I would say along the way I also of course branched out into other types of data compliance because the field has really expanded quite a bit and not just with AI.

[03:48] And I also started to teach at Maastricht University in the Netherlands. I am a visiting fellow at the European Centre on Privacy and Cybersecurity.

[03:58] And there, one of the courses that I teach, for example (actually, I'm not even teaching it fully myself, of course), is in the Global Privacy Officer certification program, where I bridge the gaps between various jurisdictions.

[04:14] So all in all, I would say quite many things have happened since I was a young baby in law school, and I've seen this field go from being very minor to being hugely important and becoming a board-level topic of discussion.

[04:35] Debbie Reynolds: That's quite a journey, I think.

[04:37] I love to talk to people who were in it like before the GDPR came out because we know what it was like then.

[04:44] Like no one really cared about this. It was just, you know, like an asterisk at the bottom of a to-do list that no one really thought about.

[04:50] So the fact that organizations or corporations started to take it more seriously as a result of the GDPR, I think that's been a major reason why this has grown into such a big field.

[05:03] And even though I know a lot of people rail against it, the GDPR really did set a bar and set a standard, and at least started the conversation for companies to understand what they should and shouldn't do.

[05:16] A couple of things you've done in your career that are really fascinating and I think you're probably the perfect person to talk about this. So you've worked inside law firms,

[05:26] you worked in corporations,

[05:28] you are a professor, a fellow at a university, and now you have your own consulting business.

[05:35] So that's a lot of different hats.

[05:38] How did you navigate all those changes?

[05:43] Andreea Lisievici Nevin: With some difficulty, I will admit, because it's very tempting, especially after having been a lawyer in a big law firm for many years. In my small law firm, it was not super different.

[05:58] But then transitioning into an in-house role was quite a challenge, because I knew the legal rules really well,

[06:09] extremely well,

[06:11] because I happen to read a lot, I know a lot of things from many jurisdictions, but that's not sufficient. I would say that that's actually sometimes secondary to knowing how to make things happen.

[06:26] And by that I don't mean appeasing other people and saying the right things, but rather operationalizing compliance. Because explaining what the rules are, explaining why they are the way they are, how they've developed, what needs to be done is different from actually looking at, okay, what processes are there in this company,

[06:47] especially in a group,

[06:49] and how do we extend these processes or how do we embed a new process that achieves the desired outcome. There's the difference between saying, for example,

[07:00] oh, when you need to respond to a data subject access request, you need to provide this and this and this information,

[07:06] let's just say, to keep it simple. But actually, when you're working in-house, you need to know how to make that happen. Like, okay, we get the request in this way, it goes into this system.

[07:17] Who is it allocated to? Then, who actually needs to provide the information? Because it's never the privacy people that have to actually find where that information is. They just package it.

[07:28] Someone else needs to be responsible, and actually many someone elses need to be responsible, to provide that information and to do it in the required time. So there's a lot of,

[07:38] of course I'm going to use the keyword, accountability to be built. And especially some years ago,

[07:46] I would actually say it's not great today either.

[07:49] But some years ago, especially during 2018 and 2019, it was terrible, because, and I'm sure I'm preaching to the choir here, the privacy team, if there even was a team, or the privacy person, was seen as: okay, you exist in this company, therefore you do everything and you tell us that you've done it.

[08:10] You come to me, the business person, so that I give you information, not so that I do the work.

[08:15] It was quite difficult. So I had to find ways to adjust what I do, how I do it, what I request from people, when and how I request it, and also how to make sure that different requirements across different jurisdictions are at least capable of being met through processes that don't differ too much.

[08:39] So I would say that that was a challenge that I had not really foreseen.

[08:46] Of course, I made it through and I learned quite a lot of things. And now part of my business is coaching other privacy professionals, so that they don't have to face these things alone.

[09:00] And that's actually when I realized that not being prepared is by design.

[09:06] It's not a bug, it's a feature. Everyone will at some point be in situations that they're not prepared for, or that they haven't really been in before. So they just need to work it out.

[09:18] It's best if you have someone to do it with, if you have someone more experienced to guide you, or at least to discuss with and exchange some ideas.

[09:29] Another quite interesting point was when I switched from working in a European-led group of companies to working in a US-led group of companies, because the mindset and the company culture are different.

[09:47] Even if the approach to compliance is, let's say, similar in the sense of okay, we meet the requirements, we need to have processes in place and so on. Willingness is not that different.

[09:58] I would say,

[09:59] however,

[10:00] the attitude toward how things are done and who does them is different, and I think that comes from a different risk management understanding and experience.

[10:12] And also, one of the big lessons that I came to realize, sometimes the hard way, is the difference between what the privacy function, and especially the data protection officer, means in Europe versus what the privacy function, and especially a chief privacy officer, means in the US culture.

[10:31] And I was not formally in either of these roles, obviously, but I think what they mean in these two different types of cultures really informs attitudes toward people in the privacy function in general.

[10:46] Because, at least in my experience, in the US-centric, US-based type of culture, privacy people, and especially the privacy leaders, are much more active, much more involved in decision making, and are the ones to even take some risks every once in a while.

[11:02] Of course, this depends on various internal rules, but they're not just advisors, and they're definitely not what the data protection officer is in the EU, which is literally not the person to take any decisions.

[11:17] The DPO is literally only the one to tell you how things should be, whether something you want to do is where it should be or not, and sometimes, but not necessarily, how you can get it there.

[11:30] So there's a very, very different construct of privacy teams, I would say. And it's also a bit of both a learning journey individually and an education journey externally, to try to bring people to really the same table talking about the same things, because we use the same terms, but they don't mean the same thing all the time.

[11:52] Debbie Reynolds: I never really thought of it that way because I have clients in the US and I have clients in Europe as well.

[11:59] And I think that's true.

[12:02] I think that totally is true. Right. You're probably wearing a lot more hats in the US, and you're expected to do a lot more. But one of the things you touched on, and I want your thoughts on this, is that what's very important for someone to be successful in this type of role is really knowing how to communicate and reach out to these teams.

[12:24] Because, like you say,

[12:26] once you wear that hat, sometimes people feel like you have to do everything. And why are you asking me this question? And I have to do more work and this isn't part of my job description.

[12:36] So how do you foster or how do you try to create a situation where you can have conversations with people in different areas of the business, get their cooperation or even their championship of what you're trying to do as the data privacy officer?

[12:56] Andreea Lisievici Nevin: Oh, that's a very broad question that basically sums up building a privacy culture.

[13:04] And it's not easy. Whatever I answer, it's not really going to capture everything, and it will be an oversimplification, guaranteed, because there's really no single way to do this.

[13:18] Of course, there are a few components to it, and one simple component is to preach really what your role is, what you are there for,

[13:30] what you are not there for, and basically when the others can and should come to you and when you will and should go to them and what for.

[13:42] But the thing is,

[13:43] this all has to live through accountability,

[13:47] through internal processes,

[13:50] through roles and responsibilities.

[13:52] And not least, and I'm only saying this last because it was all leading to this point: through tone from the top.

[13:59] When you are in any role in the privacy team, even if you are at the top of a privacy team,

[14:06] even if you are considered an executive, a VP, whatever your role is,

[14:10] you will have a tone from the lateral only.

[14:13] So unless the business leadership from the highest level is the one to preach the importance of complying with data protection rules along the lines of this is a value to us.

[14:26] We want to do this because, for example, this is what we think is right and so on, not just to tick a box.

[14:32] Without this,

[14:34] whatever you say and whatever you do will be a checkbox really for the other ones.

[14:42] So you can't really build a privacy culture alone, by yourself, meaning in the privacy team or from the privacy team. You really need partnership at the top level of the business leadership.

[14:57] And in a way, they are the ones to build the privacy culture and the privacy accountability with your help.

[15:05] You are basically the one to equip them with the right message. But if they don't do it, you can't do it. It's very, very, very difficult. It usually only happens when something really disastrous happens in the company and it touches on data protection, because then people learn the hard way, obviously.

[15:22] But if nothing really bad happens, then you really need accountability through tone from the top.

[15:31] Debbie Reynolds: I agree with that. Right. Because you do have to be able to understand how data flows through the organization.

[15:38] You have to be able to understand and talk with people at different levels of the organization and get their cooperation. And, like you say, like I used to say, it's like being stung by the bee.

[15:49] So once you're stung by the bee, then you fully understand like what you're supposed to do. You get that level of cooperation. But otherwise it really does take the tone from the top.

[15:59] So I totally agree with that.

[16:01] I want your thoughts about certification.

[16:04] People call me all the time about certifications. Are they helpful? Hurtful? I mean, what is your stance on that?

[16:11] Andreea Lisievici Nevin: That's a complex question with a lot of answers. To be honest, I don't have a very straightforward answer. I have a more complicated answer though.

[16:18] I have a few certifications myself,

[16:21] yet I'm not one to preach certifications to anyone, of any kind.

[16:27] Because there's a big difference between certification and knowledge.

[16:31] Certification only proves that you know the things that are in the scope of that single certification program at that single moment in time. It's really a badge to show that you know some things, especially to people that are from outside of the field and they can't really gauge what you know.

[16:51] It's not really useful.

[16:53] For example, for me, if I want to hire someone or if I want to evaluate someone, I will really not care about their certifications, because certifications don't prove that you know what the company you work for needs you to know.

[17:08] It can prove that you know some data protection rules or some privacy management rules. Anyway, there's a big difference, like I was mentioning, between knowing the law and knowing privacy management, or governance in general, really at any level, in any sort of compliance.

[17:24] But this is why I think certifications can only come on top of actual knowledge.

[17:30] If a privacy professional or any other professional is looking at a certification program as a learning journey,

[17:37] as a way of breaking into that field, and let's just stick to privacy, right, or AI nowadays: you cannot go for a certification program thinking that this will give you the base.

[17:49] I don't think that works. I would go for a very, very, very in-depth type of certification when I already know quite a few things.

[17:59] So I think this is how I would separate them. Because if you just go for certification in the first place,

[18:06] the education preceding that certification is, of course, going to walk you through a few things,

[18:14] but not all of the things.

[18:16] And I'm not a fan of that. I think that in order to know one specific data protection law,

[18:25] one jurisdiction, I mean, or doing privacy management,

[18:29] or doing AI governance (by the way, management and governance here are totally interchangeable), or any other, you know, compliance-related governance or the substantive rules,

[18:42] it's much more important to have a comprehensive education about that scope that you're looking toward, and then, you know, not study for the certification. Like, study the field and then see which parts of it are in the certification.

[18:58] Because if you look in any certification program,

[19:00] what's in scope of the exam is only a part, let's say 70% to 85%, of the actual scope of knowledge that you should have.

[19:11] And if you study for the certification only, you're going to miss a lot of things always.

[19:17] And of course not all certifications are created equal.

[19:21] There's also quite a bit of a decision point to be made as to which certification to go for. And some are more famous than others,

[19:30] which I don't think is necessarily justified.

[19:33] But it is what it is and I think certifications are very important today,

[19:38] at least at a certain level, more entry to mid-level, I would say, in the field, for employers, for potential employers. And I look at job descriptions.

[19:49] Most, if not almost all, say that it's preferred to have one or several certifications, or equivalent. And my guess is that this happens, like I was saying before, also due to the fact that it's hard to evaluate these people for companies that don't already have more senior specialists in that field.

[20:12] It's of course very useful to be able to refer to an external examiner, let's say, that has already looked at what these people know and has already assessed a certain threshold.

[20:28] Debbie Reynolds: Yeah, I agree with you 100%. I have the same feeling that you do about certification. Right. Because if I know a field and this has been true for any field that I've ever been in,

[20:40] like that's the last thing I would look at. Like I would not care about their certification. Right. So I would want to know what is their fitness to do the job.

[20:49] You know, and I want your thoughts here too.

[20:52] I feel like a lot of what we do in data protection really boils down to communication.

[20:59] You can't really teach someone to communicate if they're not a really good communicator, and a certification isn't going to make someone a good communicator.

[21:09] So I agree with that, and I don't think there's anything wrong with the pursuit of knowledge.

[21:17] And I do advise people, especially if they're trying to break into the field or they want to maybe level up a bit, that a certification could definitely help them.

[21:26] But, like, I know people who are super senior, who say, you know, I was doing this before it was ever a thing, so why would I get a certification? That doesn't make sense.

[21:35] So I totally agree with you on that.

[21:38] Andreea Lisievici Nevin: Yeah. So we are literally almost saying exactly the same thing here, so I'm happy to hear that.

[21:47] Debbie Reynolds: So what's happening in the world today in privacy or technology that's concerning you most?

[21:58] Andreea Lisievici Nevin: It's hard to pick one, to be honest.

[22:01] It's a few things, and

[22:04] I think I have to go first and foremost with the push for deregulation,

[22:08] which I think we see in more directions than one. And obviously the US is kind of leading on the front of deregulation nowadays, especially with the new AI plan.

[22:20] But there's this whole push and pressure on the EU to ease up on regulations. And when GDPR came into force,

[22:29] Anu Bradford coined the term the Brussels effect. And now I think we're seeing something called the reverse Brussels effect,

[22:37] where, instead, the approach to regulation from Europe is in a way being pushed back.

[22:45] And there are big tech companies and even governments that say this is a hindrance to business and it should happen less.

[22:54] I don't support that view at all. I don't think regulation is a problem.

[22:59] And I find it a little bit,

[23:02] maybe even funny, to look at China, which also doesn't have a problem with regulation. There's this big AI race and so on, but China, and businesses, don't have a problem with China being quite heavily regulated, and it seems like they're doing really well even so.

[23:20] So I don't think regulation is the problem. I think it's perhaps the way in which the regulation is applied, and not having uniform rules across several jurisdictions, that is a real burden.

[23:34] And just to use automotive as an example, there are the UNECE regulations, which I'm sure you're very familiar with as well, because we have crossed paths in automotive,

[23:44] where it's quite a bit more than just the EU, or just a few countries, coming together and having a more uniform set of rules on homologation. Something like this would be ideal, of course, if we could have it in other fields as well.

[24:01] I don't really think that we're close to having that happening, unfortunately,

[24:05] because, and I don't really know why, I assume countries are a little reluctant nowadays to give away some of their sovereignty, which might be more of a perceived problem than a real problem.

[24:19] But I will say this.

[24:21] In Europe, well, wider than the European Union, we have the Council of Europe's Convention 108, which is old by now. But when the GDPR was approved,

[24:33] the convention itself was modernized into Convention 108+, and it is still not in force, because it has not reached the minimum number of member states recognizing it.

[24:48] So I wonder why that is.

[24:50] It's quite a strong signal, I would say, because it's the same states that were and are members of Convention 108, and somehow they didn't want to jump on the bandwagon of the modernized rules.

[25:07] And yeah, I'm not fully sure why this is happening, but I think it's a pretty strong message that uniformizing regulation,

[25:14] of course, more broadly than the European Economic Area, is not very easy to do.

[25:22] So while I think this is necessary, I also see that it's difficult. So we will see how this part will happen, if at all.

[25:31] There is, like I was saying, the deregulation push, which also doesn't help here.

[25:35] And I think that it will be practice and facing potential or actual sanctions in multiple jurisdictions that will in the end lead to a factual uniformization.

[25:47] Although I did think this after the GDPR too, and we're still not quite there.

[25:54] But also, let me say this: seven years of the GDPR in application is not a lot.

[26:01] Of course we did have the previous directive in Europe, but it wasn't really at the same level of importance of enforcement. It really didn't have the teeth that GDPR has.

[26:11] So I think looking back at things after seven years and saying that it's not working that great or there are still things to be done is a little early.

[26:20] So I think we need to have this conversation again in five,

[26:24] maybe even 10 more years and see how it's going then.

[26:28] Now, in terms of other things that worry me, not just the deregulation part, let me also say why I'm worried about the deregulation part.

[26:37] I don't believe in self-regulation. I don't believe in "we're just going to make corporate rules or non-binding rules and we're just going to stick to them."

[26:47] I think that, especially when we talk about really important things like human rights, and data protection and the protection of private life are human rights, at least in the EU and also in other countries around the world, and also safety,

[27:00] which a lot of AI and other digital products and services really touch on, I think we need rules. We really need a societal control rather than individual control over how things happen.

[27:16] Because this is also the problem that we see with laws that are very focused on consent.

[27:23] Consent is seen as the ultimate control given to individuals.

[27:27] No, I don't agree.

[27:29] And of course I'm not against asking for consent when this is a good idea. But I think that having a societal control that says,

[27:38] okay,

[27:39] these are the baseline limits, the baseline conditions, and only above these you can actually put something on the market and either ask for consent or have another legal basis, whatever is the choice there.

[27:53] I think that's better, because with consent you put the burden of finding out what's happening, why it's happening, how it's happening, how long it's happening, who it's happening by, and so on,

[28:04] on the person that really just wants to access a product or a service, or sometimes doesn't even want to, but needs to.

[28:11] So I don't think this is fair.

[28:13] It's also, by the way, why I really like legitimate interest as a legal basis in the EU instead of consent. But anyway, I'm advocating a lot more for this level of societal control rather than individual control.

[28:26] So this you can only do through regulation. Of course, to some extent it can also be done through non-binding rules, but then you don't have uniformization,

[28:35] then you don't really have the same rules applicable to everyone. So when it's optional, it's optional.

[28:40] Debbie Reynolds: I agree with that. Wow, that's like a lot of stuff to think about. But I will tell you,

[28:47] when the GDPR came out, I was asked to be on television to talk about the GDPR in the US and why it was important and why it was a big deal.

[28:56] And that was in May of 2018 when it came into force. Right? It's so funny. Let me back up. So on May 25, 2016,

[29:06] when the GDPR officially became a law,

[29:10] I thought I would wake up that morning and everybody would know about this and everyone would care about privacy and data protection. And in the US,

[29:19] like, nobody, there was nothing on the news, no one said anything. And I thought,

[29:25] oh my gosh, this is terrible. Right? And so, running up to the time when enforcement came into play, that's when the news agencies kind of started picking up on the GDPR and talking about it.

[29:37] And it's funny, because people still ask me about that interview that I did on TV about the GDPR, because a lot of the things I predicted actually came true.

[29:45] But although I know I have a lot of friends in the EU that are frustrated by some of the things that have happened since the GDPR came out, it has had a major impact, to me, on

[30:00] US companies,

[30:01] and companies in other regions. Because without it, you can't really talk about data protection without talking about the GDPR, because there really was no baseline before that.

[30:14] Andreea Lisievici Nevin: Right.

[30:15] Debbie Reynolds: So there was nothing to look at. Nothing. You couldn't benchmark yourself against anything. Right. And so I think the GDPR has been majorly influential in the US even though we don't have a federal law.

[30:28] What we saw after the GDPR was passed, not only in the US but in other jurisdictions, was

[30:34] other jurisdictions picking, maybe cherry-picking, certain things from the GDPR that they wanted to model their laws after. And even in the US, on a state level, we've borrowed a lot from the GDPR, maybe not everything,

[30:47] but a lot of the terminology and things like that have come into the US in terms of law. So for me, I think that's a good thing.

[30:56] Even though we have not yet gotten to a situation where we have any type of federal law. As you can see, we call the U.S. the United States; I think we're more like 50 countries.

[31:11] It's not like states, because, I mean, just going from one state to the next is a major, major difference. So I think that's really interesting. But I want your thoughts: now that we're looking at the EU AI Act, I feel like it's going to be very similar,

[31:27] probably even more so, because now we have the US saying, you know, we don't want to regulate anything in AI. We want everything to be a free-for-all.

[31:37] But I feel like if you have a jurisdiction that has some rules,

[31:42] some regulation,

[31:43] some foundation, it's got to be very much like the GDPR, where it becomes almost like a standard, because no one else has actually done that.

[31:52] So you have to kind of find a way to benchmark yourself against something. But I want your thoughts.

[31:58] Andreea Lisievici Nevin: I think that's true.

[32:00] And of course, you know, lobbying is lobbying, and it will continue to be so. And we still have about a year until the full application of the AI Act.

[32:11] But I will also say something else, and that is that, unlike the GDPR, which,

[32:18] as the name suggests, is general and applies to any processing of personal data, which by the way, even that concept has evolved greatly,

[32:26] the AI Act is not really the same.

[32:29] I'm not even getting into the fact that it's a product liability regulation, as opposed to a human rights and human protection regulation, if you will. But the AI Act doesn't cover all AI.

[32:42] So I many times see this attitude of: oh, the AI Act will come, and everyone that does something with AI will have something to do under the AI Act.

[32:53] That's not exactly true. Obviously, you can't do prohibited AI in the EU, but that's just a few practices. And high-risk AI is, okay, not as few, a little bit more, but it's still a very finite list.

[33:10] So most of the AI Act will address high-risk AI. If you're not in that list,

[33:17] you have few obligations, mainly transparency.

[33:22] So I think that the effect will not be as great.

[33:27] Also because of this very different way of application,

[33:31] you might actually not be in scope of the AI Act, even if there is AI involved.

[33:38] So while I agree that once the AI Act starts to fully apply (some of it is already in force, of course), it will uniformize practices,

[33:47] And especially where we talk about general-purpose AI, I think that might be the biggest uniformization effect.

[33:55] If we're talking about AI systems,

[33:58] then I see the impact a little bit lower because a lot of it will just not be in scope.

[34:04] Debbie Reynolds: I agree. And once I looked at the EU AI Act,

[34:09] frankly, I was like, why are people so up in arms about this? Because for probably the typical company, the way that they're using AI wouldn't even be covered.

[34:19] Like, this law wouldn't apply to them based on what their uses are. But the thing that I do like about the EU AI Act is that they do rate those uses based on harm.

[34:32] Where we're more like, let's innovate and let cars run over people, and we're innovating, right? So thinking about it in terms of harm, I think it's a lens that we haven't seen as much in technology.

[34:46] Maybe we've seen it in, say, products liability,

[34:49] like, people understand that, right? But when we're talking about software, they don't understand it in the same way. So I think it will be interesting, and I love that point of view. I agree that,

[35:02] you know, a lot of the uses, the everyday uses that people have with AI won't be impacted by this act,

[35:10] at least some.

[35:11] Andreea Lisievici Nevin: And I'm not saying that that's good and I'm not saying that that's bad. It's just something that we need to be mindful of.

[35:17] It's just not going to be as big a wave as the GDPR was, I don't think.

[35:23] Debbie Reynolds: Yeah,

[35:24] well, the GDPR is still a big wave. I actually saw a guy on LinkedIn, he was talking about how he hated the GDPR because it was just such a hassle,

[35:35] you know, with his business. And people were asking, like, what are you doing? In my view, a lot of what the GDPR was doing was trying

[35:45] to find a way to let people know what they should be doing with their data. Right? So I feel like companies that are very good at data management are probably already doing a lot of what's there, and maybe they just need to level up a bit, as opposed to having nothing in place,

[36:07] and just doing wild, crazy things. But I want your thoughts.

[36:12] Andreea Lisievici Nevin: You just uttered the keywords. Because data management, or data governance, whatever we want to call it, is really key, and not just to GDPR compliance but to data compliance in general. Because if you look at GDPR compliance, or any other data protection law compliance, as: okay, it has this requirement, this requirement,

[36:31] this requirement, this requirement,

[36:32] you end up in a checklist approach.

[36:37] This is the checklist, this is what you need to do.

[36:40] And maybe if you take process by process in your company,

[36:44] you might be able to tick all of those boxes.

[36:49] But the thing is, you are not going to tick, if you will, the box of systemic compliance. Because if you don't have something as simple as okay, we have the data in this process which goes through these systems and here's when it might be repurposed into something else for another purpose,

[37:08] for another system, for another service, whatever it is,

[37:12] then you're going to completely miss analyzing the lawfulness of this repurposing. And this is a very, very, very important non-compliance area that I see a lot on both sides of the Atlantic, in my clients and other people's clients, to be honest.

[37:28] Because if you don't have internal, in-company ownership over data, and if you don't have a process where these owners get to check and say whether this can be used or reused or not, then you're going to completely miss secondary uses or reuses, and you're just going to,

[37:49] on the face of it,

[37:51] have compliance, but you're not going to have real compliance. You're just going to have a little bit more than paper compliance, which is really not sufficient.
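To make this concrete, here is a minimal sketch, in Python, of the kind of owner-approval gate over data reuse that Andreea describes. The register, dataset names, and purposes are all hypothetical, invented for illustration rather than taken from any specific tool:

```python
# Hypothetical register mapping each dataset to its owner and the purposes
# that owner has approved -- a stand-in for a real data governance catalog.
APPROVED_PURPOSES = {
    "customer_emails": {"owner": "crm_team", "purposes": {"support", "billing"}},
}

def may_reuse(dataset: str, new_purpose: str) -> bool:
    """Return True only if the dataset's owner has approved the new purpose.

    Anything not explicitly approved should be routed to the owner for a
    lawfulness review instead of being silently reused.
    """
    entry = APPROVED_PURPOSES.get(dataset)
    if entry is None:
        return False  # unknown dataset: no silent reuse
    return new_purpose in entry["purposes"]

# Example: reusing support emails to train a marketing model is blocked
# until the owner reviews and approves that secondary purpose.
assert may_reuse("customer_emails", "support") is True
assert may_reuse("customer_emails", "marketing_model_training") is False
```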

[38:01] Debbie Reynolds: I agree with that, I think,

[38:03] and I want your thoughts here too. I feel like companies are very good at collecting data, but they're not as good at tracking the data all the way through the data life cycle.

[38:14] And that's where a lot of the problems happen in data protection. What are your thoughts?

[38:19] Andreea Lisievici Nevin: Absolutely. And first of all, there's this free-for-all mindset: once the data is in the company, we can use it for whatever.

[38:26] Right? Which is what a lot still think.

[38:29] And secondly, even with the GDPR's requirement to have records of processing activities, you're not really that far from it, because in theory the records of processing activities could be some static,

[38:43] let's say Excel sheet or document, or even if it's a more complicated system,

[38:48] it's not necessarily tied to the actual systems, to the actual data.

[38:53] It's not going to follow the data. It's not going to alert you when that data moves. It's not going to alert you that the retention time that is listed in the records of processing has been reached for this data point and not for the other one.

[39:11] So you need actual systems and actual processes that preferably do these things for you automatically, which nothing in the GDPR will do, not even the records of processing activities, not the DPO.

[39:25] And all of these things which are really external, they're looking at how things are happening. They're not making the things happen.
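As an illustration of what such an automated process might look like, here is a minimal sketch in Python of a retention check that follows the data rather than sitting in a static document. The policy entries, field names, and the `store.find` interface are all assumptions invented for this example, not any specific product or GDPR-mandated mechanism:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical policy list: each entry ties a data category and the system
# holding it to the retention period declared in the records of processing.
RETENTION_POLICIES = [
    {"category": "support_tickets", "system": "crm", "retention_days": 730},
    {"category": "job_applications", "system": "hr_portal", "retention_days": 365},
]

def expired_records(store, policies, now=None):
    """Yield (system, record) pairs whose declared retention has elapsed.

    `store` is assumed to expose find(category=..., system=...) returning
    objects with a `created_at` datetime -- a stand-in for whatever real
    data inventory a company maintains.
    """
    now = now or datetime.now(timezone.utc)
    for policy in policies:
        cutoff = now - timedelta(days=policy["retention_days"])
        for record in store.find(category=policy["category"], system=policy["system"]):
            if record.created_at < cutoff:
                # Candidate for deletion or review: alert on this specific
                # data point, not on the whole category at once.
                yield policy["system"], record
```

Run on a schedule, a check like this alerts on each data point individually, which is exactly what a static Excel sheet of processing activities cannot do.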

[39:33] Debbie Reynolds: Very true. I agree with that. A lot of common sense here.

[39:38] So if it were the world according to you, Andreea, and we did everything that you said, what would be your wish for privacy or data protection anywhere in the world, whether that be regulation, human behavior or technology?

[39:53] Andreea Lisievici Nevin: I would like for data protection, compliance with any rules, to not be seen as a burden. Not something that we need to do in order to be on the market, but rather something to be done because it's the right thing to do.

[40:13] So basically ethics, the ethical approach to using data.

[40:17] I would like businesses, and any entities really, to see the value in respecting not only people's choices, but the data about people. Because in the data you have people, whatever the data is.

[40:32] And of course, the trickier it is, and the more you go into profiling and inferring data and so on, the more sensitive and revealing these conclusions can be.

[40:42] So today I still see a lot of the attitude toward data protection compliance being another thing for us to do before we can put this on the market.

[40:53] But at the same time,

[40:55] the same people don't really flinch when there's paid vacation,

[41:00] when there's taxes to be paid.

[41:02] These things have just become second nature. You just know that they're there and they're part of the whole mindset, really.

[41:11] But with data protection, it's like, oh, why does this exist? Why do we need this?

[41:16] And I would like it to become as mainstream, and really for maturity to evolve, to see this embedded into how things need to be done.

[41:26] Because there is value in protecting people,

[41:31] in minding people,

[41:33] in respecting people, and in not processing data if people don't want to.

[41:38] Debbie Reynolds: Really?

[41:40] I love that. I'm giggling as you were saying that. I agree. I think it should be table stakes. Right.

[41:45] So with your data, if you're managing it the best way that you can, you're probably already in a good posture.

[41:54] You just may need to do a couple of other things to get this, you know, into

[42:01] the ethos of the organization. It needs to be part of the culture. Right. It can't be something that's,

[42:06] like, tacked on at the end or thought about at the last minute. So making it more foundational, I think will make it easier for companies,

[42:13] and I'm hoping in the future, even for newer companies,

[42:16] that it's kind of embedded in the DNA of those companies and it'll be easier for them.

[42:22] Andreea Lisievici Nevin: Oh, I do agree.

[42:23] Debbie Reynolds: Excellent. Thank you so much. This is amazing. I love talking with you. You have such great insights. I'm sure the audience will love this as much as I do.

[42:32] Andreea Lisievici Nevin: Thank you so much, Debbie. Thank you for the invitation. And have a good day, everyone.

[42:37] Debbie Reynolds: Yeah, thank you so much. I'll talk to you soon.

[42:40] Andreea Lisievici Nevin: All right.

 
