E265 - James Robson, Data Protection Officer, The Labour Party, Privacy and Data Sharing Specialist
[00:00] Debbie Reynolds: The personal views expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.
[00:12] Hello,
[00:13] my name is Debbie Reynolds. They call me the Data Diva. This is the Data Diva Talks Privacy podcast where we discuss data privacy issues with industry leaders around the world with information the businesses need to know.
[00:26] Now,
[00:27] I have a very special guest on the show,
[00:29] James Robson. He is the Data Protection Officer for the Labour Party in the UK, correct?
[00:39] James Robson: Yeah, that's right. Thank you so much for having me on.
[00:42] Debbie Reynolds: Well, I'm excited to have you on. I always love to see people and the things that they share on LinkedIn and you obviously caught my attention and I thought, hey, let me ask James and he'll be on here.
[00:54] I love the things that you post. It's really interesting. We're definitely going to get into the thing about data sharing for research. But can you talk to me a little bit about your background and how you became the data protection officer at the Labour Party?
[01:10] James Robson: Sure.
[01:10] I sort of come from a bit of a non-traditional, non-techy, non-privacy, non-legal background. I went to film school back in 2003 and did a film degree at the National Film School of Wales, came out of that not really knowing what I wanted to do with my life, and had an opportunity to go to London.
[01:32] I'm from Cardiff; I'm from Wales originally. I had the opportunity to go to London and just get involved in television production. So I started out as a runner pretty much soon after university and then built a little career in television.
[01:47] So I became a runner, researcher, associate producer, all on the factual side of television. And the longest gig I had, for those in the UK who'll know the TV show, it's still on, was with a show called Homes Under the Hammer with Martin Roberts.
[02:04] And we went around the auction houses and found people to come on the show, and then went to see the houses before they were done up and then after they had been done up, and you had the end reveal and things.
[02:15] And that world was fascinating and I really enjoyed understanding what was behind the working parameters of a television show and a television production company,
[02:28] an organization.
[02:30] And where my passion has really developed, and I'll fast forward a little bit shortly,
[02:37] is understanding how things work,
[02:40] the nuts and bolts of how bits and bobs flow. Why does information go this way? Why do people like this? Why do people work like this? Why does somebody get interested in something?
[02:51] And so at the end of that sort of three, four, five-year career, I made an Internet drama series that I put out in Wales, where I'm from.
[03:01] So it was a couple of years of my life before I was sort of in the wilderness and thought you know what? I don't think I want to be a filmmaker.
[03:07] I've done it, I've done the hard yards.
[03:09] And then came back to London because I nipped back to Cardiff for a bit and managed to get into sales and I started from the bottom. So restarted a career sort of just about turning age 30 and I was training people.
[03:28] I found out I was really good at kind of retaining security and technology knowledge and then onboarding new starters and found passion in being able to deliver that information to people so they could articulate it themselves.
[03:43] And so I was headhunted a few times in that sort of industry and essentially became a sales manager in the IT security industry and built up the careers of kind of new starters,
[03:55] kind of first time jobbers in the IT security sales industry.
[04:00] And within that I got to build learning and development platforms to be able to support that. So I learned how data flowed within those kinds of platforms, things like Workday, which is a very big LMS platform.
[04:14] I built out Salesforce,
[04:16] a platform for managing leads, where you have the contacts and the organizational details and you update the data, those kinds of things, and then became more focused around having to articulate the new developments
[04:31] in fabric technologies.
[04:33] So the whole big thing back then was the one hop from the edge, from one point to another, which actually never took off and never became a thing, and then that sort of spawned into, well, what is the latest next-generation firewall, and then the virtual firewall, and everything shifting to AWS, and I sort of came to the end of that career again.
[04:53] It was another sort of four or five years of doing that and being quite successful building people, and there's a litany of people in the industries around that who I'm still friends with, that I hired into the industry.
[05:06] But I just got an opportunity with an ex boss who said there's this thing called the GDPR coming. This is around about 2016 and do you want to have a look at it?
[05:16] Because I think your skill set might be.
[05:19] Might be good for you. You might enjoy it. And, a little skeptical, I'll be honest, I'd never read what was then the Data Protection Act of 1998 in the UK; most legal texts bore me to death.
[05:31] I just was not enamored with, did not understand, that world. But when I started to see what that was, and what potential societal shift was coming because of the oversight of privacy with the EU GDPR,
[05:48] I thought wow.
[05:52] And there were two wows.
[05:52] The first one was if I become a consultant doing this, I'm going to understand how more companies work.
[05:58] I want to get underneath it and understand the foundational building blocks of an organization because data runs through everything,
[06:06] of course, you know, HR,
[06:08] finance,
[06:09] senior leadership,
[06:11] every aspect of a business,
[06:13] marketing, of course.
[06:16] And so I just was fascinated by it because I wanted to understand how things worked and it gave me the way in to be able to ask those questions that almost nobody gets to ask.
[06:30] But as a data protection consultant, I found I end up maybe learning more than most people within an organization, because I'm talking to the senior leaders and the heads of departments and people who understand how those departments function, and I see all of it, the whole organization, as one,
[06:51] whereas they only look after their own little section. They know that very, very well. I end up becoming very, very good at unpacking an organization within the first couple of days,
[07:02] bearing in mind it was just usually a three day gap analysis, which is what we,
[07:08] well, what the consultancy I was working for was selling,
[07:11] that was the in thing.
[07:13] And so there were a number of jobs in between, and you're talking what, almost,
[07:22] almost 10 years in the industry now, and I've gone from consultancy to in-house specialist data protection officer for a set of research charities, which I became very, very passionate about, and I'll definitely talk about that later.
[07:36] And I think that just stood me in good stead at the beginning of 2023 to apply for the job of DPO for the Labour Party, because they seemed to be hiring people that,
[07:51] whether this is something I should say about myself or not, people that seem to have a bit of heart, and they were kinder than people in corporate. It felt that way, especially when I went to see them and managed to get the role.
[08:04] But that's,
[08:06] I guess, in a nutshell, the last 25 years of my life.
[08:12] Happy to dig into any part of it.
[08:15] Debbie Reynolds: I love the fact that you've had such a diverse career and information and data and how you move that path into different career opportunities.
[08:27] I think it's fascinating too. I'm a data person. So for me, I thought especially the GDPR was very exciting for me because I thought, oh, I was going to wake up one day, everyone's going to care about data, right?
[08:39] And so when the GDPR first came out when it was first passed in 2016,
[08:46] I thought,
[08:47] wow, I'm gonna wake up tomorrow and everybody's gonna care about data in the U.S. like, there were no news articles about it or anything. And I'm like, this.
[08:55] This is gonna be like a huge thing, right? And I started talking about it. And so two years after that, in 2018, I was asked to be on television to talk about, you know, the GDPR and why it was important and why people in the US should care about it.
[09:10] And it's hilarious because people still ask me about that interview. I'm like, hey, this is a big deal. You all need to wake up and figure out what's happening.
[09:20] There was a question that you posed on LinkedIn that I would love for you to talk about. And I think this is just amazing.
[09:29] What if GDPR wasn't a compliance checkbox,
[09:34] but the spark for innovation in data governance?
[09:37] James Robson: Throwing it right back at me?
[09:42] I think it is.
[09:43] I absolutely believe it is,
[09:46] which is why I pose the question. It's a very leading question because,
[09:50] you know, how do you actually set up a new system where you're able to use data in a way that is useful?
[10:00] You have to go through the checklist of, well, who has access to it, why they have access to it. The basics,
[10:06] how are we telling people what we're doing with their data.
[10:09] So ticking all those kind of GDPR boxes.
[10:13] But then there's the breadth of knowledge that I have around the uses of data and the GDPR, leading me to go, well,
[10:22] what have you thought about this use case and maybe using it in a different way. Let's take the example of kind of children's social care data.
[10:32] So there isn't a system currently that I'm aware of in the UK, or there's one, in fact, but there isn't generally a system where a social worker is able to see what is happening with their family.
[10:46] Maybe somebody in the family's been arrested, so they need to contact the family and make sure the family and the children are okay.
[10:54] Maybe somebody's going to hospital,
[10:58] so those data sets aren't joined up.
[11:00] But then if you start to think around the problem, it's like, well, how do we get more data in front of the social worker so they don't have to go kind of knocking on doors to go,
[11:12] is my family okay? What's going on? Has this happened?
[11:16] Is there anything that I need to do?
[11:19] And a few years ago now, there was an organization called Insight Bristol with an incredible guy called Gary Davis. They set up something called the Think Family database.
[11:29] And what he did was map 35 data sets, public sector data sets that would feed into a Power BI dashboard that could be logged into for a social worker.
[11:41] And so if you start to go, well, that's incredible, because the only way that used to happen was something called MASH, Multi-Agency Safeguarding Hubs.
[11:52] So sharing information between the public services.
[11:56] But that would be maybe an online round table where the social worker is asking, one at a time,
[12:03] each person,
[12:04] any information on this family, this child,
[12:08] doesn't it just make sense to pull that together and have it all together? And that's a secure feed that is only pointed towards that social worker to be able to do their job.
[12:20] So you end up thinking around the problem.
[12:23] And I think the whole kind of innovation. Why is GDPR a tick box when it can be just innovation?
[12:31] Data is the key to a lot of things. And I really think that we're sort of in the data dark ages. We're not using data enough and things aren't joined up.
[12:42] And if you can work, if you have enough thoughts and enough creativity around what you could do with that data,
[12:48] then you can use the mechanisms like the GDPR to make it a reality.
[12:54] I remember a talk that the Information Commissioner in the UK gave a few years ago to a select committee in Parliament, and he said, you know,
[13:07] the GDPR, the Data Protection Act of 2018, is a manifesto of data sharing. It's the data-sharing manifesto. It's not a don't-do, it's a how-to.
[13:21] And that sort of really stuck with me, especially within the kind of work that I was doing before I was at Labour, because I was working for six research organizations concurrently.
[13:34] And I had the opportunity to build out a data archive within the Office for National Statistics for Children's Social Care. So that's where my kind of link to that database and awareness of that Think Family database comes from.
[13:48] And I had to do a lot of work to understand kind of that ecosystem of data sharing within research for future use, for secondary use.
[13:58] Do you need consent? How do you use it if you're not going to be able to get consent?
[14:04] All of those questions kind of pop up, but kind of looping back around to try and answer that question before I get far too passionate and go down a rabbit hole with you.
[14:13] I think it is the innovation. It does open the opportunity
[14:17] for the reuse of data: here's this type of data, what else can it be used for?
[14:26] And can it be of societal benefit? And I think that's the key to all of this.
[14:32] Debbie Reynolds: I agree with that. I agree.
[14:34] So it's funny that you say that, because I see people,
[14:38] especially a lot of business people, ranting on LinkedIn and different places about how GDPR killed innovation and how it's like, slowing them down and stuff. And I just kind of giggle, just like you are.
[14:50] And I'm like, oh, my gosh.
[14:52] I think of it like, let's say like a house that someone was hoarding a lot of information or a lot of stuff, right? So they do have a lot of stuff, but the stuff they have isn't really useful because they have too much of it.
[15:07] And so if you take that analogy, and if you sort of clean house and figure out what you actually need and what you actually are using,
[15:15] then you may have less stuff, but then
[15:19] you can figure out what the best uses for it are and you'll be better for it. So to me, all this data hoarding really is creating a lot of risk,
[15:28] and it's not creating a lot of business value. Because I've seen companies say, when they run out of space on the server, well, let's just buy another server, right?
[15:37] And so they're not really solving the problem. They're just
[15:40] making the problem snowball, where they're creating more risk for themselves. But then a lot of companies,
[15:46] first of all, they don't know what they have,
[15:48] and then they're probably not using what they have to the best of their ability. So being able to clean house or figure out why they have stuff and ask those questions, I think really does help them.
[16:00] But what do you think?
[16:02] James Robson: No, I completely agree. And as you're saying that, it reminds me of a conversation I had the other day of, well, what if an individual had control of their data and you only needed one copy of that data that can be shared, and so it's not duplicated, it's just accessed.
[16:19] And so think of the server space saving and the kind of the ecological benefit of only having one copy in one location that is accessed by multiple organizations and government departments and people as they need it.
[16:34] So you don't need that duplication. Because however much you can buy multiple vendor products for unstructured data discovery,
[16:47] structured data discovery, what have you got? What's the inventory?
[16:51] Well, there's still a huge market out there for that stuff. It means we're not solving the problem that it should have solved, so that we have an easy dashboard,
[16:59] we know where it is and what we're doing with it. But the truth is people are still using data in a way that makes it easy for them to do their job, rather than truly being concerned about whether or not there's accessibility to a folder that they happen to be using.
[17:17] And however much kind of GDPR beats the drum,
[17:21] there's still, I think, now a younger generation coming into the marketplace, the work marketplace, that
[17:31] hasn't had it beaten around the head for a number of years.
[17:34] And so they're just using it in any old way that just makes it easy. And so there's open access for those kinds of data, and the technology hasn't been implemented by organizations to make it built in.
[17:47] So you can't do that kind of stuff.
[17:49] Yes, you've got CISOs banging their head against the wall trying to implement this and get budget for things. And the age old,
[17:57] I guess joke-slash-argument, is there's always money for a Christmas do, but not enough money to secure the cyber operations of their organization each year.
[18:09] And so it's never seen as having enough return on investment to put enough protections and technical protections around the data, because it generally doesn't give a return. The only return is intangible: are you going to get fined, if you have a regulator that is willing to show its teeth?
[18:30] In some jurisdictions it does, in some it doesn't. So we're still in this sort of weird dark ages, even though we are almost a decade from the first iteration
[18:42] of the EU GDPR.
[18:44] It's very, very bizarre and we're nowhere near out of the woods.
[18:50] But the technology exists.
[18:52] It can happen.
[18:54] I was speaking to a gentleman that is very connected with the IEEE, and I forget what that stands for, but it's the technologist standards agency. I think it's based in Canada or the U.S.
[19:08] anyway, one of those. And he's working on a concept where the individual is able to dictate the technical terms of the use of their data by every organization.
[19:19] And so it would be a standard to be achieved almost like an ISO 27001 in an organization.
[19:26] And he called it MyTerms, and that makes sense.
[19:30] But I think it's also quite a high bar for lots of organizations to hit, when we still have organizations not necessarily even reaching the bar of Cyber Essentials.
[19:42] I'm not sure if you have that in the US or in other jurisdictions. But it's a self-assessment questionnaire, just to try and mitigate what, 90% of the cyber risks that could potentially happen.
[19:54] But it's still self-assessment.
[19:57] So it's a very, very bizarre phase, and I think it's going to get even more bizarre, and people are going to be more worried. And I'm worried that people are stopping being worried about the GDPR now, and stopping being worried about these laws, because organizations are not necessarily being fined or called out or getting enough reputational damage.
[20:20] When you're hearing of the daily cyber attack, which means all of us are losing our data on a day-to-day basis,
[20:29] possibly hourly now, if you know where to look
[20:32] in the cybersecurity magazines and things. So it's a very, very curious spot that we find ourselves in right now.
[20:39] Debbie Reynolds: So the organization you're talking about is IEEE,
[20:42] the Institute of Electrical and Electronics Engineers, IEEE.
[20:48] I actually am part of an industry connection group and we're actually trying to create or move towards a way to assess,
[20:56] have companies and also individuals assess the human centricity of data sharing.
[21:02] So yeah, it's a pretty interesting organization.
[21:05] I agree with you there. I think part of the problem,
[21:10] and I want your thoughts, because you've worked with a lot of organizations, and the point that you brought up around data duplication is really spot on, because a lot of the risks
[21:22] that companies have are because they have so much data and it is duplicated. You take care of one copy, there are 10 other ones somewhere else. And part of that I think is the way the software has been made traditionally.
[21:35] James Robson: Yeah.
[21:35] Debbie Reynolds: Where you take the same data and you duplicate it over. Marketing needs the same data, so they're going to put it in the marketing database, or HR uses data, so they're going to put it in the HR database, and that creates a lot of duplication.
[21:48] And I think the future,
[21:50] I hope the future will be a situation where instead of exchanging data,
[21:55] there'll be some way to broker an answer for data so that you don't have to transfer it. So like for example,
[22:05] let's say if you wanted to buy beer in a pub or something,
[22:08] the pub may not need to know your date of birth. They just need to know if you're of age to buy beer, right? So the fact you're of age to buy beer, that's not going to raise any alarms with someone.
[22:22] But it does if someone is saying, well, let me collect all this personal information about you, and then you don't know what happens to it after that. What do you think?
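To make that "broker an answer, not the data" idea concrete, here is a minimal sketch, assuming a hypothetical AttributeBroker service; the names and the 18-year threshold are illustrative, and a real scheme would more likely use signed or zero-knowledge attribute credentials rather than a trusted function. The point is simply that the verifier only ever receives a yes/no answer, never the date of birth.

```python
from datetime import date

# Hypothetical "answer broker": it holds the record, but only ever answers a
# narrow yes/no question, so the pub never receives the date of birth itself.
class AttributeBroker:
    def __init__(self, records: dict[str, date]):
        self._records = records  # user_id -> date of birth, kept private

    def is_of_age(self, user_id: str, minimum_age: int = 18) -> bool:
        dob = self._records[user_id]
        today = date.today()
        age = today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))
        return age >= minimum_age  # a boolean answer is all that leaves the broker


broker = AttributeBroker({"alice": date(1990, 5, 1)})
print(broker.is_of_age("alice"))  # True -- the verifier learns nothing else
```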
[22:32] James Robson: No, I agree. And if you take that one step further,
[22:35] maybe that pub will then say we're collating all of this and we're going to sell it to Meta or Grok or OpenAI or whatever organization,
[22:45] maybe a marketing broker.
[22:48] And so we make money off the data that we're collecting from you. And we won't necessarily tell you that we're doing it. We just think it's okay. I mean, you can lean into the cookie ecosystem quite quickly, and the pixel-tracking ecosystem right there, which if anything is sort of this strange, at-speed brokerage of who is going to have eyes on my product in the little box on the screen, because I know something about them from
[23:21] them clicking 12 times on some random website.
[23:24] So all of those things are very, very concerning.
[23:28] And my real kind of thing is, well,
[23:31] why don't we have the control over allowing people to do that or not? And there are a lot of kind of organizations out there that are trying to be the digital ID for an individual,
[23:43] have your own kind of data identity wallet, so you can control the on or off, or the yes or no, to consent or permission for organizations to have that data or not.
[23:56] And there's nothing ubiquitous, there's nothing kind of standardized yet.
[24:00] But I mean really there should be a world where every use of every individual's personal data should probably be logged on some sort of decentralized ledger technology,
[24:11] a blockchain-type technology recording who's used it, why they've used it, when they've used it and if they've tried to change it, but then they can't change it, because the nature of DLTs, decentralized ledger technologies, is that it's duplicated and it's immutable.
[24:27] So you'd be able to know where that break in the chain actually is when someone's trying to breach your data or use it in the wrong way.
[24:35] So it really should be that and it should be that for almost all information.
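As a rough illustration of the tamper-evident usage log being described here, the following is a small sketch assuming a single append-only hash chain rather than a full distributed ledger; the field names are illustrative, and a real DLT would replicate the chain across multiple parties.

```python
import hashlib
import json
import time

# Each entry records whose data was used, by whom, why and when, and is chained
# to the previous entry's hash, so rewriting any past entry breaks every later hash.
class UsageLedger:
    def __init__(self):
        self._entries = []

    def record(self, subject: str, accessor: str, purpose: str) -> dict:
        prev_hash = self._entries[-1]["hash"] if self._entries else "0" * 64
        entry = {
            "subject": subject,        # whose data
            "accessor": accessor,      # which organization used it
            "purpose": purpose,        # why it was used
            "timestamp": time.time(),  # when it was used
            "prev_hash": prev_hash,
        }
        entry["hash"] = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()
        ).hexdigest()
        self._entries.append(entry)
        return entry

    def verify(self) -> bool:
        # Recompute every hash; any edit to an earlier entry shows up here.
        prev = "0" * 64
        for e in self._entries:
            body = {k: v for k, v in e.items() if k != "hash"}
            expected = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()
            ).hexdigest()
            if e["prev_hash"] != prev or e["hash"] != expected:
                return False
            prev = e["hash"]
        return True


ledger = UsageLedger()
ledger.record("alice", "NHS Trust X", "direct care")
ledger.record("alice", "Research Org Y", "approved study")
print(ledger.verify())  # True until any past entry is altered
```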
[24:41] And then maybe you'd have a toggle on an app where you said, I would really like my information to be used for societal benefit purposes, and you'd have the categories of that. Everybody wants to,
[24:54] seems to want to, get their hands on the National Health Service database, the 70 years' worth of health records that is meant to be the largest health record data set ever,
[25:08] ever produced in the human race. Of course that has huge value.
[25:14] But then people don't necessarily know who is accessing the health records and for what purposes. And you find there are potentials where there's HES data.
[25:25] I think it's called Health something.
[25:27] I forget what the HES stands for. But they are allowing kind of de-identified, and I think possibly anonymized, data now to be used by pharmaceutical agencies to do research on the type of ailments that they feel their drugs are suited for, so they can do that research on the data that's being collected.
[25:49] So there are those sorts of data stores. But why don't we have control?
[25:54] We've got to tell people in the law and the GDPR of course what we do with that data.
[26:00] But then how accurate is that on a real-time,
[26:05] day-to-day, moment-by-moment basis?
[26:09] And so I find that hard. And if you read any of my privacy notices, they're pages long, because actually I want to work out exactly what is going on with every single piece of data and why, and be transparent with that.
[26:26] And I see so many of those notices that aren't transparent enough, in my view,
[26:32] but they are passable, if you like, to tick the GDPR box.
[26:38] But it still doesn't really do justice to the whole idea of knowing what's going on with your data if you need to find out and it's being used by some agency or organization.
[26:49] So yeah, again I'm back to the almost being a little bit skeptical that we're still at the very beginnings of being able to maintain privacy of personal data throughout all organizations at the moment.
[27:04] Debbie Reynolds: I agree with that. I think organizations are very good at collecting data. They're not good at kind of the end of life of data. So they're not good at giving data back.
[27:14] They're not good at deleting stuff because they probably had never had to do it before.
[27:18] And so it's just very difficult. And then a lot of the software and technology isn't made to.
[27:26] A lot of the technology and software is made to remember data, not to forget it. So the technology is almost counter to what a lot of these regulations are trying to actually achieve.
[27:38] But I want your thoughts. What's happening in the world right now that's concerning you most?
[27:45] Privacy, Data protection or technology?
[27:48] James Robson: Yeah, it's a great question, and I knew it was going to pop up, and I kind of put a bit of a list together, but it's very difficult to answer. I think one of the things that is concerning me, and it's not actually on my list, is the fact that people will eventually be cocooned into their own space far more than they are now, in the echo chambers that we find ourselves in.
[28:15] It's like you like something you watch something,
[28:18] you're going to be fed more of that,
[28:20] more of what you like and you end up, and this is now,
[28:24] you end up in that echo chamber of well, I like that, I'm possibly bored of that, but I don't know how to get out of this and see something else because algorithms are saying that James is a 45 year old guy that loves Star Trek and all I get is Star Trek feed.
[28:41] It's like, that's great, but what about Babylon 5? What about all the other great shows that go on? I'm never going to know about them.
[28:49] And my fear actually is now with this proliferation of AI,
[28:54] it's not just the echo chamber, it's absolute containment and then direction by the AI into exactly what you should think because it thinks this is the way you are going because that's how it's seen so many other people being recorded as going that way.
[29:13] Therefore you will go that way and you won't be able to get out of it.
[29:17] Right to the point where it becomes so many hallucinations on what the AI thinks is right for you, it's just feeding you kind of false information with something that's shiny that you may actually like but is complete bumf.
[29:33] Bumf as in it's total rubbish, nonsense information.
[29:37] And we're all stuck in those life cycles.
[29:42] And there's been reports of,
[29:43] what's the term now? AI psychosis,
[29:46] where people are now even relying on ChatGPT and the other LLM models to run their lives, and are reliant on it for everything.
[29:58] Yeah, you've got the memes on social media where you've got kind of a couple on a date talking to their LLM, talking to their AI, saying what should I say next?
[30:10] They've just said this, what should I say next?
[30:12] The AI says it to them and then they say what they should say next because this is what this person said to them.
[30:18] So you end up being completely controlled by the AI, because you are so reliant on vast amounts of information that you are having confidence in. And I'm going to say false confidence that it's going to be correct and right.
[30:32] And so there are going to be, or maybe there are going to be, lots of people that are just stuck in this loop where they don't really know what reality is, or how to interact with people or reality in the right way,
[30:49] but are still asking AI how to live,
[30:53] and we forget how to live.
[30:56] And it's very, very weird.
[30:58] There's that, which is no small potato, right?
[31:01] Debbie Reynolds: Yeah.
[31:05] James Robson: And the other one is kind of quantum, just breaking everything.
[31:09] So the on-off question of, is it a one, is it a zero? And it exists at the same time in the same space,
[31:17] but figures things out in a nanosecond compared to 12 days or a year of normal computing power.
[31:27] You end up breaking all encryption and having access to all the information,
[31:32] no matter where it is, no matter what organization,
[31:35] no matter what security parameters you have in place.
[31:39] Therefore, you can start to gain access to people's identities far more successfully than is happening now.
[31:47] Yes, there's the Dark Web. Have your Tor browser jump in and buy someone's credentials.
[31:54] Of course, that still goes on now,
[31:56] but it would just be on an exponential level, where there will be digital identities of you and me and everyone, interacting with everyone else, thinking they're real people,
[32:08] but then having their own digitized lives.
[32:12] And,
[32:13] man, I'm getting really dystopian: digitized lives running bank accounts and moving lots of money, forever smart enough to run those kinds of systems.
[32:25] So we've got to find a way to not rely on the traditional systems. And there has to be this monumental generational technological shift away from where we are.
[32:38] Well, it's going to be the next 20, 30, 50 years because it's going so quickly.
[32:45] And,
[32:46] yeah, hey, I'm going to stop there because I don't want to talk myself into some sort of,
[32:50] you know, Matrix, Terminator,
[32:53] you know, sort of dystopia.
[32:57] Debbie Reynolds: Well, I, I agree with you. I agree with you wholeheartedly. I, you know, you have to go there because you have to think about what could be, like, the most harmful things.
[33:05] But I've always thought,
[33:08] you know, for a long time, some people still think that everyone sees the same thing on the Internet, and we don't. Right. Because the algorithms are showing us different things.
[33:18] And a lot of the polarization I feel that we see in life now is because your algorithm is telling you something. It's telling me something different. And what these companies are doing, they're trying to get information, they're trying to sell a product.
[33:35] Right. So not necessarily give you the facts or the things you need to know. It tells you what you want to know.
[33:42] And the analogy I've always said is that people think the Internet is like a library where it has all this information and knowledge and you can go to different sections, but it really is a library that's built just for you, and it's only showing you things that they think you want to see.
[34:00] But I think that with AI, what's going to happen is that, and this is what you alluded to, and I think it's totally true,
[34:07] you are not going to be able to get out of that part of the library. So you're not going to be able to see what's in the other part of the library because it's going to keep you in that area.
[34:18] And so that's what I'm very, very concerned about because then it's like you have so much tunnel vision but you can't get out of that.
[34:26] So, yeah,
[34:28] I think that's so true.
[34:30] James Robson: I think, unfortunately, there are threat actors out there that are designing that pathway for us in a geopolitical, systematic destabilization of the Westernized world.
[34:45] I genuinely think that is going on. Because why point it that way? Because there's the other way of thinking: there are so many amazing things that can be done with those kinds of information.
[34:58] So let me give you an example. I read the book Careless People that was written about Meta and the beginnings of Facebook and how it ran itself. And one of the standout paragraphs in there is about how the algorithm was designed to recognize when a girl who was feeling low and sad was then fed adverts for beauty products because of being in that low mood.
[35:29] What's the flip side to that?
[35:31] Like they're in a low mood.
[35:33] Feed them something positive that is actually going to improve their life rather than try and push something on them to sell something to them because of their emotional state.
[35:47] So it's trying to flip that narrative from that total dystopian craziness that I was sort of talking about earlier to how do you actually make it better?
[35:59] It's easy to be contentious.
[36:01] It's so easy. Contention is created.
[36:04] In fact,
[36:05] one of the things I dislike about LinkedIn, and I know that's how we know each other, and it's a bit of an aside, is how our industry tries to pull apart other people in our industry by saying, oh, there's something wrong in here, you've done that wrong.
[36:20] I'm going to get attention by saying you've done it wrong and call you out on the Internet and that. I just really dislike that within our industry.
[36:30] So that's why my posts are more about, look at this amazing thing over here,
[36:34] look at what this research organization has done with data to try and improve people's lives,
[36:40] to try and make something better than we have. Rather than just point something out that is wrong, let's actually do something better.
[36:53] Debbie Reynolds: I agree with you. I agree with you. Well, that's one of the reasons that attracted me to you and your work. And I've been following you for a long time. You always pose really interesting questions.
[37:03] You always have very thought-provoking things. And I think you're one of those people who like to share your information, your knowledge. And so I really appreciate that.
[37:14] I want to talk with you a little bit about data sharing for research. This is a very interesting area and I think you touched on it a little bit earlier.
[37:21] And I actually had someone on the show who had talked about this topic a bit. He's in the cybersecurity area. But he was saying, personally, that at one point he had had cancer; he was in remission.
[37:34] But even though he may not have wanted his personal details shared,
[37:40] he felt like if a researcher or a drug company could learn something from his health issue,
[37:48] he would not mind sharing that information.
[37:51] So to me, I find that fascinating in the research area. And so having organizations really understand,
[37:59] you know, how can I respect someone's wishes and how can I protect their information,
[38:05] but then how can I also leverage the information that I have to do something good for other people?
[38:12] What are your thoughts?
[38:12] James Robson: Yeah,
[38:14] yeah, absolutely. I think the first time we interacted I was speaking on a panel about privacy-enhancing technologies in London. I think it was at PrivSec,
[38:24] maybe five, six years ago or so.
[38:27] Debbie Reynolds: Yeah, yeah.
[38:28] James Robson: Afterwards, you very kindly pinged me a note saying great talk, really enjoyed that, and we'd not met, so thank you so much.
[38:35] I always meant to say thank you.
[38:38] Yeah, absolutely. So I worked for something called the Evidence Quarter, which unfortunately doesn't exist anymore. But it was a co-working office for the research houses within another set of organizations called the What Works Network.
[38:56] So What Works for Children's Social Care,
[38:59] What Works for Higher Education,
[39:01] What Works for Homelessness.
[39:04] They don't all have those names, the What Works name. They all have kind of really amazing different names, like the Centre for Homelessness Impact, being directly funded from a government department like the Ministry of Housing, Communities and Local Government, MHCLG.
[39:20] But it's the concept of well,
[39:23] what would I be happy sharing and how does that work?
[39:28] And we've got something incredible in the UK: it's the Office for National Statistics, and it's used for the large government data sets, to be able to build policy and create policy ideas for societal benefits and societal changes from the data sets that it ingests.
[39:49] So everything from the NHS to education data is in there.
[39:58] You've got the benefit system in there, you've got, I think, weather systems in there. But what it doesn't have is all of the research that is going on throughout the country.
[40:12] It doesn't just automatically go to this one centralized place that everyone knows about. So when I was working for the What Works Network and the Evidence Quarter as a DPO, I had the opportunity to set up a data archive for What Works for Children's Social Care.
[40:26] And that was a labor of love, where I managed to have the ability to match the children's social care research data that we were collecting from schools and local authorities, asking questions like, does having a social worker in a school benefit the kids, maybe for children who have special educational needs,
[40:53] so SEN, and ask that question and be able to collect the data and talk to social workers and talk to the schools and talk to academics, and then get the academic data from the National Pupil Database and match it to maybe health records and also social care records, to go,
[41:13] well, having a social care worker,
[41:15] was that helpful to their academic achievement after two or three years?
[41:20] What is the answer?
[41:22] But unfortunately, those data sets that are being collected aren't automatically then matched to this centralized database in the ONS, and like I said, there are lots more of them. And so if you go, well,
[41:35] what if you have a homeless person and you can find out what their educational background was and does that have a correlation to them becoming homeless?
[41:46] And maybe in the benefit system,
[41:49] how much were they getting? And does that have a correlation to them becoming homeless?
[41:56] And wouldn't we want to know that information?
[41:59] And maybe somebody that is no longer homeless and was homeless,
[42:04] wouldn't they like to be able to use that background of information to go, well, let's help somebody else not be homeless and not become homeless because of all the markers and the pointers and the dots of information that you can put together to go,
[42:22] there are markers here for this individual in this state, in this way,
[42:26] they may become homeless.
[42:28] What services can we put in front of them to say,
[42:33] actually, go this route? It's almost the opposite of the "let's give beauty products to a young girl feeling bad." It's,
[42:43] you know, let's point them in a direction online that will actually make them feel good.
[42:49] A motivational speech might pop up or a thought provoking book recommendation that might be useful, that might have been useful for the person that's no longer homeless.
[43:01] And we all have these lives that we have huge amounts of information that we hold, maybe traumatic experiences,
[43:10] difficult times that we've all gone through that we would not want to wish on our worst enemy. How is that information helpful for those people also experiencing those kinds of things?
[43:24] Low mood depression,
[43:26] any sort of psychosis? How did you get there?
[43:29] And what would you have liked to have had happen around you,
[43:34] but to knock you off that path of actually going down that route?
[43:38] And so what is that? And that takes research. It's massive amounts of resource and data to be collected from courageous individuals that are willing to give their time and knowledge, and then centralizing that in a way where it can be accessed in a privacy-preserving way.
[43:59] You don't need to know who that individual is, where all those dots relate to. And it can be given back to you in a system that might need to be curated.
[44:09] Maybe it's a trusted research environment that has a secure access login that may even need to be a secure pod that is not connected to the Internet, it's only connected to the database, so it can't be leaked anywhere else.
[44:24] And I think you can even create it in virtual environments, where you can access it from home and you wouldn't necessarily need to go somewhere else, if you have some sort of proctoring system watching you and your keystrokes and what you're doing with that data and logging it.
[44:44] So you are doing the right thing.
[44:46] And you've also met the credentials of being a safe person to access that data because of who you are and what you're doing.
[44:55] I've relied a lot on the Office for National Statistics, what they call the Five Safes mechanism.
[45:02] So you have safe data,
[45:05] so it's useful and it's the right data that you're getting. It's a safe setting.
[45:10] So how you're accessing it,
[45:13] is it a safe setting you're accessing it from? Or safe technical parameters that you're under? You're a safe person.
[45:19] You've basically been vetted. You are from an organization that has a reputation for doing good and doing good research with that data,
[45:28] and you have safe outcomes, safe outputs,
[45:31] which is before it's even released as your paper. And the uses of that data, there's another pair of eyes looking at what you've done to go, okay,
[45:43] maybe you're not doing this for societal benefit purposes. You're doing it for a quick buck here and there. We're not going to release this, we're not going to allow you to release it.
[45:53] So you have this kind of still sort of human oversight which probably can also be curated through some sort of AI structure. Believe it or not, I'm sure it can be developed.
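As a rough sketch of how the Five Safes might sit in front of a trusted research environment, here is a small, hypothetical gate in code; the field names are assumptions, the "safe project" check is the framework's usual fifth element alongside the four mentioned here, and a real process would of course involve human reviewers rather than booleans.

```python
from dataclasses import dataclass

# Hypothetical gate in front of a trusted research environment: every "safe"
# has to be signed off before access, or release of outputs, is approved.
@dataclass
class AccessRequest:
    safe_data: bool     # de-identified and fit for the stated purpose
    safe_setting: bool  # secure environment with no route to export raw data
    safe_person: bool   # researcher vetted and accredited
    safe_project: bool  # purpose approved as being of public benefit
    safe_outputs: bool  # results checked before release (disclosure control)


def approve(request: AccessRequest) -> bool:
    checks = {
        "safe data": request.safe_data,
        "safe setting": request.safe_setting,
        "safe person": request.safe_person,
        "safe project": request.safe_project,
        "safe outputs": request.safe_outputs,
    }
    failed = [name for name, passed in checks.items() if not passed]
    if failed:
        print("Refused, not satisfied:", ", ".join(failed))
        return False
    return True


print(approve(AccessRequest(True, True, True, True, False)))  # Refused: safe outputs
```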
[46:03] But that's where my passion has really gone around.
[46:07] What is the huge, valuable, positive data to really have benefit for our lives?
[46:15] I mean,
[46:16] you know, you can extrapolate outwards and you'll find it. Start thinking about it, you'll find it.
[46:21] What data do you need?
[46:22] It's that age old joke of, oh, there's an app for that. It's like, well, no, whatever you want to do to improve humanity, there's data for that.
[46:33] Go find it, go use it, go grab it, but do it in the right way.
[46:37] The problem that we really have is people haven't necessarily worked out how to monetize it so they can live from doing that great stuff from their data.
[46:47] There's always a reliance on a benefactor that is maybe putting money into a system, or a government grant, maybe a UKRI or ESRC sort of process, where they've had to bid for that money as well.
[47:04] So we need to get really smart around this.
[47:08] Yeah,
[47:09] and I love that.
[47:12] Debbie Reynolds: That's a wise, smart way. And I love that you're thinking in that way.
[47:16] But if it were the world according to you, James, and we did everything you said, what would be your wish for privacy or data protection anywhere in the world, whether that be human behavior, technology or regulation?
[47:29] James Robson: Wow,
[47:31] what a question.
[47:32] Anywhere in the world.
[47:34] You know, you asked the small questions, Debbie, you know,
[47:38] just the little ones, the easy to answer questions. I really want to thank you for that. I really want to thank you for giving me such an easy question and anything.
[47:48] This is really tough. It's really tough.
[47:51] Debbie Reynolds: It's a hard question.
[47:53] James Robson: Yeah, it is.
[47:55] Debbie Reynolds: But you can pick more than one. Sometimes people have more than one or I mean, it's just mind blowing. It's mind blowing. It is.
[48:03] James Robson: And you know, it gets my mind racing with the amount that you could do if you had better oversight and regulation around it. And everyone points to,
[48:14] well, if you had the regulation, then it would work.
[48:16] Well, we've got a regulation that's pretty good and pretty robust,
[48:22] but it doesn't work.
[48:24] And so there has to be another way.
[48:27] And I think it has to be technical logging of all uses of personal data, and a curation of that data: whether or not it's for benefit, and who it's benefiting.
[48:42] And I think there has to be an oversight and a logging mechanism of this.
[48:46] Now I could easily go down a path of like, well, that's suddenly dystopian, who's watching the watcher,
[48:54] who is going to be doing the oversight.
[48:57] How do you trust the technology that might be able to do that?
[49:01] But I think it ends up having to be that way, where there are incredibly trusted certification mechanisms that are technical regulation.
[49:18] I think this is it: it's well-respected technical oversight that stands up to any public scrutiny.
[49:28] So you know that it's delivering what it says it will deliver.
[49:33] And if any organization is signed up to it,
[49:37] then you know that they are adhering to exactly what that standard is.
[49:44] And it has to come down to trust.
[49:47] How much trust do you need in an organization for the organization to exist?
[49:52] Because if we end up in a world, which I think we're going to get to at some point, where we're not going to use money anymore,
[50:00] we're going to use our data as currency and the uses of that data and different types of uses of that data having different levels of meaning and currency for us, and values for us,
[50:12] then you need to trust those systems that are holding that currency,
[50:17] holding you.
[50:19] Because where are you, if not echoes within technical systems that go on forever, unless we can work out how to do it properly?
[50:34] And I think it has to come down to you need to trust an organization far more than you've ever trusted it,
[50:42] that it will do the right thing.
[50:46] It's almost, a number of years ago, we'd always look at, like, the BBC, the British Broadcasting Corporation.
[50:52] It is the best.
[50:54] It is amazing, the branding, the proliferation of that knowledge about how good the BBC is,
[51:02] the quality of production, the quality of programming, the quality of host.
[51:08] You know, if it's got BBC stamped on it, and I'm talking maybe 20 years ago and further back, you know, if it's got BBC stamped on it,
[51:16] it's top quality.
[51:18] Nothing got past the BBC if it wasn't the best in the world.
[51:24] Children's television,
[51:25] factual documentaries,
[51:27] Panorama,
[51:28] all these kinds of TV shows that proliferated, and there still are TV shows that are copied all over the world and are syndicated.
[51:37] Where is that in privacy?
[51:39] Who do you point to and go,
[51:41] that's the BBC of yesteryear.
[51:44] Yes,
[51:45] that is it. I trust that I know that I am putting all of my currency and all of myself through that because I know I can trust it.
[51:55] And Maybe it comes down to, can you trust that person?
[51:59] And why do you trust that person?
[52:01] Because you can only trust the person that has set up the technology.
[52:06] And maybe that's where we need to get to. Who's that person who is going to be brave enough to put their head on the block and go, I'm going to go out there and be trustworthy and take all the flak that I'm going to take for being trustworthy.
[52:20] And I'm almost getting biblical, which is very weird. I don't mean to,
[52:25] but where is that? It's almost going to go full circle.
[52:29] You have to have that of something. I think that's it. I think that's my answer. Hopefully that's answered your question.
[52:35] Debbie Reynolds: That's a good answer. That's a good answer. It's such an expansive space. You can go anywhere with it. But, yeah, I think.
[52:43] I don't know. I mean, I think there are going to have to be a multitude of ways to solve this problem. It may not be one way. That's definitely one angle.
[52:51] My view, I would love to see,
[52:53] definitely minimization, but architectures that prevent data from being used in a technological sense. Right. So it's almost like you get a puzzle and there's only one way to put the piece in the puzzle.
[53:09] I almost see data in the future like that, where it's just certain things.
[53:14] Even if you have it, you can't use it if it wasn't the right use. So, I don't know.
[53:19] This is fun. This is fun to think through all these things with you. Well,
[53:24] well, thank you so much, James. It's been a pleasure to have you on the show, and I hope that we have a chance in the future to collaborate.
[53:32] James Robson: I hope so, too. Pleasure's all mine. Thank you so much, Debbie.
[53:36] Debbie Reynolds: Thank you. And we'll talk soon.
[53:38] James Robson: Take care.