E7 – Rohan Light of Decisiv


Data Diva Rohan Light

37 minutes

SUMMARY KEYWORDS

data, people, blockchain, metaphor, ethics, Rohan, world, question, conversation, bias, LinkedIn, privacy, professionals, human, ai, business, identify, quantum computing, problem, create

SPEAKERS

Debbie Reynolds, Rohan Light

 

Debbie Reynolds  00:08

Hello, my name is Debbie Reynolds. This is "The Data Diva" Talks Privacy Podcast, where we discuss Data Privacy issues around the world, with industry leaders sharing information that businesses need to know now. Today, my special guest is Rohan Light, the CEO of Decisiv AI. He is an AI auditor, a data ethicist, and the creator of a humane data framework. He works with people to analyze code and works on things like autonomous systems, data ethics, bias, privacy, trust, and Cybersecurity. Welcome, Rohan Light. You post a lot about AI governance and strategy, almost any angle related to Data, so I'd love for you to talk about what you do.

 

Rohan Light  01:01

Thank you. Yes, Rohan Light from New Zealand. It's early in the morning here, but there's an advantage to that: we can reach quite a large part of the world from here. And I think that's the first answer to your question. Because we are at the bottom of the world, an island far from everywhere, we are in many ways the most international of people; we have the furthest to go to reach the major centers, wherever they are. I think that helps New Zealand. It gives us a chance to look at what's happening with the rest of the world and figure out how it all connects together without being part of the big political conversations in the field. So that's the first thing. The second thing is that I'm a generalist. I've deliberately gone for breadth rather than depth in the data world, simply because it's so colossal. And the third is that my original professional base is risk. In risk, you need to be able to create a coherent frame for many things, most of which carry high levels of uncertainty, simply because we just don't know what's going to bite us. Those three things are largely why I manage to put a bunch of stuff out there.

 

Debbie Reynolds  02:40

You're really good. Every time I turn around, you have something new, and it's always interesting. It's never surface-level, in my opinion; it's very targeted to specific things. I feel like you're definitely elevating the conversation about data and all the different ways data can impact people. It's very thought-provoking.

 

Rohan Light  03:02

You're too kind. I think we are clearly at an inflection point in societal terms, political terms, and economic terms, and like it or not, data professionals are at the crossroads of much of that. It is our professional role going forward to ask the next question: not the first question or the easy question, but the deeper question. And we have to do it in a way that maintains inclusion. We can't frame a question in a way that takes people out of the conversation; we have to frame things in a way that keeps them in. What that means is we can't shy away from the tricky stuff. Now, I do have a secret advantage: I have worked in publishing, specifically in a form of publishing called pad works, small booklets of about 30 pages that get pushed out; they sit halfway between a big treatment and a small one. One of the elements involved is finding strong quotes. When we look at the range and breadth of professionals venturing into data governance, we can invariably find in their work one or two really strong paragraphs, really strong sentences. This is where the author has touched on something that resonates and is probably, in some way, shape, or form, slightly dramatic or crazy, or will make some people annoyed or angry, you know, about Facebook et cetera. Part of our job is to take the anger out of these conversations.

 

Debbie Reynolds  05:05

Right, exactly. I used to toy with microblogging, and it forced me to shorten what I wanted to say, to distill it down to its essence. I think that's a hard skill to learn, but I did it for a lot of years, so I try to make things as simple as possible when I'm talking to people. That's a good segue into metaphors. I know you like to say we need better metaphors. Explain that; I would love to hear your thoughts on it.

 

Rohan Light  05:46

Yeah, sure. Metaphor is crucial. Metaphor, and its equally attractive cousin, analogy, are crucial to our ability to identify what is actually going on. Since 2007, 2008, we have seen people figuring out that the data economy was something unique and different and seeking to utilize storytelling techniques, one of which is metaphor. We've all heard that data is the new oil, which is a common one, right? But when we really go into that and look at the quite destructive nature of oil extraction, the pollutants, the way it locks us into nonsustainable ways of life, we go: well, actually, we don't want data to do that. Another one we hear is to be data-driven. Well, let's tease that out: do we mean driven like a chauffeur drives, or driven like a herd of cattle? Which would you like? The metaphor I like the most, though, is water for data. Humans will die after four to eight days without water, so we can now see that many of the things we want to create for the world will die unless we feed them really good water, aka data. We see this in the AI field. Many people now know that if you want effective AI, you have to have good data, because that's what the thing is trained on, which is, of course, another metaphor. So part of our job, I believe, is to identify these metaphors, where necessary tease out the implications behind them, and where necessary not use them.

 

Debbie Reynolds  08:00

I totally agree. I feel like we have to make this simple; we have to be able to resonate and communicate this to people at all levels. For me, I feel like I have to be able to talk to someone who's 80 or someone who's eight years old, so I try to make it as simple and plain as possible, and metaphors do help in that regard. My UK and EU friends don't like the data-is-the-new-oil metaphor because they feel it dehumanizes their right to privacy, reducing it to a commodity, even though it sort of is one. I have a twist on that: I say insight is the new oil. Data is useless unless you can get insights from it, and the people who benefit the most from data are the ones who can gain insight from it.

 

Rohan Light  09:00

Yeah, you've touched on a really interesting point there: the different overall approaches to governance. These are generalizations and possibly stereotypes, but for the purposes of conversation, the EU is citizen-centric, the US is business-centric, and China is state-centric, right? Part of our job as professionals is to work to harmonize those regimes. The thing about data is that there are always at least two sides to it: data is both something and about something. The instrumental approaches look at data as something, i.e., the commoditization. The about something will, in some way, shape, or form, be attached to a human. In the case of Facebook, obviously the people are the commodity; it's very personal. In other situations, though, like a data feed coming in off an Internet of Things device, the data might not be about people. But it certainly tells us a lot about the people who built the device in the first place; it tells us a lot about the things they thought were important. So no matter what the source of the data, we can always find a human attached to it. And as we saw with COVID, especially in those first few months, journals were having to retract some work simply because they hadn't done enough vetting on it. At the core of that was the realization that people were making too broad a statement about the about something, the human side of data. There's no easy way through this. We see it in Privacy Shield, how tricky it has been to maintain that transatlantic movement of data, and we are clearly having to go to the next level in terms of our nation-grade instruments.

 

Debbie Reynolds  11:17

I think there are tectonic shifts in how governments are using data and how they're trying to make these agreements among each other. There's got to be friction, because the laws aren't as congruent as you would think they'd be. On an international level, I would love to see some consensus on things like, maybe hacking is bad; maybe we can all agree on that, and on certain things like it, so there isn't as much friction between countries as they try to come to these agreements. Privacy Shield is all over the place, and these transatlantic arrangements have been for over 20 years. I would love to talk with you about the E word, which is ethics. Ethics is almost a dirty word for some people, especially some people in privacy who are so fixated on the law. I think ethics has to be in there; ethics bridges the gap, and that's what's missing, I think. This is one of my favorite topics, so I would love to hear you talk about it.

 

Rohan Light  12:43

Well, even the most left-of-left anarchists have an ethical code; everyone does. So it's not that people are ethicless; it's that our codes of ethics, at the individual, community, regional, and national levels, are not synchronized. And this comes back to the importance of language and metaphor, because it's about connecting groups and finding the things they agree on. We're always going to find things we disagree on; that's the nature of humanity. Another part of the problem with ethics is its universality, its abstractness: we can all agree to be ethical, but we will all disagree on what the particular elements of ethics are. And this is an important step, as it turns out, because for an ethical code to be applicable, to be applied ethics, we have to get really close to the actual decision problem and identify what the dilemmas are, which will be conflicts between ethical principles, and we have to determine whether or not they are false dilemmas. Then there's the role of law; as you know, law operationalizes ethics. Look back at the history of data governance: from the mid-70s, we had the FIPP (Fair Information Practice Principles) based regimes coming out of the States. Then, in the late 2000s and early 2010s, we had the technocratic response: oh, we can design for this. And now we have the current wave of data and AI ethics coming on. Each time, we are sharpening up the questions, we are communicating those questions, and as a result, over time, we will influence the drafting of the law. It has to be that way; it can't be the other way around. It can't be: here's some law, you must now believe in this because it's the law. That's called authoritarianism and totalitarianism, and thank you very much, but New Zealand does not care one piece for that. So with ethics, the trick is: get out of the clouds, get close to a dilemma, identify the horns of the dilemma, and then think your way through the problem.

 

Debbie Reynolds  15:40

Right. I am very happy to see that there are groups coming together to create frameworks that will eventually inform people as they create laws. Unfortunately, harm comes before the law; typically, something bad has to happen before something actually becomes a law, which is unfortunate. What would you love for businesses to be thinking about regarding data? Like we were talking about, being data-driven, or figuring out how to use data: what are some things they need to be concerned about as they take this journey?

 

Rohan Light  16:24

Yeah, okay. First, they can safely ignore many of the concepts and precepts that came out of the first wave of Big Data. A lot of that was BS; it was marketing speak. The second thing I would say is: remember, data is a measure of human activity, either your customers' activity or your own. Either way, you already know it; all this data work is just showing people different ways to think about what they do. And here's the third one: information hides on the other side of order. What that means is, whenever we're presented with a data description of what we do, we are pushed to ask questions, and once we start asking questions, we can find the information hidden on the other side of order. So it actually comes back to our ability to form questions about the world and about what we do. That is at the heart of entrepreneurialism, because the root of the word is to move risk around: you move risk around your business, and you do that because you have, hopefully, answered a whole bunch of questions about what you're doing. Now, to bring that down one step, the thing to ask yourself is: has your business and industry been disrupted? The happy answer is yes, because everyone's has been. So for a business, this is your opportunity to look at yourself and think about the world you want to create for the people following after you. That's about applying deeper questions to what you think you already know.

 

Debbie Reynolds  18:31

Yeah, excellent. I love that, and I agree with you about the Big Data push, where people were using business intelligence and gathering all this information, and then, at the end of the day, they couldn't really answer the question they had. You're right: you have to start with the question first, before you start gathering data. Otherwise, you just have a bunch of junk that doesn't really help you. I think that's really important.

 

Rohan Light  19:03

This is what the social activists are correctly upset about as they look at the data assets, or data resources, that organizations have. They go: well, to what extent is this stuff inherently racist? To what extent has this data been prepared by white people while the data subjects are all black people? If you run your big data machine over that, you are simply going to reinforce that particular paradigm. Which comes back to: if you're asking different questions, and you are genuine about answering those questions, I guarantee you will create different data.

 

Debbie Reynolds  19:55

Absolutely. I would love to segue into bias. This is something you post about a lot, and you always have thought-provoking things to say about it. Obviously, this is something that concerns me personally as someone working in technology. Give me your thoughts about bias in AI and algorithms.

 

Rohan Light  20:15

It seems now that the majority of observers recognize that the natural biases of humans move into data, and from data into the systems we build. The important thing to recognize is that all humans are unique based on their point of view, their perspective: where they stand and how they look at the world. Now, that is a bunch of biases. Because of our limited range of perception, we're crowding a whole bunch of things out; we're choosing to privilege certain things over others. So it actually comes down to self-awareness, and the degree to which you are willing to say: yeah, actually, the majority of our data is racist, so golly gosh, we'd better do something about it. Or you go: no, no, it's not racist at all; it was X, Y, and Z controls over 20 years. Both of those positions may be correct. So the first point is self-awareness: what biases do we carry? The second point is, I've seen long lists of biases and their cousins, the fallacies, and some are universal. Confirmation bias is always at the top of my list: where we unconsciously, or in some cases consciously, grab the data that confirms our view, because at a very deep level, we don't like to be disconfirmed; we don't like to have our views challenged. That's the big one. When I look at ungoverned and unmitigated Big Data machines, they're just gigantic confirmation bias creators. That's what's driving the algorithmic accountability conversation; that's why we want to have a look at the code. We basically want to see to what extent you have reinforced your worldview, because if it's wrong, you're in trouble. And if it's wrong and you are following a conventional business model of data governance, you have made it someone else's problem; they're going to pay the price. So all of a sudden, it comes back to: what does it mean to be human, and what does it mean to be human with each other?

 

Debbie Reynolds  23:05

Right. You do a lot of work looking at AI code for bias and other things that might taint the data in some way. I feel like a lot of this is: someone has ideas, someone thinks they know the answer to the question, and then they build a system that's going to give them the answer they think is right. What are your thoughts about that?

 

Rohan Light  23:29

We can grow and develop our capacity for being contradicted. However, all of us will stop at a certain point and go: now I've had enough of that; that's just crazy. And when we do that, what we're actually showing ourselves is the extent to which we can grow our perspective. Remember, data fundamentally feeds science, which is the elimination of answers to questions until only a few answers remain. So, to come back to your question, the thing I would say is to establish three thresholds. The first is the safe change threshold: the change we're comfortable talking about, because we've probably been talking about it for a couple of years in the context of digital transformation. That is the safe stuff. The next is the stuff that's a bit flaky and a bit weird. Often these ideas are coming through as a result of COVID, not because they are new ideas, but because we haven't seen them before, and grappling with those ideas is our current challenge. Once we break through that middle threshold, we are then able to really look at what's coming: the big stuff. If you're interested in the big stuff, you have to, for instance, form a provisional view on quantum computing. And if you get into quantum computing, let's assume you need some degree of understanding of quantum mechanics, because we want to know how this thing works. Once you get into quantum mechanics, you really see that third threshold: the absolutely howling-at-the-moon, bat-crazy threshold of ideas that we might actually have to move on. So it's these grades of managing our own reaction to being disconfirmed.

 

Debbie Reynolds  25:42

I love it, and quantum computing is the next question; that is the next thing. I think it's going to have a huge impact on encryption. Right now, I think of encryption like a fire safe, which is only rated to protect its contents for a certain number of hours in a fire. Some encryption is safe only because it takes a long time to break, and quantum will break through that layer, so things you thought would take hundreds of years to decrypt will be decrypted in a matter of hours. I'd love to get your thoughts: what is happening now that concerns you most in the world as it relates to data?

 

Rohan Light  26:35

Well, it's a macrocosm-microcosm thing. Globally, it looks fairly obvious to me that we've lost the ability of the center to mediate the political and social extremes, and that has now become a social issue. That's the first point, and that's a gap I hope data professionals can fill: we have a role to fill out that middle ground and mediate different views of the world. The second thing is that there is a heck of a lot of provisional change created by the growth of data, and a large part of it comes down to: "oh damn, do I really have to learn again?" And the answer is yes. One of the things really good data science does is identify the ways we've been BS-ing ourselves, and the only way to get past that is to learn your way through it. So I look at those two things: one, we need to create a mediating center within politics and society, and therefore data; and two, we need to help people reengage with learning.

 

Debbie Reynolds  28:00

I love that. That's so true, especially in privacy, where you're working with people from all over the world, from different legal regimes and different cultures, and you have to be really sensitive to that. I feel like people who are really good at privacy are great at being that center person, that mediator who bridges the gap. But there's so much needed in terms of education, because these are human problems. You can't just call it a data processor, data controller, data subject, or consumer problem; it's a human problem that impacts all of us if we don't get this right. If it were the world according to Rohan and people had to listen to everything you say, what would be your biggest wish related to privacy? If you could wave a magic wand, what would you want?

 

Rohan Light  29:00

My magic wand would have privacy professionals waking up and recognizing two things. First, recognizing the fundamental weaknesses in FIPP (Fair Information Practice Principles) based approaches. And second, in this fantasy world, in their dreams, they realize: oh, this is how I can complement and supplement my FIPPs-based toolset. How about that?

 

Debbie Reynolds  29:24

That's excellent. I'd love your thoughts on biometrics. To me, it's such a huge thing; the technology is so far ahead of where our heads are. I feel like the data has sort of run away from us there.

 

Rohan Light  29:56

If we had secure digital wallets, if we had safe control over our data selves, I'd feel a lot more comfortable. But we don't have that, and therefore the likelihood of exploitation is high. The scientific community doing genomic research is, in many ways, setting the scene for how we govern biometrics. On the other side, many, if not most, jurisdictions have little or no regulation in this space. There's nothing stopping an idiot from creating an app, spending a bunch of money on a really cool marketing crew, and voila, you have things like affective computing: oh my goodness, the computer looking through the camera on my laptop says I'm happy today. That is utter tripe, and yet you find it entertained at quite high levels. And we go: oh my Lord, come on, mate. This is astrology, not astronomy. This is just making stuff up; stop buying it. At the moment, yes, it is unbalanced. I really like the work of the Nuffield Council on Bioethics in the UK; their work is so, so good. And this is why I like blockchain, why I like distributed ledger technology: it seems to be the way forward to enabling people some degree of control over the biometric self.

 

Debbie Reynolds  31:53

I think blockchain has gotten a bad rap; it's an interesting technology, and people haven't figured out the best ways to implement it yet. But I'm seeing a lot of interesting applications of blockchain in the identity space, so I feel like that's going to get a lot of traction.

 

Rohan Light  32:13

Yeah, sure. The first blockchain wave, in 2015 and 2016, attracted the magic bullet crowd: oh my God, here's a magic bullet. After a while, when the magic bullet crowd encountered the really hard technical elements, they fell off and went to the next thing. What we're seeing now is a second wave of blockchain and distributed ledger, and yes, those use cases are really starting to sharpen up. The data collaboratives, data institutions, and data trusts work coming out of GovLab in the US and the ODI in the UK are really good indicators of how far we have come with the trust-based governance structures that we wrap blockchain technologies around. And in that sense, that's good, because we want to see this stuff blueprinted before we build it. Blockchain is much more than Bitcoin; it's much more than FinTech. It goes deep into supply chains; it goes deep into personal control. And my goodness, if we can take just one step forward for many people around personal control of data, that, I hope, will take a lot of heat out of the system.

 

Debbie Reynolds  33:45

I think so too. Well, you're always fighting the good fight, and I'm really happy about that. Please tell me how people can contact you and follow your work. What's the best way for people to connect with you?

 

Rohan Light  34:02

LinkedIn. Going back to your comment about microblogging: I just treat LinkedIn as my microblog distributor.

 

Debbie Reynolds  34:10

Oh, absolutely. Yeah.

 

Rohan Light  34:12

Yeah. LinkedIn has managed to curate a conversation suitable for business, suitable for professional conversations. People know that taking strong political and social positions on LinkedIn stands out, and if it's too full-on, it will make people scared. Equally, you won't generate a conversation if it's too safe; people go, come on, man, no one needs yet another anodyne bromide, I don't need more boilerplate stuff. If you can find that middle ground, LinkedIn is a great place to have a conversation, and that's where people should hit me up: Rohan Light.

 

Debbie Reynolds  35:01

Rohan Light, I know. I was talking with someone about AI and algorithms, and I said, oh my god, you have to connect with Rohan; he's the guy, the go-to person. It's funny, because I was emailing you about this interview when someone asked me about that, and I said, oh, I just talked to him, like, the day before; definitely connect with him. I always send people your way when they're talking about ethics, AI, and data, because you post some of the most interesting stuff. I wish I could comment on every single thing you put out, but I always try to at least like it and comment when I can.

 

Rohan Light  35:42

Yeah, isn't it great when we see the likes on LinkedIn? For me, it's an indication that there's further conversation to happen. And I think that's our task: to find ways to talk a little more deeply about these issues that are affecting everyone.

 

Debbie Reynolds  36:07

Well, you are succeeding, my friend. Thank you so much for the interview; this is fantastic. People definitely have to follow Rohan on LinkedIn; he has so much good stuff. Thank you so much again.

 

Rohan Light  36:25

Thank you for having me, Debbie. And this has certainly been a great start to my Thursday.

 

Debbie Reynolds  36:35

 Thank you.
