E248 - Damilola Adenuga-Taiwo, Cybersecurity, Security Compliance, Payment Systems

[00:00] Debbie Reynolds: The personal views expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.

[00:30] Hello, my name is Debbie Reynolds. They call me the Data Diva. This is the Data Diva Talks privacy podcast where we discuss data privacy issues with industry leaders around the world with information that businesses need to know.

[00:43] Now I have a very special guest on the show. He was actually introduced to me by another guest who will be on the show. His name is Damilola Adenuga-Taiwo.

[00:57] Is that right?

[00:58] Damilola Adenuga-Taiwo: Yes, that's right. You got it right the first time. Yes.

[01:04] Debbie Reynolds: One of my favorite professors in college was a Taiwo. So very cool, very cool. I love your names because they mean something, as opposed to, you know, names that people just make up on the fly.

[01:15] So that's really cool.

[01:17] Damilola Adenuga-Taiwo: Yes, very meaningful names. Very popular name where I'm from as well.

[01:22] Debbie Reynolds: Well, so you are, you manage security and compliance. And so today I really want to get into your background. So tell me how you came to be in the security and compliance space.

[01:39] Damilola Adenuga-Taiwo: I don't think anyone has a conventional route of getting into this space. Mine was quite unconventional. You know,

[01:46] one of those cases where you finish school and start out teaching right off the back of your master's program. I was teaching in a postgraduate program,

[01:55] teaching concepts on security and,

[01:58] you know, technology to pretty much my mates at that point,

[02:02] right. I'd show up,

[02:04] you know, as the professor, and people would look at me, like, you guys, you should be in this class.

[02:10] But, you know, it was one of those things. So that's where I got started, and I was able to leverage some of that classroom lecturer and professor experience into starting out consulting.

[02:21] So I got my first consulting gig at Richter, a company called Richter in Canada, and specialized in privacy assessments, cybersecurity-based assessments,

[02:33] standards-based security, standards-based assessments. And so I worked a few roles,

[02:40] some of them were consulting, some of them were more internal.

[02:45] And I was able to get some of my security badges and certifications, and some of the other education and experience that led me here.

[02:55] Debbie Reynolds: Yeah, to me this is just such a wide space.

[02:58] I, I feel like when people think about cyber or security,

[03:03] a lot of times in the technical realm,

[03:07] they almost feel like if you're technical in any way, that you do everything in technology.

[03:14] So just tell me a little bit more about the area of cyber that you focus on.

[03:21] Damilola Adenuga-Taiwo: So I focus currently on the cybersecurity governance, risk and compliance space,

[03:27] specifically on PCI, that's the payment standards program, and the ISO standards as well, the ISO suite of standards.

[03:38] So my work cuts across some of these standards.

[03:42] And so I also focus on, you know, risk assessments, risk management, everything that has to do with both security and privacy risk management, and the governance aspects of it too.

[03:53] So my work cuts across the GDPR and the CCPA, even though I'm mostly focused on the CCPA.

[03:59] Debbie Reynolds: Yeah,

[04:01] well, I'm glad we're talking today and I'm glad you mentioned PCI compliance because PCI has fascinated me in a lot of ways because first of all, it's a very strong standard and is one that's been widely adopted even though there isn't like a quote unquote law saying that you have to do it.

[04:22] And so to me, I feel like this may be a pathway for maybe privacy or some other types of frameworks to really get adoption. But tell our audience about what is PCI and kind of why it's important as a standard.

[04:39] Damilola Adenuga-Taiwo: The full name is PCI DSS. That's the Payment Card Industry Data Security Standard.

[04:46] So it's a suite of standards and requirements put together by the PCI Council. So the PCI Council, think of them as the payment brands,

[04:55] the ones that, the main ones, the popular ones, that's Visa, MasterCard, Amex, JCB and Discover.

[05:02] So they came together and said,

[05:06] we operate this payment,

[05:08] we are the brands, we operate in this payment ecosystem.

[05:13] However, for everyone that operates in this payment ecosystem, especially those taking cardholder data,

[05:18] which is like normal people's data, everybody's data,

[05:22] we would want them to operate with a certain standard in mind of security requirements.

[05:27] So they came together and produced a suite of requirements, which is pretty much the PCI DSS.

[05:33] And so everyone who, every entity that operates in the space,

[05:38] if you store,

[05:39] process or transmit payment data,

[05:42] you're required, you're bound by the PCI DSS framework.

[05:47] So be it the,

[05:49] the banks,

[05:51] be it the payment processors, the ones you know, the Squares, the Stripes, the Adyens, the PayPals,

[05:57] be it the normal merchants,

[05:59] everyone is bound by these requirements.

[06:02] Now, does everybody do these requirements? Does everybody actually implement these requirements? You're not expecting a barbershop or a mom-and-pop shop to implement such requirements. But the overall principle is that everyone is bound by these requirements regardless.

[06:20] Debbie Reynolds: I'm just thinking. So I work with a cadre of people who work in the auto ecosystem, and they are talking about trying to create some type of standard where they themselves can say, hey, we as a group, we want to have this, whatever the standard is around privacy, and then how do we push that down to the ecosystem?

[06:41] And I feel like PCI has probably been the most successful way that I've seen that done.

[06:49] So basically they're like, okay, us big players, we're going to come together, we decide that we're going to operate in this way. It's not like they're competing against each other.

[06:56] They're like, it makes our lives easier if we have the standard. And then they push it down on their third parties to do it. Even like your example about a barbershop, like the barbershop person may not know anything about it, but maybe they use a merchant account where that is implemented.

[07:13] So I think that's really cool.

[07:15] Damilola Adenuga-Taiwo: That's absolutely true. Visa, MasterCard, Discover, they all used to have their own individual standards.

[07:21] So imagine like you are a merchant, like a bigger merchant, and you took payment and your payment traffic consisted of taking visa payments, taking MasterCard payments, taking Discover payments. And so you would be bound by each of these standards individually.

[07:35] And that's how we used to be.

[07:37] And then they just came together and formed that PCI Council and was like, you know what, it doesn't make sense for one entity to have to abide by five different requirements.

[07:46] So let's just come together and form this unified set of requirements where everybody is bound by.

[07:52] And so there are so many,

[07:54] there are different factors to the level of requirements a merchant or an entity is bound by, depending on their mode of payments, or the avenues in which they take payments.

[08:06] For example,

[08:07] if they take payments via online means, or e-commerce means, versus card-present, that's where you're present at the merchant's payment terminal, the requirements could be differing.

[08:22] Right? So the mode of payments, and also

[08:25] the amount of payment traffic the entity takes every year, determines the scrutiny of the requirements, or the suite of requirements, for that entity.
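
The PCI program formalizes this tiering as merchant levels. To make that concrete, here is a minimal sketch in Python of how annual payment volume might map to a level; the thresholds are the commonly cited Visa figures, and exact cutoffs and validation duties vary by card brand and acquirer, so treat this as illustrative rather than authoritative.

```python
# Illustrative PCI DSS merchant levels (1 = most scrutiny), using the
# commonly cited Visa thresholds; real programs vary by brand and acquirer.

def merchant_level(annual_txns: int, ecommerce_txns: int) -> int:
    """Return an illustrative merchant level from yearly transaction counts."""
    if annual_txns > 6_000_000:
        return 1   # annual on-site assessment by a QSA, quarterly scans
    if annual_txns > 1_000_000:
        return 2   # self-assessment questionnaire (SAQ), quarterly scans
    if ecommerce_txns > 20_000:
        return 3   # e-commerce SAQ, quarterly scans
    return 4       # SAQ; validation requirements set by the acquirer

print(merchant_level(8_000_000, 2_000_000))  # -> 1
print(merchant_level(50_000, 30_000))        # -> 3
```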

[08:41] Debbie Reynolds: Yeah, it's so funny. So I'm very impressed by the financial sector in general, about how they can do innovation. And it's so funny because, you know, I hear these talks all the time, especially like in AI.

[08:52] We'll talk about that later.

[08:53] Like in artificial intelligence, they're like, you know, regulation will,

[08:58] or even self regulation will stop innovation.

[09:01] And I feel like in highly regulated spaces there's still a ton of innovation. Right? So to me, at some point it makes sense to create some type of standards where people kind of know the rules of the road.

[09:14] And it really helps because,

[09:17] especially for companies, you may have the same goal and different clients, but trying to do everything differently from scratch is just really tough.

[09:27] Damilola Adenuga-Taiwo: Absolutely, absolutely. I think it makes sense that especially with the sensitivity of the type of data collected in financial institutions,

[09:37] pretty much almost the same as healthcare institutions,

[09:39] you're collecting a lot of sensitive data here. Cardholder data is pretty much seen as one of the more sensitive forms of data.

[09:45] And so that comes with a lot of scrutiny, especially in today's world,

[09:50] for sure.

[09:51] Debbie Reynolds: So I want to talk a little bit about compliance. So a lot of times when people hear compliance, they think of legal, of a legal standard. Right. But compliance is really much broader than that.

[10:01] So tell me a little bit about compliance as it relates in the security area.

[10:08] Damilola Adenuga-Taiwo: Compliance operates in the world of an entity or a company.

[10:14] I'm trying to figure out how to explain this better. But compliance, and I guess the difference, when we talk about law and compliance, law and standards, they're pretty dissimilar.

[10:25] I guess one major difference is one could be punishable by either criminal offenses or civil offenses.

[10:32] You pay a fine or you go to prison,

[10:34] things like that.

[10:35] For standards, for example, like this PCI standards,

[10:38] the fine or the consequence for not abiding is pretty much you pay a fine, or you get compromised and then you pay a fine, etc. But when it comes to compliance, the overall concept of it is making sure that an entity stays adherent to a strict set of requirements for that entity.

[10:58] So if that entity for example, wants to be security compliant or is to be security compliant, there is a certain set of rules that that entity is supposed to abide by.

[11:10] Right. So that rule says you must have a 12-character password, or you must use multi-factor authentication, or, you know, you must log certain elements or events that happen in the system.

[11:23] These are all the types of rules that could exist when it comes to compliance based standards.

[11:28] So everything that exists in that space of making sure that an entity abides by a certain rule set,

[11:36] be it the entity actually implementing the rule set or the entity being audited based on that implementation,

[11:44] all of that exists and encompasses the compliance space.

[11:49] So that's pretty much compliance in a nutshell. There's a lot more to it.

[11:54] But this is bare bones.
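To illustrate the rule sets he describes: a minimal sketch, assuming a hypothetical SystemConfig shape, of how rules like a 12-character password minimum, mandatory MFA, and event logging can be expressed as automated checks.

```python
# A hypothetical config shape and rule set, for illustration only; real
# compliance programs check far more controls against real system inventory.
from dataclasses import dataclass

@dataclass
class SystemConfig:
    min_password_length: int
    mfa_enabled: bool
    audit_logging_enabled: bool

def check_compliance(cfg: SystemConfig) -> list[str]:
    """Return the failed controls; an empty list means the checks pass."""
    failures = []
    if cfg.min_password_length < 12:
        failures.append("passwords must be at least 12 characters")
    if not cfg.mfa_enabled:
        failures.append("multi-factor authentication must be enforced")
    if not cfg.audit_logging_enabled:
        failures.append("security events must be logged")
    return failures

print(check_compliance(SystemConfig(8, True, False)))
# -> ['passwords must be at least 12 characters', 'security events must be logged']
```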

[11:57] Debbie Reynolds: Yeah, definitely. I want your thoughts on friction. So I hear this. So I work with people in all types of industries, but almost any industry I can think of in the digital space, they don't want friction when someone is either making a payment or trying to use their service.

[12:14] Right. Because they want to retain that person as a customer. And as there are more kind of regulations or standards and things that are coming out,

[12:24] you know, brands are concerned about friction. So tell me just your philosophy around

[12:32] the balance of, you know, there are things you have to do, but then you want to do it in a way that makes the transaction smooth and doesn't, you know, make the customer upset.

[12:43] Damilola Adenuga-Taiwo: That totally makes sense. And obviously, for every private entity, every for-profit entity, the end goal is to make a profit or make money.

[12:54] And so easing up the means of taking payments is probably an advantageous approach for that entity.

[13:03] When it comes to friction, there are things,

[13:06] the PCI DSS, for instance, and the standards come up with a rule set of things that you must do.

[13:12] And so one of those things, for instance, is not storing payment data.

[13:17] And if you're storing payment data,

[13:20] you have to abide by a stricter rule set. So if you have payment data, actual payment data, in your databases, for instance,

[13:26] these are the things that you must abide by. And it doesn't have to do with just electronic databases; you could actually be storing paper records that contain payment data. Right.

[13:35] So in easing up that friction, if you're going to be collecting credit card payments, for instance, I believe it's state law or federal law that if you're going to have a customer agree to recurring payments on their credit card,

[13:53] that they must fill out a credit card authorization form.

[13:56] Right.

[13:57] So that's where the friction comes in. Like, okay, for us to start charging your card for this gym membership every single month, here's a form that you need to fill out.

[14:06] Right. So that adds that extra bit of a layer into that process, which could be seen as friction.

[14:14] So it's mandated that the merchant do it.

[14:17] Merchants have generally found creative ways to do it in a way that's just seamlessly introduced into that transaction. Right. So that's one of the more common processes that we've seen that just adds that extra layer on top of a normal transaction,

[14:36] introducing some level of friction per se.

[14:40] Debbie Reynolds: I mean, as you're talking,

[14:42] there are so many things that I think are happening in finance that we can adopt in privacy.

[14:48] So just, just the example that you gave, that once you do a transaction, you shouldn't store the payment information. Right.

[14:55] And so from a privacy perspective, I wonder why companies aren't doing that. That would, like, lower their risk completely.

[15:02] Damilola Adenuga-Taiwo: Yeah. Okay, so it's not that you shouldn't store it,

[15:07] you can store it. It's just now you're exposed to this major scrutiny for storing payments.

[15:15] So therefore that level of scrutiny serves as a deterrent for people that actually want to store it, right, or for entities that want to store it. So therefore, when, you know, PCI-based analysts or security analysts that deal with PCI get introduced into an environment, or a merchant's environment, and they say,

[15:34] hey, you know, could you consult for us?

[15:36] usually the first thing that professional or consultant is doing, the first question, is most likely going to be: do you store credit card data?

[15:46] And if the answer is yes,

[15:48] the next thing would be how can we not store credit card data?

[15:52] Because then we're faced with these hundred other requirements that we really don't want to do. Right? And it costs us a lot of money, a lot of time, a lot of resources.

[16:03] So the shorter path, and the less resistant path, usually would be

[16:08] to take the path of not storing credit card data.

[16:12] So yes, there is some learning there. And I actually think it's something that privacy law deals with,

[16:21] storing data until it's not needed anymore.

[16:24] So those are the things in which ideas can be borrowed from both realms. I would say in areas where does it really need to be stored? Is it long term transactional, where that data needs to be retained on the system?

[16:39] Because we also have this down the road problems of what's the meaning of when it's no longer needed?

[16:46] Right.

[16:47] That's the usual common question,

[16:49] what does no longer needed mean? Right. So there comes a separate set of challenges of subjectivity when it comes to requirements. As we know,

[17:00] many of the requirements introduced especially by Europe are not nitty gritty enough.

[17:06] Where,

[17:07] instead of saying don't store it past 30 days, they say don't store it once it's no longer needed. And that introduces interpretation.

[17:17] And so I think there could be ideas borrowed from the two different realms. I can see you laughing now.

[17:26] Love to hear your thoughts.

[17:27] Debbie Reynolds: I agree completely. I think, I feel like, especially people who think about compliance in terms of a legal sense,

[17:34] they're very irritated with something like the GDPR where it says something like privacy by design.

[17:39] Right? So doing something before something bad happens. And then also they don't like the fact that it's not prescriptive enough.

[17:47] Right. Where they're not saying store data for three years, right. Because people are like, okay, we could do that, we'll automate this or whatever. But saying delete data or remove data once your purpose has expired,

[18:00] Wow, that's like deeper thinking about what you do with data and having a data life cycle. So I think one of the things that privacy regulation has done, and probably no other regulation type has done, is saying delete data.

[18:18] Right. And companies don't like to delete data.

[18:21] I tell people software is made to remember stuff, not to forget it. So I feel like you're almost swimming upstream because a lot of the applications or tools that you make, they aren't made to get rid of stuff.

[18:31] And so that really has to be like a business decision and more thought and process around why you have data in the first place. Where maybe in the big data days we were like, let's collect as much data as possible.

[18:44] We don't know why we want it. We were going to collect so much, you know, and then maybe we'll think about what we're going to do with it later. So this is like more circumspect.

[18:52] What do you think?

[18:53] Damilola Adenuga-Taiwo: Absolutely, absolutely. You know, standards like the ISO suite of standards, ISO 27001, for instance,

[19:00] you know, they talk about retention as well. They ask the company to document a retention period for each of the types of data which they collect and store.

[19:10] All right, so which means a company has to give thought to financial data,

[19:15] PII,

[19:16] all the other different pieces of data, and so on.

[19:19] So it's, it's definitely one of those pieces where,

[19:23] you know, companies are forced to think about retention as a whole. And that's where the appreciation for the privacy regulation comes into play because it forces companies to think about it and think about the length.

[19:39] And for a company that says they're compliant and displays we're GDPR ready on their website, or we're ISO 27001 compliant, or we're SOC 2 compliant, these are companies that are audited every year and have to go through an assessment, where the auditor is asking you, have you given thought to the period for which you store this data?

[20:01] And can you show me when last you deleted data,

[20:05] can you back this up? Can you back up the retention periods in your policy? And that's where the appreciation for compliance as a whole comes into play, because it's actually forcing entities and companies to do what they say they're going to do, and what the regulation says they should do.
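
As a rough sketch of the retention documentation such audits ask about: a minimal example, with hypothetical data categories and periods, of a documented retention schedule and a check for records that have outlived it.

```python
# Hypothetical retention schedule, for illustration; real periods come from
# the company's policy and applicable law (e.g., statutory bookkeeping rules).
from datetime import date, timedelta

RETENTION_POLICY = {
    "financial": timedelta(days=7 * 365),  # e.g., tax record-keeping rules
    "pii":       timedelta(days=2 * 365),
    "logs":      timedelta(days=90),
}

def overdue_for_deletion(category: str, collected_on: date, today: date) -> bool:
    """True if a record has outlived its documented retention period."""
    return today - collected_on > RETENTION_POLICY[category]

print(overdue_for_deletion("logs", date(2024, 1, 1), date(2024, 6, 1)))  # True
```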

[20:22] Debbie Reynolds: Yeah, well, what's happening in the world right now, in terms of whether it be cybersecurity or privacy or data, that's concerning you right now?

[20:32] Damilola Adenuga-Taiwo: I think the most common thing everyone is focused on right now is AI.

[20:37] I do share some of the concerns about AI. The other day I was watching YouTube Shorts or something, and then I saw an AI video of someone who was very popular.

[20:51] And the only reason why I did not believe that video is because I saw the original version of that video, which was saying something completely different.

[21:01] And this was the same video,

[21:04] same person,

[21:05] same background,

[21:06] same everything,

[21:07] just saying something completely different.

[21:10] And I was worried,

[21:12] like this is not good because if I didn't see the original,

[21:16] another person would definitely take this at face value,

[21:19] without any grain of salt.

[21:21] And that gets me worried. That definitely gets me worried. So creating AI fakes,

[21:27] one of the, one of the things that I'm looking at and I'm like, who's doing something about it?

[21:33] Can we do something about this? Can regulation do something about this? Is this already out of control?

[21:39] We are about to get to that tipping point where things are out of control, things already in the wild where we can't take them back type of thing, right?

[21:48] Can the social media platforms do something about this? Right? Can they? Is there going to be a scan for uploads into the platforms to detect AI-generated content? Because no one's going to be putting this is AI-generated content on their videos, right?

[22:05] No one has time for that.

[22:06] Everyone's trying to get views and likes.

[22:09] So it's one of those things that worries me and worries me not just for our generation, but for the future generation.

[22:16] And so this applies, it cuts across all the different

[22:20] spheres of society. From the music we hear, from the videos we ingest, from the information we read,

[22:27] all of those things could be fake and could be faked.

[22:31] And that really concerns me.

[22:34] Debbie Reynolds: I'm very concerned as well.

[22:36] I know a lot of times when we see them, or we see it in the news, it's because it's like a fake of a famous person. Right? But you know, these same technologies can create fakes of people who aren't famous, right?

[22:48] And we're seeing that used a lot for fraud,

[22:51] especially in like the voice area. Like I tell companies all the time,

[22:55] like I don't think voice is a good authentication method.

[23:01] Yeah, it's just too easy to fake. I mean, there are applications that you can get for $20 a month that will clone your voice and anybody else's voice.

[23:11] Right. So maybe in the past, when it was harder to do,

[23:16] maybe that was a good authentication factor, but I don't think it is anymore.

[23:20] Maybe someone's image may not be a factor in the future, especially as these deep fakes get worse.

[23:28] Damilola Adenuga-Taiwo: Yes, absolutely. I have a special appreciation for the highly regulated industries, such as the financial industry, because within the financial space, thought has to be given early on to things like this.

[23:43] And that's where you see internal policies stating that, be it a customer, be it an employee, everybody has to go through some sort of video authentication. But even that might not be enough these days,

[23:54] right,

[23:55] with what we were just talking about. So there have to be forward-thinking

[24:01] ideas to

[24:03] mitigate against these risks.

[24:05] It's just that with, with a lot of things that we know, regulation and compliance as we know it,

[24:12] it's usually catching up to technology.

[24:15] Some societies have done a good job of releasing regulation as quickly as possible. But as we know it, regulation comes out after a few years of deliberations,

[24:28] countries coming together, and before we know it, everything is blown out in the open before the compliance or regulation catches up to anything. So that's the stuff that worries me.

[24:39] The speed of regulation, the speed of compliance, the pace of technology,

[24:45] the heavy pace of technology. Actually those things concern me today.

[24:50] Debbie Reynolds: And I do want to talk a bit more about AI and the reason why I want to talk about this is because since like for example, ChatGPT came out and everyone kind of went crazy about AI,

[25:03] people,

[25:03] a lot of people don't realize that AI has existed for decades, and a lot of companies have used artificial intelligence,

[25:11] but a lot of that use and development was very purpose built.

[25:16] And so they have more narrow risk because you know, you're not saying AI is going to do everything for you, but maybe it has a particular thing that it does,

[25:26] right? And it's very narrow, so that risk is very small. And so now you have this wild west thing where people have the false impression that AI can do everything, and, you know, we can just put any type of data in there and,

[25:42] a magic diamond is going to pop out the other end. And that's just not the truth. But I just want your thoughts on the AI and the speed of people being hungry to leverage AI.

[25:56] Damilola Adenuga-Taiwo: That's quite a lot to unpack there.

[25:59] With ChatGPT and the likes, and every other application that does similar things, it's one of those things where you use it the first time,

[26:12] you're kind of skeptical.

[26:14] You use it the second time, you're like, okay, I'm enjoying this now, and then you use it the third time. Now you can't go back,

[26:20] right? So it's,

[26:22] it's one of those things where it just gets intertwined into your life so easily because whatever you thought you couldn't do before within a matter of milliseconds can get done.

[26:35] And so now,

[26:37] what concerns me about that,

[26:40] I was thinking about this the other day, is now people think they can take on a lot more,

[26:44] right? Like, okay, I used to be able to do this in like four hours. Now I can get it done in five minutes, which means I can do like way more of this, right?

[26:53] Are there health implications to that? Right. Taking on a lot more. Do we need to check that?

[26:59] Those are, those are the things that it's great that it improves productivity,

[27:06] but as we know, even the remote working,

[27:09] we've come to enjoy it, we've come to accept it.

[27:13] And then there are studies showing that people who work from home probably take on a lot more without even knowing it,

[27:20] right? Because you're not spending the time going to the office and coming back. So how much more, then, with a tool that can actually help you automate all of these processes that you used to do manually?

[27:32] So I feel like people are going to take on a lot more, because they're going to feel like they can take on a lot more.

[27:41] It's deeply concerning.

[27:43] However,

[27:44] I guess the major question is whether it's... it's kind of like, I don't want to say it's a necessary evil, but it's probably getting there at some point,

[27:54] right? It's like, okay, now people can't do without it. I can't do without it. People can't do without using ChatGPT anymore, right? So it now becomes intertwined into your life.

[28:05] Like, ChatGPT becomes your therapist,

[28:09] it becomes your counselor, it becomes your teacher. I'm concerned, like, these things,

[28:16] these things actually concern me. I mean,

[28:19] the other concern, and it has always been a concern, is it replacing all of these different roles in people's lives.

[28:26] Because now, if you can go ask all these questions of a counselor in ChatGPT,

[28:31] I'm like, okay, maybe I don't need to pay $150 per hour anymore seeing my own therapist, because,

[28:38] you know, these things are readily available to human beings.

[28:42] Human beings are definitely going to stretch out the use of these tools

[28:47] beyond everyone's imagination. And that deeply concerns me.

[28:52] Debbie Reynolds: Yeah, I agree, I agree. I think there's just a point, and maybe that line isn't there yet, where people take it too far.

[29:02] So just like you say with the therapist thing, maybe before, you would Google something about some type of psychological thing to get some ideas. But now you have something that actually can talk back, very specific to your situation.

[29:19] And so there's the idea that someone is talking with a chatbot that doesn't have a brain, doesn't have a head, and is just spitting out the next plausible word as the answer.

[29:31] And I think it gives people the impression because humans,

[29:36] you know, I think this is one reason why,

[29:39] you know, the anthropomorphizing of things is a problem.

[29:44] Right. Because it gives you the impression of humanness when there is no humanness there. Right. So there is no ethics. There's no,

[29:52] no real thought. And so the fact that people have feelings and have emotions and are reacting in human ways is a problem with these things.

[30:05] Damilola Adenuga-Taiwo: Yeah, all of these models are not always right, too. They're not always right.

[30:10] I know someone who told me they have a conversation with ChatGPT on the ride to work every day. You know, they just put it in voice mode and they have, like, a conversation with ChatGPT on every different topic.

[30:21] Any topic comes to mind. Conversation.

[30:23] I'm like,

[30:25] is this problematic in some sort of way? Because everything we start to do as humans replaces something else.

[30:34] If we want to stop a habit, the usual practice is to replace that habit with something else.

[30:39] Energy cannot be destroyed. It can only be converted from one form to another. And that's what the therapists and the counselors usually tell us:

[30:47] replace one habit with another. So what is ChatGPT replacing in your life now?

[30:52] And not to go on like this right now, but I often think about these things. And being in the compliance and regulatory space also forces me to see these things firsthand,

[31:04] these technologies firsthand, and the use, and how stretched out the use can be in different spheres.

[31:13] These are the things that keep me up at night, for sure.

[31:18] Debbie Reynolds: I never even thought about that. When you just talked about the voice mode and someone talking to it on the way to work. That is bonkers.

[31:28] Damilola Adenuga-Taiwo: Like, this is definitely problematic.

[31:34] Before you know it, it's too late. Yeah, that's a point where now you can't go back.

[31:40] Now you got to have this conversation. This is your coffee every morning. This is your fix every single morning.

[31:49] Debbie Reynolds: Oh my goodness. Well, tell me a little bit from your perspective, what is the difference in your view between privacy and like cybersecurity or security?

[32:01] Damilola Adenuga-Taiwo: That's a good question.

[32:02] I've always thought of privacy,

[32:06] especially in many different societies, as being a fundamental human right, versus security.

[32:15] Not saying security isn't,

[32:17] but it isn't as stressed, meaning in the practical applications of this.

[32:25] Privacy is embedded in law in many different societies, versus the concept of cybersecurity, which really comes up,

[32:33] Yes, it comes up personally,

[32:35] but it really comes up when it comes to businesses.

[32:39] Right. And anywhere the cybersecurity phrase is used in a sentence, it's usually in application to businesses, and the protection of business data, and the data that is collected by that business.

[32:55] Right.

[32:56] Privacy, on the other hand, cuts across both businesses and consumers.

[33:02] That's how I like to think about it. But I'm not saying that security does not apply to consumers. It does. It's just that in the commercial use of the terms, you will usually accord privacy to consumers

[33:20] and cybersecurity to the businesses, because privacy is interwoven into law and the laws of society, and that's why it applies mostly to consumers.

[33:30] That's the way I kind of see it, and those are the major differences. As for the similarities, I do think they're brothers and sisters.

[33:38] And when it comes to

[33:41] how they're applied, the principles behind them,

[33:47] there are principles behind many of the principles, for lack of a better term. Right.

[33:53] Many of the things that you see are thought about from the same place. I would say they're embedded into each other as well. You would usually see privacy requirements embedded in cybersecurity frameworks and vice versa.

[34:06] Where privacy frameworks contain cybersecurity requirements, for example, encryption and technical protection of PII. Right.

[34:17] So they're embedded in each other. The concepts are largely very, very similar, I would say. But one is ingrained. For example, when you say data deletion and requests for data deletion,

[34:32] Right. That's applying to a person being able to call into a business saying I want my data deleted.

[34:39] Right.

[34:40] But there isn't the equivalent requirement when it comes to cybersecurity standards and frameworks. It's usually, make sure your encryption is AES-256,

[34:53] and so on. It's not,

[34:57] make sure a person can do this,

[34:59] right, or a consumer can do this, when it comes to that consumer's data.

[35:06] Right. So that's the way I kind of like to approach both concepts.
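
For a concrete picture of the make sure your encryption is AES-256 style of requirement: a minimal sketch using the Python cryptography package's AES-256-GCM implementation. The sample card number is illustrative; real systems also have to manage keys, nonces, and access controls.

```python
# Encrypting a field of cardholder data with AES-256-GCM.
# Requires: pip install cryptography
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)  # 256-bit key, per the requirement
aesgcm = AESGCM(key)
nonce = os.urandom(12)                     # unique per message; store with the data

ciphertext = aesgcm.encrypt(nonce, b"4111 1111 1111 1111", None)
plaintext = aesgcm.decrypt(nonce, ciphertext, None)
assert plaintext == b"4111 1111 1111 1111"
```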

[35:11] Debbie Reynolds: Well, I never heard of anyone say it that way. That's really, really cool though.

[35:15] I think, though, I just want your thoughts. I feel like, especially with, like, cybersecurity training,

[35:21] companies have tried to do cyber training different ways. Maybe they have Cyber Week, Cyber Month,

[35:28] sometimes they sprinkle it throughout the year, and things like that. And some employees,

[35:33] in my view, are like rolling their eyes, like, oh, I don't want to have to do this training, I don't want to have to do X, Y, and Z. But I think when you can bring it to a personal level, or understand what the potential impact is,

[35:49] I think it helps to make it resonate. So as opposed to saying, don't click on this link,

[35:54] right, you say what happens when you click on that link, and those types of things. But I want your thoughts.

[36:01] Damilola Adenuga-Taiwo: I think cybersecurity training, I guess training in general,

[36:04] as you mentioned very correctly, like,

[36:08] it's very frowned upon.

[36:09] It's usually they open up a training platform for two weeks and people generally will get that training done the day before it's due, right? Because it's not something they're thinking about.

[36:21] It's not their everyday bread and butter. Right? So, but is it,

[36:25] is it interesting? Is it impactful?

[36:28] Are there pieces that we can pick from the training that impact our everyday lives?

[36:35] That's absolutely true. We can take a lot of learnings from that training,

[36:39] especially when it comes to what you talked about. Phishing.

[36:43] Phishing, regardless of where you are, whether you're working in a business or you're just browsing the Internet.

[36:49] That is applicable wherever you are. And it's one of the most common forms of compromise, and it's still probably one of the most common forms of compromise. And so that usually forms most of the training content, a lot of the training content.

[37:06] So we talk about training and we talk about awareness.

[37:09] Right? So the continuous awareness effort is usually framed around phishing for a lot of companies, because that's the most common form of compromise when we talk about, like,

[37:19] persons working in that business. So I think it's very important that trainings continue, or are instituted, in businesses. And a lot of the regulation actually agrees with that thought, right.

[37:32] That it's embedded in regulation. It's embedded in standards and frameworks.

[37:37] Debbie Reynolds: One of the things that annoys me in those types of trainings is when people say, look for an email that may have misspelled words, or maybe this person is from another country.

[37:48] And that hasn't been true for a long, long time.

[37:51] Right.

[37:51] Because cybercriminals, they have Grammarly, they have spell check, they have ChatGPT now, so they can write perfect emails. So that's like a terrible tip.

[38:01] Damilola Adenuga-Taiwo: No excuses anymore.

[38:07] Debbie Reynolds: Yeah,

[38:08] exactly. Well, what are your thoughts? I mean, to me, I feel like the threats are getting higher, and I think some of those old trainings are not really answering what's happening today, which is you can get an email from a brand that looks exactly like the brand, you know what I mean?

[38:24] And they may know something about you to give you the impression that, oh, this is from this company. Right. So I would totally trust it. Yeah.

[38:32] Damilola Adenuga-Taiwo: Have you seen the phishing emails these days?

[38:35] Debbie Reynolds: Oh, they're so good.

[38:37] Damilola Adenuga-Taiwo: So good. So sophisticated. I was looking at one the other day and I almost fell for it. It was one of those things where you're almost about to click, and your cybersecurity sense pulls you back, and you're like, okay, you know what?

[38:52] I'm better to be skeptical, better to be safe than sorry. And better to check these things.

[38:57] Just as the modes of compromise keep getting sophisticated, the attacks get sophisticated, the modes of attack get sophisticated, so does the language around phishing. When we talk about the financial industry and compromises around that, in Canada we have something called the Interac e-Transfer.

[39:15] That's kind of like Zelle in the United States,

[39:19] it's e-Transfer. So you get an email, and the email says an amount of money has been deposited into your account.

[39:26] Click here to receive the amount.

[39:29] Now that email looks like your financial institution. It looks like what would come from a Chase bank,

[39:35] right? Or TD Bank or anything like that. It takes the colors; the branding is all the same.

[39:41] So it's all sophisticated, and you're looking like, I wasn't expecting any money, but I'm curious, though, this is a hundred and fifty dollars. You click, and you get led, you know, to that page, and you put in all your details.

[39:55] Usually even Google Chrome already fills out your passwords for you these days. Right.

[40:00] And then you log in, and either your details get stolen through a man-in-the-middle attack, or it's actually someone who's phishing you.

[40:10] Because the email for sending money and the email for requesting money look the same.

[40:18] Right. So usually you're expecting that the person is sending you money, but the email is actually the person requesting money from you.

[40:28] So you click that link and all of a sudden you get led to your financial institution,

[40:33] you log in, and then you click, click, click, I'm expecting to receive money, and all of a sudden you've just lost $150 to this unknown person.

[40:42] Right. And that's somebody who is pretending to buy something from you on Facebook Marketplace, or, like, a ticket for a Taylor Swift concert,

[40:51] you know, something like that. Right. So those things,

[40:54] that's the way I've seen more sophistication in phishing attacks these days.

[40:59] It's not really just by emails; there are a lot more modes to phishing these days. And I think that's where the training programs need to adapt to these more sophisticated modes of compromise.

[41:15] In agreement with you for sure.

[41:17] Debbie Reynolds: Well, if it were the world according to you, Dami, and we did everything you said, what would be your wish for privacy or security anywhere in the world,

[41:26] whether that be regulation,

[41:28] human behavior or technology?

[41:31] Damilola Adenuga-Taiwo: Huh. That's a loaded one. In a perfect world, in my perfect world.

[41:39] Debbie Reynolds: Yeah.

[41:41] Damilola Adenuga-Taiwo: In my perfect world, as it relates to regulation, privacy, AI,

[41:47] I think it has to do with humans.

[41:49] I wish we would not believe everything on first sight.

[41:54] I wish there were extra verification checks to everything we did.

[41:58] I wish people were not as stressed and vulnerable as to get compromised in, you know, a phishing campaign or any of those other technologies. Because usually the thing that leads humans to slip up, as we say, is vulnerability.

[42:16] Right. You're usually in a vulnerable state. Someone takes advantage of that.

[42:20] And that someone doesn't have to be,

[42:22] that someone or that something doesn't have to be a compromise. It could be a technology,

[42:28] all right. It could be social media.

[42:30] You're just in a state of,

[42:32] you know,

[42:33] you're in a state of limbo in life, and then TikTok comes through, and all of a sudden you're sucked in, six hours scrolling on TikTok, you know, consuming all this information that isn't healthy for you.

[42:44] So to wrap that all up, I would say I wish we as humans just had more control.

[42:51] I wish we had more...

[42:54] I wish we would hold back on certain types of things. I don't want to use the word discipline, because I think that might be too insensitive.

[43:02] But for lack of a better term, I wish we could apply discipline to several areas of our lives, especially as it relates to technology and AI.

[43:13] Debbie Reynolds: I agree. I think we kind of lack that discernment. And I think part of that is that human excitement about something, that instant gratification, right?

[43:24] So that instant gratification, or the thing that gets you so worried and concerned, makes you not think in a normal way. That's especially what fraudsters want you to do.

[43:36] That's what brands want you to do. They want to send you the thing and say, hey, we only have, you know, three pairs of this shoe left, and you want it, you've got to buy it right away. Or, you know, in terms of phishing, they're like, oh my God,

[43:48] you have to click on this link, you know, within 24 hours to get your prize that you're supposed to get. Right. And so they try all types of tactics. But a lot of this is really trying to automate like social engineering, truly.

[44:02] And it works, unfortunately.

[44:04] Damilola Adenuga-Taiwo: Well, to segue a bit, there are things people could look out for, things humans can look out for, and the general public could look out for.

[44:12] As you mentioned, if you receive a message that is time bound, you should do an extra verification on it. If something requires you to move or do something or take an action within a certain period of time,

[44:25] you should look at that a second time or pay some extra attention to it.

[44:29] And if the person uses names that are familiar, or tries to coerce you in some sort of way,

[44:38] you should look at that a second time.

[44:40] And sometimes these things don't take much. Right. You only just need to look at where that email is coming from.

[44:46] Is it really the email? Is it really from my bank or is it really someone posing to be my bank?

[44:52] And so those are the indicators, when it comes to phishing, that people can usually look out for in their everyday lives.
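
Those indicators, time pressure, coercive or familiar-name language, and a sender address that doesn't match the claimed institution, can be read as simple heuristics. A minimal sketch, with hypothetical keywords and domains, of what such checks look like in code:

```python
# Toy phishing heuristics; the keyword list and trusted domain are made up
# for illustration and are nowhere near a real email security filter.

URGENCY = ("within 24 hours", "act now", "immediately", "account suspended")
TRUSTED_DOMAINS = {"mybank.com"}  # hypothetical bank domain

def phishing_flags(sender: str, body: str) -> list[str]:
    """Return human-readable warnings for a message, or an empty list."""
    flags = []
    if sender.rsplit("@", 1)[-1].lower() not in TRUSTED_DOMAINS:
        flags.append("sender domain does not match the claimed institution")
    if any(k in body.lower() for k in URGENCY):
        flags.append("time-bound pressure to act")
    return flags

print(phishing_flags("alerts@mybank-secure.co", "Act now: click within 24 hours"))
```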

[45:02] Debbie Reynolds: That's great advice, excellent advice. Well, thank you so much for being on the show. This has been great.

[45:09] Damilola Adenuga-Taiwo: Thank you so much for having me, for sure. This is amazing.

[45:12] Debbie Reynolds: Yeah, it's amazing. I love all the information that you imparted, and it's really, really great. I'm looking very closely at, you know, PCI and finance, because I think we can learn a lot in privacy from that.

[45:27] Damilola Adenuga-Taiwo: Absolutely, absolutely. Thank you so much, Debbie.

[45:29] Debbie Reynolds: You're welcome. Thank you so much. We'll talk soon.

[45:32] Damilola Adenuga-Taiwo: All right, take care.

