E260 - Jon Bello, Partner at the Medialdea Bello and Suarez Law Offices, IAPP 2025 Vanguard of the Year - Asia (Philippines)

[00:00] Debbie Reynolds: The personal views expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.

[00:14] Hello, my name is Debbie Reynolds. They call me the Data Diva. This is the Data Diva Talks Privacy podcast, where we discuss data privacy issues with industry leaders around the world with information that businesses need to know.

[00:27] Now, I have a very special guest on the show.

[00:30] All the way from the Philippines,

[00:32] we have Jon Bello, who's a partner at MBS Law Offices.

[00:38] He's also an IAPP 2025 Vanguard of the Year award winner for Asia.

[00:44] You're the NPC Privacy Advocate of 2021 awardee.

[00:50] You're also a speaker, a professor, and a former associate general counsel. Well, first of all, we're both smiling ear to ear, because this podcast has been many years in the making.

[01:04] I know I met you many years ago. You had invited me to speak at an IAPP KnowledgeNet presentation from the Philippines. And it was incredible.

[01:15] The people that you got to speak there,

[01:18] just the mix of different people and the ideas from the different countries, I thought was amazing.

[01:24] I think I also spoke at one of your courses, I believe.

[01:28] Jon Bello: Yes, the cybersecurity class.

[01:30] Debbie Reynolds: Yes, I spoke at one of your courses. That was really cool.

[01:34] But I've been trying to get you on the show forever, and you know, we were in D.C. a couple of times and we missed each other.

[01:42] And you got me like the most beautiful scarf, so I'm gonna wear it on one of my videos and I'll shout you out.

[01:47] Jon Bello: Thank you for that, and I'm definitely glad to be here, Debbie.

[01:51] It's quite an honor and privilege to finally be a guest on your podcast.

[01:58] Right, so. Yes,

[02:00] finally.

[02:01] Debbie Reynolds: Yeah. Well, first of all, I have to say, and I'm going to let you introduce yourself. When I heard that you won the Vanguard of the Year award, I was so excited.

[02:11] It's so apt, and it so fits you, because you're such a champion for privacy and you really bring people in. You understand the value of privacy and data protection all over the world, not just in one place.

[02:29] But I'm very interested in Asia, very interested in the Philippines. One of the things I'm very impressed with about the Philippines is that your country puts out a lot of good guidance for people around privacy and data protection.

[02:45] And it's kind of an everyday part of life, where people kind of expect that. And so I think,

[02:52] you know, the fact that you got that award, I was so happy and so proud of you. Sorry I wasn't able to be there to meet you, but at some point when you fly back to the U.S.

[03:02] we're gonna get together. I'm trying to make my way to Washington to see you, but. Yeah, why don't you tell me a bit more about you.

[03:10] You're incredible.

[03:11] Jon Bello: Yeah. Okay, so let's talk about what I actually do, or my self-proclaimed mission in privacy.

[03:20] So what I try to do is to connect,

[03:25] connect people as much as I can.

[03:29] I am a believer in knowledge sharing, and I believe that no particular person, particular organization, or particular regulator has a monopoly on information,

[03:43] especially in the field of privacy.

[03:46] So what I always want to do is create opportunities for knowledge sharing, and that is why I try to organize webinars or sessions in whatever capacity, either as an IAPP member or as an individual.

[04:05] There is really value in learning from each other.

[04:10] And that's why I try as much as I can to reach out to experts across different jurisdictions, because there is always something great and new that they can offer that can actually be applicable to other jurisdictions.

[04:29] Because experience is definitely something we should value. And if a particular jurisdiction has a lot of experience with a particular type of situation,

[04:39] you would want to learn from them as well. And yes, I'm also a fan of being a lifelong learner. I try to learn from others, and I try to bring others along for the ride.

[04:55] So learning from each other is a very important value that I try to promote everywhere I go, actually.

[05:06] And how did I get into the field of privacy?

[05:10] Not really by design; more by accident.

[05:15] I used to be part of the BPO industry.

[05:19] I worked in the BPO industry starting in 2007.

[05:23] I left in 2020.

[05:25] So the BPO industry, or the business process outsourcing industry,

[05:30] what it does is, let's say, for example, you have a U.S. company that would offshore work abroad;

[05:40] then it's the local BPO company that will get that work done for you,

[05:46] right? So jobs like customer service jobs, jobs that are not really popular anymore in other countries, or at least that are not the core function of a particular company,

[06:02] they would outsource to countries like the Philippines or India.

[06:08] And that is how the BPO industry became very, very prominent in the Philippines. Because you have a lot of Fortune 500 companies,

[06:18] right?

[06:19] Outsourcing work to the Philippines. And that's really a big economic contribution to the Philippines. Because right now,

[06:29] the second top GDP contributor to the Philippines would be the BPO industry,

[06:37] next to OFWs or overseas Filipino workers sending remittances to the Philippines. And that's how I got involved in privacy.

[06:48] Because the BPO industry was the primary industry really pushing for the passage of two laws, the privacy law and the cybersecurity law, in 2012.

[07:01] And that is why I had the opportunity as well to look at the first drafts of the privacy law before it was actually enacted.

[07:12] And,

[07:13] finally, it came up for implementation.

[07:17] It happens to any organization, right? Okay. If there's a new law, there's a new regulation,

[07:22] who will take care of it?

[07:24] So when an organization in the Philippines was trying to implement the data privacy law, it would usually fall either into the bucket of legal, the bucket of IT,

[07:38] or the bucket of the audit teams.

[07:41] In my case,

[07:42] my boss said, hey, Jon, you're the lawyer, take over the privacy role. That's how I landed in the privacy space.

[07:52] I'm sure, like many other Filipinos at that time, most of them

[07:57] fell into that role.

[07:59] Not really by design, not planned, but really more of a requirement by the office.

[08:07] Debbie Reynolds: And so you really took the reins of that, and wow, you're an award winner with the IAPP.

[08:14] You're just such a bright light. And I love the way that you bring people together. I think it's really important.

[08:22] One thing that I want your thoughts on.

[08:26] I'm gonna call you an international man of mystery, right? So you understand a lot of what's happening internationally, but the Philippines interests me a lot in data privacy and data protection.

[08:38] I guess I'm very interested in countries that passed data protection regulation in the window after the EU's Data Directive but before the GDPR.

[08:57] The Philippines is one of those countries.

[08:59] And so it's been really interesting to see how you all have done that. But tell us a little bit about the Philippines and what makes it different, in terms of, you know,

[09:11] it's really.

[09:12] Jon Bello: It's different, and I'll tell you why. So culturally,

[09:16] the Philippines is very, very different from the United States or Europe.

[09:22] Why am I saying that? Most Asian countries, like the Philippines, are so family-centric,

[09:28] right?

[09:28] Being part of a group is very important for the Filipino. That's why, if you notice, you see so many social events, so many groups, so bonded together.

[09:42] And that's why I'm gonna tell you, I gotta backtrack a bit.

[09:46] Way back in the 1960s,

[09:50] there was already a privacy issue being deliberated upon by the Philippine Supreme Court,

[09:57] okay. About privacy of communications.

[10:00] And they in fact quoted a Filipino writer about the general concept of privacy in the Philippines.

[10:11] Because according to that quote, what is privacy?

[10:16] There's no such thing in the Philippines.

[10:19] In fact, Debbie,

[10:21] there is no translation,

[10:25] there's no Filipino translation of the word privacy.

[10:29] There's no Tagalog version of privacy.

[10:32] So way back in the 60s, it was thought of as more of a Western concept. And that is why, fast forward,

[10:41] when the law was passed, people were really struggling with the concept of privacy. Because again, we're so family-centric, nothing is really, quote unquote, a secret, because everybody knows what's happening.

[10:56] And that's why, before the passage of the privacy law,

[11:01] if you would look at our office IDs, right,

[11:05] when you look at the back of the office ID, you will see a lot of sensitive information there. You have the Social Security number,

[11:13] tax identification number.

[11:15] Even your blood type is at the back of your ID.

[11:19] And why do you have those? For convenience: if you go to the bank, they would usually ask for identification numbers, and you just show them the flip side of the ID.

[11:30] And for emergency purposes. Right. And you know, the Philippines is really prone to disasters,

[11:37] so if there's an emergency,

[11:39] Having your blood type available will really be very helpful to the responders.

[11:47] So that's why it was a real struggle for any Filipino organization or any Filipino individual to embrace the concept of privacy immediately after the law was passed.

[12:00] It was still sort of an alien concept because of the Filipino culture, which, as I said, is family-centric, with a lot of openness. And there was that attitude of, I have no secrets to hide.

[12:14] Anyway, that was the prevalent thought process. And that is why, you mentioned the National Privacy Commission, you have to give a shout-out to the National Privacy Commission. They're really doing a very good job at educating Filipinos, not just organizations but individuals, on the right to privacy.

[12:36] Why is it important to protect your data?

[12:39] What are the harms that could happen in the event of a breach?

[12:44] And the focus is that they're managing information upwards, downwards, sideways, so everybody is part of the information awareness campaign.

[12:58] And that's why you see the Privacy Commission being very creative in the way they send information, whether through social media,

[13:08] or through jingles or ads. So they're trying to target everybody,

[13:15] especially the children.

[13:17] And they're also focusing on children, because that is the sector that needs real protection.

[13:25] So. And that's why,

[13:27] sadly,

[13:28] somebody told me, Jon,

[13:31] what else is there to protect for us? Everything's out there on the web. We have all been breached. It's such a pessimistic outlook. And I'm telling everyone, yes, we should understand the situation, right?

[13:45] So everything's out there. But at the very least, we must protect or look after the children.

[13:51] Let's ensure that, okay, while our data is compromised,

[13:55] let's make it our mission and our vision to, at the very least, protect the data of the next generation.

[14:01] And that's why, again, we need to give a good shout-out to the National Privacy Commission, because they're really doing a very fine job in communicating privacy awareness to everyone, from the children to the senior members of society.

[14:17] Debbie Reynolds: I agree. And so for people who don't know what the National Privacy Commission is doing, definitely go on their website, check out the stuff they're doing, sign up for bulletins.

[14:29] They do an incredible job. Because I feel like in the US it's like, okay, read 80 pages of a privacy policy and click a box and all of a sudden you're like, educated, right?

[14:40] Whereas, like you say, using jingles and different things, every day there'll be something, even if it's just one little tidbit. And so doing it that way really helps the education.

[14:53] It helps keep it in people's minds without throwing so much information at them. And so I like that approach, because it's very human, and it talks to the things that people need to know, as opposed to being overly legalistic and so complicated that people can't,

[15:09] like, understand it.

[15:11] Jon Bello: Yeah. And just to add, they're doing it in

[15:14] both English and the Filipino language, so that they can capture more people in understanding data privacy. But you mentioned the U.S.,

[15:27] right.

[15:28] But you know what's funny is that,

[15:30] and I've always talked about this,

[15:33] the Philippine Constitution is basically framed or patterned after the Constitution of the United States. If we look at the Bill of Rights, they're almost identical,

[15:45] right? It's because the Philippines used to be a colony of the United States, from 1898

[15:53] to the 1940s, when the United States finally granted us independence. So the basic government structure and the Constitution are patterned after the United States framework.

[16:10] And why am I saying that? Because the Philippines is actually a mixed bag, right?

[16:15] We have common law,

[16:17] coming from the United States,

[16:19] but we also follow European legal traditions, like the civil code, which came from Spain, because we were also a colony of Spain for 300 years.

[16:33] So we have a mixture of European and U.S. influences in terms of regulatory frameworks. And that's why, going back to the United States, why am I pointing that out?

[16:46] Because when we talk about privacy, we would look toward the United States and Europe as well,

[16:55] in terms of, what should we do?

[16:57] What should the frameworks be?

[17:01] And that's why, even before the data privacy law was enacted, certain industries, especially the BPOs, were already familiar with the privacy laws of the United States, because they are required to comply as vendors of U.S. companies: in the health industry,

[17:18] the telcos,

[17:19] the financial institutions that outsource business to the Philippines through vendors.

[17:26] And you know this very well: they are required to comply with privacy and security regulations.

[17:32] Let's say, for example, U.S. healthcare. If they have companies offshoring work here, and that means we're processing in the Philippines the health data of Americans, then we need to follow the HIPAA rules.

[17:46] Right? So those things have to be applied, and that's why you have a lot of auditors coming from the U.S. to the Philippines, doing penetration testing of BPOs to check that they're compliant with the security standards of the United States, especially those industry requirements,

[18:08] like PCI DSS.

[18:09] And those are actually concepts I learned when I was in the BPO industry, because I joined a BPO in 2007. Before that, I was a lawyer in a law firm.

[18:19] Right.

[18:20] That was how I got exposed to privacy concepts, industry-based concepts. Now we have a legal framework. So like I'm saying, the BPOs are actually complying with U.S. industry laws and regulations on privacy, and at the same time complying with Philippine laws on privacy.

[18:50] And they also have to be compliant with the GDPR.

[18:55] So there are certain industries here in the Philippines that are actually trying to comply with all three legal frameworks, especially those in the BPO industry.

[19:09] Debbie Reynolds: Well, and that's fascinating. Thank you so much for that background.

[19:13] Now I know we talked about the Philippines but I want your thoughts on Asia in general around data protection and privacy.

[19:22] I am sometimes upset that we don't talk enough about that region, which I feel is pretty mature, in my view, because those countries do have laws and have had them for a long time.

[19:39] You know, they're very.

[19:41] What's the word? I don't know the best words to talk about them. But you know, when I've read laws like in Japan and different things, I think, okay, that's interesting.

[19:50] You know, I like the way that they're thinking about it. So for people who are mostly thinking about, you know, US versus EU, we talk a lot about that.

[20:00] What is it that people need to know or understand when they're thinking about data protection or privacy in Asia?

[20:08] Jon Bello: Very good question. Okay,

[20:10] so what we've encountered so far, especially when helping clients from either the US or Europe,

[20:19] for example,

[20:21] is that there's an expectation that, since this is already GDPR compliant,

[20:27] therefore it should also be compliant in the Philippines or in any other country in the Asia region. And that's really problematic, because yes, the GDPR is really the gold standard for data protection, especially in Asia.

[20:50] But the problem there, of course, you need to talk about the Brussels effect. The thinking that yes, we have the gold standard, whether in privacy or in artificial intelligence,

[21:02] so it should also be applicable, apples to apples, to the other countries, especially those who want to adopt this standard.

[21:12] The problem there, as background, like for the Philippines, and it holds true for any other Asian country, is that there are differences in culture,

[21:21] there are differences in government structures, and of course security.

[21:27] Security situations will be different as well, country to country.

[21:32] So why am I saying that? There will be a lot of complications in the application of, let's say, policies.

[21:43] There's a thought process that, okay, let's have a single global policy for all countries,

[21:51] say for the Philippines or for Singapore or for Thailand or for Malaysia.

[21:57] The problem is that all these countries,

[22:00] right, have data privacy laws that were most likely enacted in consideration of local cultures,

[22:08] right? Local requirements.

[22:11] So there will be differences,

[22:16] right?

[22:16] And the devil is usually in the details.

[22:19] So you cannot expect that your global policy,

[22:22] when you apply it to a particular country,

[22:27] will be applicable hook, line, and sinker. And I've seen some policies that don't really match the requirements.

[22:38] Under the Data Privacy Act, for example,

[22:42] in the Philippines,

[22:43] you are required to have a privacy manual,

[22:46] right? Not just a privacy policy,

[22:48] but a privacy manual.

[22:51] That concept, having a privacy manual, is something which many,

[22:56] many companies do not have. They have policies on IT,

[23:01] they have policies on privacy,

[23:04] but they don't have a privacy manual. A privacy manual is really not just a collection of privacy policies or security policies; it also lays down the governance principles being implemented by a particular company,

[23:21] right? I've seen many questions from clients: Jon, what is a privacy manual?

[23:27] Why is the NPC asking for a privacy manual? We have a privacy policy already; is that sufficient? Yes, I know where you're coming from,

[23:36] but there are some things that are quite unique to the Philippines which we need to address as well.

[23:44] Data protection officers, right? In the Philippines, a data protection officer has to be registered with the National Privacy Commission

[23:54] if, let's say, for example, you have an organization with 250 employees, or one processing the sensitive data of 1,000 individuals.

[24:06] In other countries,

[24:07] there's really no strict requirement on the registration of the DPO, but in the Philippines it's a requirement. And according to the NPC, a DPO has to be an organic employee,

[24:22] right? In other countries you're allowed to have external DPOs, DPO vendors. In the Philippines, it has to be organic.

[24:32] But you can outsource the functions of a DPO to third parties,

[24:39] right? So that's at least the compromise.

[24:42] But still, that outsourced vendor, the one providing the DPO functions,

[24:49] right,

[24:50] has to report directly to the DPO. Why? Because for the NPC,

[24:56] under the Data Privacy Act, one of the principles is accountability.

[25:01] So one way to demonstrate accountability is having a DPO as part of the organization.

[25:08] And that's again one of the unique things about the Philippines. And that's why,

[25:15] to emphasize the point, you have to consider that other countries would also have the same types of nuances.

[25:25] And let's talk quickly about artificial intelligence law. In Japan,

[25:30] it's a soft law.

[25:32] There are no penalties,

[25:34] right, for prohibited acts at the very least. But there's more room for innovation.

[25:41] The Japanese law incentivizes innovation, for now, I suppose.

[25:47] So there are not really penalties to be imposed under that law. Unlike, of course, the EU AI Act, where there are specific prohibited activities.

[25:59] And now, in the Philippines, there is still no law on AI, but it's actually a top priority of the government to pass an AI law.

[26:11] And what we've seen so far,

[26:13] I think one of the prominent features of the AI regulation that we are seeing is that there's a requirement for the registration of high-risk AI systems.

[26:27] There are a lot of obligations on the part of an AI developer, and a deployer as well.

[26:37] It also addresses the proliferation of deepfakes,

[26:41] misinformation, and disinformation caused by artificial intelligence.

[26:47] For example,

[26:49] in the proposed law that I've seen,

[26:52] if the developer already sees that the AI would do harm,

[26:59] it must have a mechanism in place to completely shut down the system.

[27:06] So yeah, in that sense, for me it's a good law. But I think where it's different from the EU AI Act is that it doesn't have an enumeration, a listing, of prohibited acts or activities.

[27:26] Like using emotion recognition technologies in the workplace, especially if they're going to be used for assessing a person's performance or productivity.

[27:38] I think that is a prohibited use of AI in the EU. In the Philippines right now,

[27:46] there is no equivalent prohibition under the proposed regulations I've seen.

[27:53] And what's funny about the situation is,

[27:57] let's go back to the nuances, right? Or rather, let's now talk about similarities between Europe and the Philippines.

[28:07] The Philippines is a strong adherent of human rights.

[28:11] The same holds true for Europe.

[28:14] In fact,

[28:15] the Philippines is a signatory of the Universal Declaration of Human Rights.

[28:21] In fact, the Data Privacy Act,

[28:24] when you talk about it, you cannot separate it from the right,

[28:29] the human right, to privacy.

[28:31] Unlike in the United States, where with privacy you would tend to veer away from calling privacy a human right; you're more likely to say privacy is a human value,

[28:44] right? I think that's a more acceptable term for other countries. But in the Philippines, you see, privacy is a human right.

[28:53] So okay, going back to the situation of the EU AI Act: if it's prohibited in Europe,

[29:02] the emotion recognition technology,

[29:05] right,

[29:06] assessing whether you're sad, whether you're angry,

[29:10] whether it will affect your productivity, and you have a score, and later on they're going to evaluate you based on your emotions, right?

[29:17] So if it's prohibited in Europe, then in the Philippines, why is it not prohibited,

[29:25] right? So what's the difference? Should it be prohibited here as it is prohibited there? Because they're saying that the technology is based on pseudoscience, or science that really has inherent issues,

[29:40] right? So why is it not prohibited in the Philippines as well?

[29:46] So on that note, this is something that our regulators need to look into as well.

[29:55] Because if you apply that technology in the Philippines now, it is acceptable as long as you apply the data privacy principles.

[30:04] If there's transparency, if there's consent,

[30:07] right, a legitimate interest,

[30:09] then the technology will be allowed in the Philippines, especially with the consent of the individual. But what's nice about the European model is that

[30:19] you remove consent from the situation, because the European model,

[30:27] it recognizes the power imbalance between the employee and the employer. For example, here,

[30:33] if you want to be an employee at a particular company, you're going to sign everything, even the consent forms, because you want to be hired, you want to get the paycheck.

[30:43] At the end of the day, who cares about that? I need money to feed my children,

[30:49] especially in the Philippines. Right.

[30:52] So I think our regulators need to look into those things.

[30:58] Take it outside of the concept of consent,

[31:03] remove it, because it's time to also protect the individual directly. Sometimes the regulators need to use their influence for the better protection of individuals.

[31:17] Do not leave it up to the company or the individual to decide it among themselves.

[31:22] Because there are certain situations where I think we need to protect the individual and remove that voluntariness, because at the end of the day it's not really voluntary,

[31:35] especially in the workplace,

[31:37] given the power imbalance between the employer and the employee.

[31:42] Debbie Reynolds: That's so true. Right.

[31:44] It's funny that you mentioned that about the imbalance between the employee and the employer in the U.S.

[31:50] as you know,

[31:52] the CCPA in California,

[31:55] they have given employees rights around their data. And the other states are just going bonkers. Like they're so upset about this. Right. Because typically if you're an employee, you have no rights to privacy or anything like that.

[32:10] A lot of companies in other states are concerned.

[32:13] Actually I have clients where they have employees in California. So you need a separate policy for them.

[32:21] And other people are like, well, why can't I have those?

[32:24] You know, I'm in Arkansas, so I agree with what you're talking about, the power imbalance.

[32:29] And you do really want to target the human regardless of what their station in life is at that point. Right. Like say your grandma,

[32:39] she may not be an employee, but she needs protection. Or your kid may not be an employee, they need protection. So I like the way that you're thinking of that.

[32:47] Jon Bello: Yeah. Okay, so going back to these employees. Correct me if I'm wrong, they are at-will employees in the United States.

[32:55] Debbie Reynolds: Yes. Terrible.

[32:59] Jon Bello: Yeah. Because when I was in the BPO, it was an American company headquartered in Orange County.

[33:08] So we always had conversations about employment, and again, they would ask, Jon, why is it so difficult

[33:16] to let go of people in the Philippines?

[33:18] Right. Because unlike in the United States,

[33:22] in the Philippines you really have to go through a rigid termination process before you can terminate an employee. You have to go through a lot of requirements, due process and all that,

[33:37] unlike in the United States.

[33:39] So there are nuances.

[33:41] In different parts of the world, when we talk about labor, there would be different rights,

[33:47] different concerns, different ways of doing things. The same holds true for data privacy as well.

[33:54] Debbie Reynolds: Perfect. Oh, this is amazing.

[33:57] What's happening in the world right now that's concerning you most around privacy or data protection? Something that you see that's emerging where you're like, oh, wow,

[34:08] I'm kind of concerned about how this is going.

[34:10] Jon Bello: Everyone's concerned about this, right?

[34:13] The risks brought about by artificial intelligence.

[34:19] Okay, why am I saying that? Definitely, I am a fan of AI,

[34:24] right? And there's this, I think, Harvard professor who said that AI will not replace humans,

[34:33] but humans equipped with AI will outperform humans who do not use AI.

[34:41] So it is something that we really need to embrace right now, either as individuals,

[34:48] as employees,

[34:49] or as an organization as a whole.

[34:54] We have to embrace artificial intelligence as part of life now.

[35:01] It's been there for a long time; it's just that now the technology is more advanced.

[35:06] And that's why you have all this artificial intelligence that is really making you more productive, that is really helpful in so many ways, first and foremost generative AI. But we need to caution a lot of people on the use of artificial intelligence.

[35:24] They're talking about AI literacy, right? It's very important to know about AI literacy.

[35:29] But as I said, we have not even mastered privacy literacy, and we're going to go to the next level of talking about AI literacy.

[35:37] Master privacy literacy first. But we have no choice; we have to jump into the world of AI, because it's really in our faces right now.

[35:46] And what we need to do is educate everyone again,

[35:50] right? Just as we educate everyone on privacy, we have to educate everyone on the use of artificial intelligence,

[35:57] because, as we know, there are so many studies showing that generative AI hallucinates. It could give you the wrong information.

[36:11] I think there are really classic examples out there.

[36:14] If you engage with AI in such a way, it might even encourage a person to commit suicide,

[36:22] right? Because of the exchange of prompts; it depends on the way you feed the AI.

[36:28] There are some cases where it might encourage suicidal tendencies in some people as well.

[36:37] In fact, there's this case we studied in my class about a person and an AI conversing with each other.

[36:46] It came to a point that the AI was telling that person he was living in a matrix, that they were in the matrix, and that everything around him was not real.

[36:56] Right. You have to wake up.

[36:58] And it also got to a situation where the AI told the person that if you believe you can fly, then you won't fall,

[37:12] because again, they're in the matrix construct. So there are a lot of benefits, but there are a lot of dangers as well.

[37:20] Well,

[37:21] again, in the Philippines, for example, we don't have a law on AI,

[37:27] nothing to regulate it.

[37:29] So we have to rely on the policies that we have.

[37:33] But with the policies out there, it's still a question of which policy you are going to adopt.

[37:40] There are many policies out there, right,

[37:42] but their adoption is voluntary.

[37:45] Now, if you don't have policies, what do you do? You have to rely on ethics.

[37:50] Okay. So there are a lot of problem areas as well. Going back to the situation, we need to raise awareness now more than ever, whether it's about privacy or cybersecurity.

[38:03] Because AI tends to amplify things either positively or negatively.

[38:10] So yeah, we have a lot of things to do.

[38:14] I'm really concerned about AI.

[38:17] Again, I like AI; there are so many benefits. But we also need to address the risks and challenges of the use of artificial intelligence. And I don't want to just mouth off what's already been said by so many people:

[38:32] there still has to be human oversight at the end of the day, right? When we talk about human oversight, it means you as an individual should not necessarily trust what you see, what you get.

[38:47] And you know this very well as an expert. Yeah, trust but verify,

[38:52] right?

[38:52] Debbie Reynolds: Totally.

[38:53] Jon Bello: That's one thing that we need. That basic, simple concept is necessary more than ever for the people who use AI. We cannot rely on it.

[39:04] We can probably depend on it, but do not rely on it for the truth,

[39:09] right? Because it's not really a source of truth. The way generative AI works is based on probabilities, not on settled truth. So if the probabilities out there are based on false assumptions,

[39:24] then it will create a response that could possibly be wrong, because it responds with information based on false assumptions.

[39:32] So, for example, a classic case, right?

[39:36] In the Philippines,

[39:37] at the very least in my opinion, biometric data is not sensitive personal information. Why is it not sensitive? Because it's not in the list of sensitive data according to the law.

[39:51] Because in the Data Privacy Act, you have a classification of personal information and what will be considered sensitive personal information:

[40:01] government identifiers, health data,

[40:05] race, ethnicity, age.

[40:09] But it doesn't include biometric data

[40:13] as part of the listing.

[40:16] Right.

[40:17] So when I was doing research on AI, I was playing around with AI at the time, to see how it responds to certain things.

[40:27] It gave me information saying that biometric data is sensitive personal information in the Philippines.

[40:37] So I told the chatbot, no,

[40:41] biometric data is not sensitive personal information in the Philippines.

[40:47] So it responded:

[40:49] you are wrong.

[40:50] Right. The chatbot told me, you are wrong. And I replied, okay, so it's challenging me. Anyway, my next response was, and it's all documented:

[41:01] If biometric data is sensitive under the Data Privacy act, why is there a proposal in Congress to amend the law to include biometric data?

[41:15] So the next response was this: you are right.

[41:18] There's a proposal. Okay, so it's not sensitive personal information.

[41:21] Yeah.

[41:22] Okay, so why did it come up with the idea that biometric data is sensitive personal information?

[41:30] Because when it crawled through the web and got information,

[41:36] it saw a lot of websites in the Philippines which include biometric data in the list of sensitive data. Because there are so many websites out there,

[41:49] right, it got information based on probabilities.

[41:53] If there are so many websites saying it is sensitive,

[41:57] it must be sensitive personal information.

[41:59] And in fact, it's not.

[42:01] And that's why there's

[42:03] an amendment in Congress to include biometric data as part of the Data Privacy Act. And that's why,

[42:10] as we mentioned earlier today, it may help to trust but verify what you get.

[42:17] Debbie Reynolds: That is a hilarious example.

[42:20] And I tell people,

[42:21] especially around privacy and laws and stuff, do not trust the chatbot with that, because they're terrible at laws. Especially in the US, we have a lot of laws,

[42:35] because it's so hard to pass laws,

[42:37] and a lot of times you have jurisdictions that are amending laws,

[42:42] and the chatbots don't understand the amendment part of it. So they'll give you wrong information because they don't know the status

[42:49] Jon Bello: There you go.

[42:50] Debbie Reynolds: Of what it is.

[42:51] Jon Bello: There you go.

[42:51] Debbie Reynolds: Like you talked about, the amendment that they were proposing.

[42:55] That just falls right in line with what my experience has been. I'm arguing with the chatbot too, like, no, you're totally wrong, this is right.

[43:03] That's hilarious.

[43:04] Jon Bello: Really funny. I was really laughing at the time, because why am I getting into a conversation with a chatbot? It was becoming too personal for me.

[43:16] Debbie Reynolds: Yeah, right. Oh, my goodness.

[43:18] Jon Bello: Well, but then again, right,

[43:20] we need to utilize the power of AI; it can offer a lot of solutions.

[43:28] We cannot shun artificial intelligence; we have to embrace it. Right? But we need again to put up, again, the often-used word, guardrails. Right?

[43:40] Debbie Reynolds: Yeah, yeah.

[43:42] It's funny, because I feel like with media sometimes,

[43:47] I guess when you're in the middle, you're not that interesting. Even with reality shows, you have to be at one extreme or the other.

[43:55] And so a lot of the news we get on AI is this really bad dystopian edge case, or the I'm-going-to-lock-myself-in-a-shed-for-30-years, off-the-grid type of thing.

[44:10] Where really I feel like we should be more in the middle, where we're like, okay, let's find out: what's the best way to use it? What's the best way not to use it?

[44:18] Let's not pretend like you can just shut the door and it's going to go away. Or, like an article that actually was published saying, oh, the world's gonna end in two years because of AI;

[44:29] it's not gonna happen. So just having more humans thinking in that middle space about what makes the most sense, I think, is the best way to go.

[44:40] Jon Bello: Yeah, you're right, because the media tends to overhype each side. Either they overhype it positively,

[44:50] right, or overhype it negatively. Why? Possibly because sometimes they overhype the potential of AI to the border of almost selling snake oil,

[45:02] right, with respect to the capacities of AI. But then again, the other side of the coin is that the negativity is something that we also need to address.

[45:13] Right. Again, as you mentioned a while ago,

[45:16] sometimes they're saying it's the end of the world,

[45:20] but it's really up to us. That's why we need to put up the guardrails.

[45:26] Debbie Reynolds: Yeah, I always say, if you're so smart, let's go take a swim.

[45:32] Jon Bello: Exactly. Exactly.

[45:35] Debbie Reynolds: Well, if it were the world according to you, Jon, and we did everything you said, what would be your wish for privacy or data protection anywhere in the world? Whether that be regulation,

[45:46] human behavior or technology?

[45:48] Jon Bello: Yeah. Okay, so let's go back to my very first statements.

[45:54] Right? We have to share knowledge as much as we can with each other,

[46:00] because we need to learn from the lessons learned by other jurisdictions.

[46:09] We need to learn from our own mistakes.

[46:13] We need to share those mistakes with other people, other jurisdictions, other organizations, so they won't make the same mistakes. Let's not be too hung up on confidentiality.

[46:27] Right. We shouldn't be so hung up on that. Right.

[46:30] There are many ways to share knowledge without necessarily compromising some sensitive information.

[46:38] We need to share as much as we can.

[46:40] Right.

[46:42] It's the only way for us to move further ahead of the challenges. I think in cybersecurity,

[46:49] the bad actors are so good, right? At knowledge sharing,

[46:54] right? They're so good at it,

[46:58] and they share knowledge. Very open, very public, right? Very public.

[47:03] They say, we saw this vulnerability, or we have this new method of attack, a new attack vector. And they're not shy about sharing information.

[47:14] If we learn of something,

[47:16] whether it affects privacy, whether it affects security or the use of artificial intelligence, we should also be able to share knowledge as much as we can. It's the only way for us to be ahead,

[47:30] to be ahead of the curve.

[47:32] Right. Otherwise we will even be falling behind.

[47:36] Right. So in the world of data privacy, we have to approach privacy as a coordinated effort.

[47:48] And my wish, for data privacy especially, is that we should find a way to have a common mechanism to enforce regulations. For example, if you're in the Philippines,

[48:05] or Thailand or Malaysia, what if the organization is located outside of the country?

[48:12] What is the enforcement mechanism, especially if you don't have a binding agreement between this Asian country and the organization,

[48:21] which is located offshore,

[48:25] right? So we need to have one.

[48:28] Because even if we, let's say, for example, say, okay, our laws apply to you guys,

[48:33] right?

[48:34] That's on paper. I think we need to understand what the mechanism is to enforce that law across jurisdictions as well. I think that's what we need in data privacy. In cybersecurity,

[48:47] I think we're,

[48:48] we're better off, because you have treaties, right? You have the treaty on cybercrime, and you have the signatories. And as long as you have a similar situation,

[48:59] if it's a crime in your country and a crime under another country's laws, then you can extradite a person. So there's at least a mechanism in the world of cybersecurity for how to enforce regulations against cybercrime.

[49:15] I don't think the same holds true in the world of data privacy. So I think it's those two things,

[49:23] which are, right, number one,

[49:24] knowledge sharing is key, and second,

[49:28] an enforcement mechanism that we can really rely on, on a country-to-country basis. So there.

[49:35] Debbie Reynolds: Those are big wishes. I love those. I love those. And I agree with you.

[49:40] Jon Bello: You can always wish, you can always dream. Totally.

[49:43] Debbie Reynolds: We totally can. We totally can.

[49:46] Jon Bello: Dreaming is free. We just hope that it materializes, right?

[49:51] Debbie Reynolds: That's so true. Oh, my gosh. Well, thank you so much. I'm so excited.

[49:56] Thank you. I'm so happy to have you on the show. And yeah, I'm sure we'll be able to talk soon and collaborate some more. It's fantastic, and I'm so happy we were able to meet here online again, and hopefully next time we meet in person.

[50:13] Jon Bello: Yes. Finally.

[50:17] Debbie Reynolds: All right. Talk to you soon. Thank you so much.

[50:20] Jon Bello: Thank you, Debbie.

[50:21] Debbie Reynolds: All right, bye.
