E241 - Phillip Mason, Global Privacy Program Manager, Corning Incorporated

[00:00] Debbie Reynolds: The personal views expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.

[00:13] Hello, my name is Debbie Reynolds. They call me the Data Diva. This is the Data Diva Talks Privacy podcast, where we discuss data privacy issues with industry leaders around the world, with information that businesses need to know.

[00:25] Now,

[00:26] I have a very special guest on the show,

[00:28] Phillip Mason.

[00:30] Phillip Mason is the Global Privacy Program Manager at Corning, Inc.

[00:36] Welcome.

[00:37] Phillip Mason: Thank you. Thank you. Thank you so much, Ms. Debbie.

[00:40] Honestly, I can't believe I'm here talking to you. You have had so many incredible and brilliant guests on here,

[00:46] and I'm not putting myself down by saying I'm not of their caliber, but I am so honored to be here nonetheless. Thank you.

[00:53] Debbie Reynolds: Well, you're so gracious and smart.

[00:58] We've been connected on LinkedIn for many years. You're such a great champion of privacy and all of us in the privacy community,

[01:06] and you always have so many really interesting points of view that you're able to get across. And so I love that about you and I'm so glad that you're able to be on the show today.

[01:18] Phillip Mason: Thank you.

[01:18] Debbie Reynolds: I'd love for you to tell us your background. So in addition to being a whip-smart attorney, you're also a CPA.

[01:27] You have more letters behind your name than alphabet soup. But I want you to tell us your journey and how you got into privacy in your career.

[01:37] Phillip Mason: All right, well,

[01:38] first I want to give a shout out to just the terrific team I work with at Corning. So shout out to Dan, Lucy, Harish, Fonda, Clara, and Richard. Thank y'all for letting me be a member of y'all's team.

[01:48] Y'all are great.

[01:49] As far as my background goes, I started off my career in investment and tax consulting. I was a CPA and I have a master's degree in accounting.

[01:57] After doing that for a few years, I went back to law school,

[02:01] which was the worst three years of my life. But I,

[02:05] in any event, got licensed here in the state of Texas. After graduating, instead of going to work as just a quote-unquote pure lawyer, I decided to leverage all my previous experience and education and went to work for one of the largest investment advisory firms here in Houston,

[02:21] Texas. After being there for several years, I left there and joined a leading global insurance organization providing a wide range of property casualty insurance and other financial services.

[02:32] My focus there was mainly in the financial services, but I also assisted a ton in compliance and legal reviews of their financial services products. It was during my time at that organization that the GDPR kicked off.

[02:43] And given that it was and still is a global organization, the GDPR was very applicable to them in several areas.

[02:51] I got plenty of exposure to GDPR, and I found that it rather piqued my interest because it was also new and exciting back then.

[02:59] But after 15 years or so of working in very large organizations, I decided I needed a change of pace and joined a small startup that was involved in SaaS and computer hardware manufacturing as their counsel and DPO.

[03:12] It was during that time I became uber-immersed in all things data protection and privacy. It was also during that time that I knocked out all those IAPP certifications, which is something the IAPP seems determined to keep increasing the number of.

[03:25] So shout out, IAPP.

[03:28] And so after that I did some privacy-related consulting, and I wound up in my role at Corning. A friend of mine forwarded me the notice about this particular role, and I threw my hat in the ring and

[03:41] was just really blessed to land it.

[03:44] That's how it came to be where I am.

[03:47] Debbie Reynolds: That's quite a journey. I love to hear from people who don't take the straight path into privacy. I think all of us fall into it for different reasons, but I think one of the threads that runs across a lot of people I find in privacy is that they really have that passion or compassion for humans, and so they try to use their skills or their superpowers to be able to help humans.

[04:14] Tell me, what is happening in the world today

[04:19] in privacy that's concerning you?

[04:22] Phillip Mason: You know, I think there are two areas.

[04:25] I was listening to a podcast recently that featured a highly respected lawyer from a prominent law firm in Ireland.

[04:33] And during that podcast, the gentleman noted that several of his small and medium-sized clients were declining to put their AI products into further development because of the overall cost of attempting to comply with the new EU AI Act.

[04:52] And I don't think stifling creativity serves any of us well in the long run.

[04:58] For all of the faults that progress can bring to us,

[05:04] ultimately I think it is in the human spirit to keep driving forward, to keep seeing what's over the next horizon.

[05:12] And if people aren't bringing forth those ideas because of fear of, you know, the cost or litigation,

[05:21] I don't think that's very,

[05:23] that serves us well in the long run.

[05:25] And I also think it leaves us to be further dominated by existing big tech, which I think we could all say, you know, does need some tweaking and adjustment around the edges, to say the least.

[05:37] So that's my first concern, is that it's easy to be a critic.

[05:41] It is far more difficult to be creative; that, ultimately, is my view and opinion on things.

[05:48] The other concern, or I guess observation, that I have involves, obviously everyone's going to say it, AI and how these systems lend themselves to being prediction machines. And I think as important as prediction machines are in a decision-making process,

[06:08] I think we forget that the prediction aspect of a decision is only one component of a decision.

[06:16] There are other key elements to making a decision.

[06:20] So when you're making a decision, you're taking input from the world that enables the prediction. That prediction is possible because training occurred about relationships between different types of data and which data is most closely associated with the situation.

[06:34] You need to combine the prediction with judgment about what matters to the decision maker.

[06:48] Then you combine the prediction with judgment and that judgment will then in turn lead to an action of some sort. That action leads to an outcome.

[06:59] So the outcome is the consequence of the decision and that outcome actually can help provide feedback to help improve the next prediction. So my point in saying that is that as prediction machines become better,

[07:11] faster and cheaper,

[07:12] we're going to use more machine predictions versus human predictions.

[07:19] So now we have more predictions by machines,

[07:22] so those become cheaper sooner. But now the other components come into play all the more.

[07:28] So the judgment component of a decision becomes extremely important.
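
For readers who want to see the decision loop Phillip describes laid out concretely, here is a minimal, illustrative Python sketch of that flow (input, prediction, human judgment, action, outcome, feedback into the next prediction). It is a hypothetical sketch only; the function names, fields, and thresholds are invented for illustration and are not anything used by the guest or by Corning.

```python
# Illustrative sketch of the decision loop described above:
# input -> prediction -> human judgment -> action -> outcome -> feedback.
# All names, fields, and thresholds are hypothetical, for explanation only.

def machine_prediction(input_data: dict) -> float:
    """Stand-in for a trained model: returns a score between 0 and 1."""
    return min(1.0, max(0.0, input_data.get("risk_signal", 0.5)))

def human_judgment(prediction: float, context: dict) -> str:
    """A person weighs what matters to the decision maker, not just the score."""
    if context.get("affects_personal_data") and prediction > 0.3:
        return "escalate_for_review"  # the "say it out loud" moment
    return "proceed" if prediction < 0.7 else "decline"

def decision_loop(input_data: dict, context: dict, history: list) -> str:
    prediction = machine_prediction(input_data)   # cheap, fast, scalable
    action = human_judgment(prediction, context)  # judgment stays human
    history.append({"prediction": prediction, "action": action})  # feedback for the next prediction
    return action

history: list = []
print(decision_loop({"risk_signal": 0.8}, {"affects_personal_data": True}, history))
```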

[07:35] And so we have to recognize, with such a judgment,

[07:41] how are these decisions being made, and by what type of people are they being made? Because I think everyone says it's great to have a human in the loop. We want that human judgment.

[07:50] And I would agree to a point.

[07:52] But what if the judgment is made by a person that isn't the greatest? Do we want that person making the judgment?

[07:59] And so I think it's hard to separate out judgment and character; we want to make sure it's not just having a human in the loop, it's having a human of character in the loop.

[08:11] A person that is willing to say, you know what,

[08:14] maybe this isn't the best decision we should make.

[08:17] For example, recently on LinkedIn, as the news went around,

[08:22] LinkedIn opted in members to train LinkedIn AI on their personal data. They, you know, didn't really provide great notice about that and then made the opt-out process rather tedious.

[08:39] I find it almost incredible to believe that there was no human in the loop in that assessment. But I find it even more incredible that when people were sitting in the room and they said out loud, yeah, we're just going by default, we're just going to use every user's personal data to train AI.

[08:58] Everyone who's cool with this raised their hand.

[09:01] Did no one stop to say, hey, maybe this isn't the best thing regarding your members?

[09:08] Did it even occur to them to maybe at least offer users something for it? Hey, if you let us do this, we'll give you a free month of LinkedIn Premium, something along those lines,

[09:16] and not to pick on LinkedIn, but I think that applies to a lot of situations where people are so concerned about the impact of AI, and I'm concerned about the impact of AI as well,

[09:29] but I'm far more concerned about it from the human judgment aspect of it. Because once again, as the prediction version of a decision becomes cheaper,

[09:38] the human judgment aspect of that decision becomes much more important.

[09:44] Debbie Reynolds: Wow, that's a lot to think about.

[09:45] Thank you for that.

[09:47] I agree with that. I think,

[09:49] I guess two things.

[09:51] One is that I am concerned because I see a lot of companies trying to use artificial intelligence as a replacement for human judgment, and I think that's totally the wrong way to go.

[10:05] But then you just went another step further, which is, you know, the character of the human that's making the judgment, that's another thing. So one is,

[10:14] you know, you don't want a situation where an organization is using a technology.

[10:19] It may do something that's harmful or bad or something that a customer doesn't want. And then their answer is like, well, we don't know what happened because the AI told us to do it.

[10:27] Right?

[10:28] So we know that laws and regulations are really about people and behavior and even companies, how they behave. And so I think trying to sit in the backseat while you let all these other technologies try to do the heavy lifting, you know, there's nothing wrong with doing the heavy lifting,

[10:45] but there really would be an absence or a problem if there wasn't a human who was in the loop. But then having that judgment, and that character, is really important.

[10:59] I think, you know, with your example also not to pick on LinkedIn,

[11:04] one thing I would tell people is that anger will not be automated.

[11:10] Right.

[11:11] Phillip Mason: That's a great point.

[11:13] Yeah,

[11:14] that's spot on.

[11:17] I also wonder sometimes, when we're talking about AI being a black box and how we really need to put a lot of effort into making it more explainable, and there are some serious efforts being made towards that.

[11:30] And I am very much in favor of that. But sometimes I wonder if people put expectations onto AI being explainable that we don't impose upon ourselves.

[11:42] For one example, take a jury decision.

[11:45] I mean, a jury decision,

[11:47] almost by definition, is a quote-unquote black box.

[11:51] You will never truly know what happened in a deliberation room. I mean, there are no recordings or transcripts coming out of that.

[11:59] And so sometimes you get those crazy decisions out of a jury and you wonder how the heck did they come up with that?

[12:05] Well,

[12:06] you know,

[12:07] if we didn't expect it from ourselves, how can we expect it from AI?

[12:12] And so that's just something I also ponder in the middle of the night when I should be sleeping, things along those lines. And even in situations where,

[12:23] you know, when judges make decisions and they're following certain protocols,

[12:29] I often wonder,

[12:31] once again, sometimes some of the curious decisions,

[12:33] the hoops that they jump through to justify a decision,

[12:37] I sometimes wonder, is that really the explanation as to why they came out with a certain ruling?

[12:43] Maybe. But I think sometimes maybe not.

[12:46] And we will never know ultimately why they may have ruled the way they did.

[12:51] And so that's just something I think about, that maybe before we cast too many aspersions on an AI's ability to explain or understand its decision, we hold the mirror up and ask, are we always honest and forthright?

[13:09] And sometimes do we even understand why we make certain decisions the way we do? For example, why did I choose Mexican food over Italian food that day? I don't know,

[13:17] just felt like it. I couldn't give you a hard and fast explanation.

[13:20] Now the impact, of course, is far different on someone's life, I suppose. Once again, just things that I ruminate about.

[13:28] Debbie Reynolds: I guess your discussion, or the thing that you said about AI being prediction machines,

[13:36] that always concerns me. But I want your thoughts. So this is what I think.

[13:41] First of all, when we moved into a data age, a digital age,

[13:45] what we went from was kind of gut feelings,

[13:51] right? Like lagging insights where people were making decisions, things like that. And so what the digital age brought was the ability to have more real-time or near-real-time decisions,

[14:03] more decisions that could be based on data and information as it's gathered, even though that data and information may not be as accurate as we want it to be.

[14:12] And then now we're moving into an AI age where now we're turbocharging the data that we have and we're doing it at scale.

[14:21] And I guess the challenge that I have with using AI as a prediction engine is that what AI is doing is making a prediction based on what happened in the past.

[14:33] And so the future will not be like the past.

[14:38] So that's always my challenge with AI.

[14:40] Phillip Mason: You're exactly right. And obviously,

[14:43] you know, poor, corrupted, biased data that gets fed in is only going to lead to poor, corrupted, and biased outcomes. But to your point, even if you have perfect data that is as clean as the driven snow,

[14:58] once again,

[14:59] it can never, it will never be fully accurate, because there are Black Swan events; there are things there's no way it's going to predict, and yet you expect it to.

[15:08] I think most people expect that decision, that prediction to be nearly always, almost entirely accurate.

[15:16] And this is why I am so focused on, and thinking and reading so much more about, the human judgment aspect: you can have all the prediction in the world,

[15:28] but if you don't apply sensible,

[15:31] reasonable,

[15:32] rational judgment to the extent possible.

[15:35] I think, to your point, it could really

[15:37] put some folks in a bad situation or lead to bad outcomes,

[15:39] not just, you know, at the individual level, but certainly at a group level, state level,

[15:48] country level.

[15:49] So I think you bring up a very legitimate point. I don't know if there's any true resolution to that, however.

[15:56] Debbie Reynolds: No, it's one of those tough questions.

[15:58] Phillip Mason: I don't know how you would square that circle, because I think just by being alive, it's almost, I know it sounds crazy, one Black Swan event after another. Something comes out of left field.

[16:09] In a million years you never would have predicted it, but it happened, and now you can factor it into your next decision. And then the next Black Swan event, which would have no bearing on that previous one, occurs, and you're like, okay, well,

[16:22] so I think at the end of the day you simply have to live with the consequences, as frankly we've done throughout our lives.

[16:29] Debbie Reynolds: Yeah, that's true, that's true.

[16:32] I want to dovetail on something really interesting that you said, and that was about the AI Act and some companies not wanting to go forward with certain AI things because they were kind of concerned about the regulation there.

[16:47] I guess the thing that interests me and I want your thoughts on this.

[16:52] And I feel like a lot of organizations are terrified of regulation in privacy, because I think they've maybe grown up with the idea that, quote unquote, regulation is bad. And I don't necessarily think that, but I also think a lot of organizations may not be thinking about the customer angle.

[17:12] So customers,

[17:14] regardless of regulation,

[17:16] they are unhappy when companies take their data and use it in a way that they would not have agreed to or were not aware of. So I feel like part of the trepidation towards regulation probably doesn't take into account that customers could be upset as a result of actions that a company takes.

[17:39] Like we saw, for example, with Microsoft when they introduced their Recall feature.

[17:46] Basically the purpose of Recall was like, hey, you know, if you ever forget something, you can ask the computer, because we took pictures of your screen every five seconds for 20 years or something like that.

[17:57] And people really just, there was a firestorm from the consumer community. Even the professional tech people were like, this is way too far, this is way too out there.

[18:08] And I feel like a lot of that is an imbalance between what maybe a company wants, or what they think is good, versus what a consumer would actually want.

[18:20] And I think that consumer backlash is what we're seeing with some of these ideas. But I want your thoughts.

[18:26] Phillip Mason: You know,

[18:26] it's interesting, the backlash against Recall, which I very much shared in, versus, say, the information that is gleaned and provided to, say, Google and Meta or Facebook,

[18:42] you know, people willingly use,

[18:45] you know,

[18:46] Facebook and Google,

[18:48] you know, quote unquote, free versions of such things.

[18:51] And think about it: that data is also pretty much being stored every five seconds, quote unquote; you know, it's constantly being stored. My backlash on Microsoft was,

[19:04] okay, I am paying for Microsoft Office Suite. I am paying for,

[19:10] you know, various other services. I'm actually paying for those.

[19:13] So, you know, I feel I should have a little more say in such situations.

[19:19] Yeah,

[19:21] you know, if I'm paying for food,

[19:23] I expect that food to be of a higher quality versus if someone is giving me something, because there is a reality of you get what you pay for. Which, once again, is not to pick on LinkedIn.

[19:33] But if I'm a LinkedIn Premium subscriber, I'm paying for a service.

[19:37] That's why I personally was so offended by the opt-in aspect regarding my personal data.

[19:44] So I think that's where the backlash came in. I don't know many people, I don't know how you would use most Microsoft services without paying for a lot of them.

[19:56] I'm sure there are situations, but I know me personally, in my professional capacity and even in my personal one, I'm paying a subscription fee to utilize those services. So if I'm paying the subscription fee, then I think it is reasonable to say, you know what,

[20:14] then I at least need to be very clearly informed that you plan on doing this, so that if nothing else I could say,

[20:21] okay, I'm going to stop paying for it. And you know, there are other services I could possibly use right there.

[20:26] Debbie Reynolds: Yeah,

[20:27] I don't know, I feel like maybe in both of those situations there was kind of a falling down on the communication, and also it's kind of the notice part.

[20:35] Like.

[20:36] Phillip Mason: Yeah, I mean they just do it. You know,

[20:40] that's once again where you're in the room.

[20:42] Yeah, we're just gonna do this. Everyone cool with that? Okay. Okay, great.

[20:46] There's no one there, not one person?

[20:48] I find that almost impossible to believe, that not one person of substance, of sufficient seniority, said anything other than, yeah, that's great, there's no issue with that,

[21:01] so go ahead. I can't believe that.

[21:04] Debbie Reynolds: Yeah, I agree, I agree with that.

[21:07] I want your thoughts just on the complexity of regulation, especially in the US on a state level.

[21:15] I want your thoughts. I just think privacy regulation in the US will go the way that data breach notification went, where California was the first state to have a data breach law, and by 2019

[21:31] every state had one, but they were different in every state.

[21:34] And so what I was hoping would happen was that people would say, wow, this is a bad idea, that we have different data breach laws in every state,

[21:42] So let's try to do something on privacy that's more unified as a nation on a national level. And that has not happened. So what we see now is privacy legislation on a state level in the US really following this trend,

[21:58] of data breach. But I think it's also getting more complex as we bring in more technology. So we're seeing all these kinds of splinter privacy laws around, hell, you know, deepfakes, AI, different things.

[22:08] They have a privacy aspect. But I want your thoughts.

[22:12] Phillip Mason: Well, I think it kind of pairs well with your previous question, in that this is why there is a potentially irrational fear of regulation.

[22:20] But this is why there is that fear, that concern, of regulation. You know,

[22:26] I love the example you give of vehicles, how it was regulation that put in seatbelts, and having those seatbelts saved how many lives? Countless lives.

[22:34] So regulation was fantastic.

[22:36] On the other hand, there's,

[22:38] I know there are regulations here, at least here in the state of Texas, that you have to have a license to cut someone's hair or, you know, arrange their flowers. I would suggest neither of those latter two is needed.

[22:50] When it comes to the state stuff, it is inevitable that, because there are 50 states with 50 different perspectives on things, it does become fractured, and the attempt to put something into law at the federal level was stymied

[23:09] because California didn't feel it was stringent enough.

[23:14] My view was, get it in place. My view is to get the federal law in place, because that's the compromise that we have to make on a lot of things.

[23:26] You know, look, I understand and even would have been in favor of everything that they were doing.

[23:32] However, it now comes at a cost where, you're correct, we will ultimately have 50 different state laws dealing with this, like we do with breach.

[23:43] But unless people want to compromise on that, then that's inevitably what you wind up with. I am of the view, don't make the perfect

[23:50] the enemy

[23:52] of the good, I guess that's how they use that phrase. I thought that proposed federal law got you 75-ish, maybe 80% of what you're shooting for.

[24:02] And then there was nothing stopping amendments or addendums happening to that as different administrations roll in and out.

[24:09] Because keep in mind, it's rare that a law at the federal level ever, ever gets repealed.

[24:15] So I thought, this'll give you your toe in the door, and it's only going to get more intense, I think, as the years go on.

[24:24] So I thought it was a lost opportunity.

[24:26] But once again, the people in California, they may say, fine, at least we have ours. It's just how you want to view it.

[24:34] I was in favor of it, the federal one.

[24:37] Debbie Reynolds: Yeah,

[24:38] My thought, and I'm not sure, maybe it's unpopular, but my thought about a federal law is somewhat along the lines of what you're saying, but a little bit different.

[24:50] So you know, for me, and I thought this about data breach: since all US states have data breach laws, I wanted to see why all the states can't get together and say, let's create something that's more harmonious that we can all agree on, even if it's not every single thing.

[25:07] But just maybe some fundamentals, maybe like breach notification time frames,

[25:14] the definition of what's personal data, the definition of what's sensitive data, you know, those things, doing that at a federal level. And then if the states want to do some additional tweaks, they could possibly do that.

[25:27] And so I was hoping.

[25:29] My view is, even if the only thing we had at the federal level is definitions that go across all states around how we define things, this is how we want to do X, Y, and Z,

[25:43] and leaving out the preemption fight and leaving out the private right of action, maybe tossing those down to the state level. I don't know. That was just my thought. What do you think?

[25:54] Phillip Mason: I think it's pretty interesting, and actually it's a great insight, Debbie, because, you know, you do have the Uniform Commercial Code,

[26:00] so why don't you have a Uniform Privacy Code?

[26:04] Frankly, it's simply a brilliant idea.

[26:09] You know, the governors do get together, or a good chunk of them do, every year. That would be absolutely fantastic, a great idea. I'm wrapping my brain around, you know, how you'd even go about kicking something like that off.

[26:23] But it would have to be, you know, obviously the governors would have to be the ones getting together to truly do it. I know they have typical meetings along with mayors and things like that, but that's not a high enough level.

[26:36] So yeah, a Uniform Privacy Code would be tremendous.

[26:40] Standardizing definitions across things, because one of the personally frustrating things for me is just the definitions of, like, sensitive personal data among the states.

[26:51] I'll be blunt, I think it's borderline ridiculous, the variations between them. And so I think that's what makes privacy such a fascinating,

[27:00] fascinating but also such a frustrating area is that it becomes a game of whack-a-mole as opposed to trying to do the right thing.

[27:08] And we're all poorer for that reason, which could almost be an argument for why there are people that aren't fond of regulations.

[27:15] I don't know if they're really in principle opposed to regulation. You need guidance. You can't just have chaos otherwise.

[27:24] But this is where taking it to a certain point leads to the situation where you have all these myriad variations, and at the end of the day, sometimes I wonder, what have I accomplished?

[27:36] Oh, I followed this rule, that one, this one, that one, that one, that one.

[27:40] Okay. Was it the best I could have done?

[27:42] I would suggest probably not. But I have to follow a gazillion different laws, and I think they

[27:51] miss the spirit of the law. Like I said, reverting back to the EU AI Act, I'm glad to see they had small and medium-sized enterprises factored in there.

[28:01] I wish the GDPR had more small and medium-sized business aspects factored in there as well.

[28:07] And I wish the states,

[28:09] to your point,

[28:10] would get together and have standardized definitions, and possibly even general procedures of how to go about it, and then let them have their unique little nuances that are so lovely to keep track of.

[28:27] Debbie Reynolds: Well, thank you. Wow, that's high praise. Thank you for my idea. I hope someone tries to adopt this.

[28:34] What are your thoughts about this? So I did a framework; it's called Privacy Safety,

[28:42] the Data Privacy Safety Framework. And I just feel like we need to, you know, I've heard people describe privacy as many things. I've heard people describe privacy as freedom.

[28:55] They describe it as a fundamental human right. We know that, in a legal sense, privacy is not a fundamental human right in the US, but in Europe it is, because it's part of their constitution, right?

[29:08] But I thought we should start talking more about privacy and safety because privacy does have a safety element in it.

[29:17] So the more data that is being released about people, the more it can create unsecure,

[29:22] insecure, or dangerous situations for people. But I just want your thoughts on that, because I feel like not everybody understands privacy. Maybe they think it's kind of an ivory tower concept, but I think everybody understands safety.

[29:37] But I want your thoughts.

[29:39] Phillip Mason: Yeah,

[29:41] you know, it's interesting overall, the European versus United States view when it comes to privacy, and I guess the overlaps there with safety as well.

[29:52] But regarding the safety, I think people knee-jerk, you know, reflexively think personal safety, which obviously is a key component, but I find it's a much easier sell when it comes to financial-related matters.

[30:05] That when you talk about the safety of your personal data,

[30:08] if you're talking about, hey, my bank account's been drained or my credit card's been hacked, they don't instinctively think personal data violation, they think safety violation.

[30:20] So I think, to your point, doing a privacy and safety regime is really the way to go, because that is how you, quote unquote, sell privacy.

[30:30] And once you,

[30:32] and then once you can get the financial aspect of it across,

[30:36] then leading into the other areas is a little more readily feasible and understandable to people who aren't in our areas. Because when I explain to people what I do, I don't think they quite appreciate the ramifications to themselves.

[30:51] And they don't until, oh, I had my identity stolen. Someone like that gets it really clearly. But most people,

[30:59] they don't.

[31:00] And I think if they realized how much of their data was bought and sold by data brokers on a minute-by-minute basis, if not more frequently,

[31:10] I think they would appreciate the privacy aspects and the safety ramifications of that. Because a data broker can sell your data for pennies, and most of the time, obviously, nothing physically detrimental comes of it, but it certainly could, and it'd be very easy.

[31:27] And if bad actors get their hands on such data,

[31:30] they can make your life very difficult.

[31:32] So I think the privacy and safety,

[31:36] combining the two is a terrific, terrific way to approach it.

[31:40] Debbie Reynolds: Now, based on what you said, thank you, I want your thoughts about this term that I've seen.

[31:46] I've seen some places, some regulators and different, maybe niche, regulations trying to float the term data fiduciary.

[31:56] And so I want your thoughts on that term. I know that I have found, especially my friends in Europe,

[32:03] they don't like that,

[32:04] that type of term because they don't like data associated with kind of money,

[32:09] like something that's more transactional. So they feel like data is more a part of their person, or their personal representation of themselves, and they don't want it thought about in monetary terms.

[32:21] But I feel like we can marry it to something that people really understand,

[32:27] which are dollars and cents.

[32:29] Maybe we can explain it better. But I want your thoughts just on that term of art.

[32:35] Phillip Mason: I'm surprised they would push back on it, because honestly, when you're a fiduciary, you're a fiduciary on something, you're on the hook.

[32:42] So I think data fiduciary would be something they would actually rally around.

[32:48] And I have to admit I didn't instinctively relate it to financial when you said fiduciary duty. I don't know why I didn't think of it like that, but I should have, because it makes perfect sense.

[33:00] But I, I think data fiduciary is a great way of communicating that.

[33:06] But how receptive the people that have that title thrust upon them will be to having it thrust upon them, I don't know.

[33:16] You know, once again, if say,

[33:18] hey, someone comes to you, hey, Debbie, you're now the data fiduciary for X, Y, Z. What exactly am I responsible for? You'd have to be very clear about that.

[33:26] I wouldn't unilaterally or carelessly accept that title without knowing what I'm getting into.

[33:34] Debbie Reynolds: Right. It's like, you think a bank has a fiduciary responsibility to you as their customer, and then if you're a data fiduciary, then you're saying, okay, I'm, like you say, on the hook for what happens to your data.

[33:49] So I guess that is, that could be fraught with some challenges.

[33:54] Phillip Mason: I think if you put it in place,

[33:56] I do think you would up the game as far as ensuring data protection and privacy principles are put into place.

[34:05] I mean, if you're a fiduciary on something, I think you take it more seriously than if you're not. That personal liability, again, like a CEO or CFO signing off on financial statements and things like that.

[34:17] If you're putting in some personal liability, it's different than if it's just corporate liability.

[34:21] Debbie Reynolds: Totally. What are your thoughts on AI and how it changes or challenges the privacy landscape?

[34:29] Phillip Mason: I know there's a lot of Sturm und Drang about how AI models are trained and have been trained.

[34:38] To me, the horse is out of the barn on that.

[34:41] I think that particular battle is over; you had all this information out there.

[34:47] It's been scraped.

[34:50] The nature of the Internet was all about data sharing.

[34:53] You know, I'm of the tender age where, you know, I was there when AOL and all that kicked off, and we didn't think about it at the time; obviously we would do it differently now.

[35:07] So I'm actually of the opinion my data's already been out there. It's been out there a long time. It's been traded and sold and bought and packaged six ways to Sunday.

[35:18] So I think a lot of the data subject rights related to AI I think are frankly impossible,

[35:24] really impossible to effectuate.

[35:27] So now the question is, what do you do moving forward?

[35:30] I think moving forward you can say, okay,

[35:34] now you've got to explain how you're doing your training and where the training data sets are coming from.

[35:40] And that's great,

[35:41] but you're not going to be able to retroactively untrain these models. It's not like the FCC and FTC can make that happen, except, I guess, with certain smaller companies.

[35:52] But you're not going to untrain OpenAI,

[35:55] you're not going to untrain Copilot,

[35:58] you're not going to untrain Grok. It's just not going to happen.

[36:01] So I think now it just has to be a whole lot of training and education.

[36:07] If you talk about it from a corporate level, it's on companies to train their users and employees about how to go about utilizing these tools in an intelligent and hopefully also responsible manner, and about what data you're putting into them.

[36:24] Do you understand? You know,

[36:26] it kind of goes back to the old adage,

[36:28] maybe you don't want to put anything in print that you wouldn't want your grandmother to read. Don't put anything in a chat you wouldn't want your grandmother to read.

[36:38] Unless you're comfortable with that existing forever and ever and ever.

[36:44] So, I mean, I wish there was more; like I said, it's kind of a somber point, but you can't go back. It's just not going to happen.

[36:52] And I wish there was a better answer than that, but there's not.

[36:56] Debbie Reynolds: Yeah, well, I like your pragmatism. Always.

[37:01] Phillip Mason: That's kind of how I came up with that whole, you know, for all my fellow data protection and privacy folks practicing in the real world, Phil,

[37:08] you know, I don't have the brain cells that you or Odia Kagan or Phil Lee or Oliver Patel,

[37:16] Jeff Jasic, Mary Marwig, Judd Cavador,

[37:20] Alex Kralev,

[37:21] Robert Bateman,

[37:23] Rewal, Luiza Jarovsky,

[37:27] Kevin Fume, Brian Lee, Michelle Dennedy, the list could go on, but I don't have y'all's kind of brain cells to take this really complicated information and distill it into a bite-sized package like y'all do.

[37:43] But what I can do is read what y'all are doing and circulate it and then say,

[37:48] okay, this is how I would approach this from a very pragmatic point of view to all the fellow data privacy grunts like myself.

[37:58] And so,

[37:59] and that's why, to the extent humanly possible, I only like to post

[38:03] about things that I have actually read, watched, or listened to. I don't like to just circulate something because someone else did it. I want to actually read it first.

[38:11] Which is why I don't post nearly as much as some folks do.

[38:16] And it also, I guess, speaks to that whole topic for just regular practicing folks like myself.

[38:23] I eyeball almost every attachment I see on LinkedIn related to data privacy just to take a look at it.

[38:29] And there's so many, they just come day after day after day and hour by hour.

[38:34] And so many of these documents are 50, 75, 100-plus pages long.

[38:40] And speaking only for myself, as someone who has no life and really does enjoy, you know, reading about this stuff, I can't possibly read the majority of these things.

[38:49] So that's why, when I do recirculate something or post something,

[38:53] I keep it certainly under 35 pages, and try to even do it, you know, 15 or 10 pages and under,

[39:01] with the hope that if I find it useful, people will actually take a few minutes to glance through what I do.

[39:07] Because it would only take a few minutes. You know, sometimes on Friday afternoons it's, hey, here's your weekend reading, and it's 10 documents. Like, yeah, if I did nothing else, maybe I could get through three of them, but maybe I'd also like to go outside.

[39:20] And so it's something I try to keep in mind. And sometimes I wonder,

[39:25] how do these people have the time to generate such large documents? Between my work commitments and, you know, a few personal commitments, and I try not to just rent my children,

[39:35] I try to actually do some stuff with them every once in a while. Gosh, how do they have this time? I can't figure it out. But anyway, that's the role, the niche, I try to fill on LinkedIn and in data privacy circles, saying, look, I

[39:48] actually watched, read, listened to this.

[39:50] Frankly, for every podcast you put out, I could say, hey, this is fantastic.

[39:55] But the ones I do, I'm like, you really, really should listen to this. You know, you really will learn something useful from this. That's one reason I love

[40:03] the newsletters and things like that you put out: it's something I can actually get through and understand and appreciate, and then say, hey, if you're practicing in X, Y, Z area of privacy, you'll find this particular aspect of the article very useful.

[40:15] Whereas if I'm given something that's a hundred plus pages long, that's essentially at the end of the day saying, how many data privacy angels can dance on the head of a pin?

[40:25] I'm like, okay, what do I do with this?

[40:27] And so God bless the folks that do it. I mean, mad props to you guys, fantastic. But for me personally, it's like, oh, gosh,

[40:34] you know. And that's what I always try to keep in mind. I want to help bring people along as best I can and not come across as, hey, look how much more I know than you do.

[40:46] Debbie Reynolds: Well, you do a great job. You're definitely a champion, and I appreciate all your posts. I always smile when I see them, because I know that there's definitely some thought there,

[40:55] obviously.

[40:56] And you don't just forward things. You do add your personal touch and, you know, you bring your brilliance to all the work that you do.

[41:04] Absolutely.

[41:05] Phillip Mason: Well, I thank you for everything you do, Ms. Debbie. And I would just like to end with this. Hey, everyone, we're all in this very small little world here in data protection and privacy circles, and I think it behooves us to remember we're all on the same team.

[41:19] We're all going to have different views and opinions on matters. But I know personally, I try to do my best just to really emphasize where we have things in common and the best parts about what's being posted and the things I disagree on.

[41:34] Well, you know what,

[41:36] maybe I'm wrong on that, and I just accept it. But my bottom-line point is that I think it behooves all of us to remember that we are on the same team.

[41:46] Let's do the best we can to help each other so that ultimately what we do can help people in general. Because really, I truly do believe that what we do in data protection and privacy helps a lot of people.

[41:59] You know, I can't fix someone's ailments and I can't speak to their spiritual needs,

[42:06] but if I can help protect their data a little bit, that's a big deal. And if we all do that, all help try to protect people's data a little bit,

[42:14] all of that together,

[42:16] that's tremendous. And I really do think it serves the public good.

[42:20] Debbie Reynolds: I agree, and I share your enthusiasm for that, because I think sometimes we think of it as such a big issue, it's so daunting. But I think all of us doing a little bit will make a huge difference.

[42:32] Phillip Mason: Yes, ma'am.

[42:33] Debbie Reynolds: Yeah.

[42:34] So if it were the world according to you, Phillip, and we did everything you said, what would be your wish for privacy anywhere in the world,

[42:41] or data protection, whether that be regulation,

[42:45] human behavior or technology?

[42:48] Phillip Mason: Say it out loud.

[42:51] That's what I would leave folks with if they choose to take anything from this.

[42:56] Say it out loud.

[42:58] And once again, not to pick on LinkedIn,

[43:01] but if you're in a room dealing with other business stakeholders, other privacy professionals,

[43:06] and there's some aspect of a privacy-related project that you're working on that doesn't seem quite right.

[43:16] Say it out loud.

[43:18] Hey, so what you're saying is that

[43:21] we're just going to take people's personal data and use it to train our algorithm? And everyone's cool with that? No problems with that?

[43:32] Sometimes I found that literally saying it out loud really helps gel what the issue is. Because if it's rubbing you the wrong way,

[43:43] if you're working with decent people overall, it's going to rub some of them the wrong way as well.

[43:49] And maybe that can be really the first step to addressing the matter in a reasonable data protection and privacy way.

[43:58] Debbie Reynolds: Yeah.

[43:58] Say it out loud. I love it. I love it. Well, thank you so much, Phillip. I really appreciate you being on the show, and I love your work. And please, people, follow Phillip.

[44:08] He always has some jewels that he can drop for you. You're a gem. Thank you so much.

[44:14] Phillip Mason: God bless you, Ms. Debbie. Thank you for all your efforts, and thank you for being you.

[44:18] You are the Mount Rushmore of this area,

[44:22] so thank you for everything you do.

[44:24] Thank you.

[44:26] Debbie Reynolds: You make me blush. Thanks so much. I really appreciate it. Hopefully we'll have time to collaborate in the future.

[44:33] Phillip Mason: Oh, I hope to see you, maybe at the summit next year. That'd be fantastic as well, if that's something you're planning on doing.

[44:38] Debbie Reynolds: Yeah, that'd be great. That'd be great. All right. Well, thank you so much, and we'll talk soon.

[44:44] Phillip Mason: Yes, ma'am. Take care.
