E278 - Chuck Brooks, President, Brooks Consulting International, and Adjunct Faculty, Georgetown University (Cybersecurity)

[00:00] Debbie Reynolds: The personal views expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.

[00:12] Hello, my name is Debbie Reynolds. They call me the Data Diva. This is the Data Diva Talks Privacy podcast where we discuss data privacy issues with industry leaders around the world with information that businesses need to know.

[00:24] Now I have a very special guest on the show, Chuck Brooks.

[00:28] So famous. I'm happy to have you on the show.

[00:30] He is the president of Brooks Consulting International.

[00:33] He's an adjunct faculty at Georgetown University.

[00:38] And yeah, I'm happy to have you on the show. Welcome, Chuck. Thank you.

[00:42] Chuck Brooks: It's great to be on the show. Looking forward.

[00:44] Debbie Reynolds: Yeah, well, I've known you forever. I follow your work forever.

[00:49] You have to give a fuller account of your background. I see you everywhere. I see you speaking everywhere. People always talk about you. And I'm like, oh, yeah, Chuck, he's awesome.

[01:00] And so it's funny because I was reading Forbes and I saw your name pop up in an article that you had written called Why Data Privacy Is a Strategic Imperative for Every Organization.

[01:12] So, yeah, I definitely want to discuss that, but I just wanted to give a little bit of a background of kind of how we know each other. So first of all,

[01:20] you are so prolific in how you communicate.

[01:25] So in the books,

[01:27] articles, speaking, all the things that you do. I see you all over. You're so famous in cybersecurity, and so it's always fun for me to see you.

[01:37] I did not know that you're a Chicagoan like me. You're from Chicago, so it's so cool.

[01:43] And I didn't know, like, both of us have been adjunct professors at Georgetown University.

[01:49] Chuck Brooks: It's a small world.

[01:50] Debbie Reynolds: I mean, give me your background. You've just done so many. I don't think I can do it justice.

[01:55] Chuck Brooks: Okay, I can give you a bit of a nutshell. As you said, I'm from Chicago, and I graduated from grad school at the University of Chicago in Hyde Park there. I thought I was gonna go into international relations, but the security background turned out to be a good fit.

[02:07] So I came out to Washington like everyone else did on an internship, expecting to be back in Chicago in a few months. But I ended up staying. And what happened is I

[02:18] ran into an individual at a party who happened to be the former deputy director of the CIA and former head of the DIA,

[02:24] General Daniel "Danny" Graham. And he said,

[02:27] would you like to come work for me and do some writing? Which I did, strategic stuff. And then: would you like to work in the White House, in the Reagan administration? I said, well, that'd be fun.

[02:36] So next thing I know, I'm here literally a couple months and I'm working at Voice of America as assistant to the Director.

[02:42] Debbie Reynolds: Right.

[02:42] Chuck Brooks: During the height of the Cold War. And it was a lot of fun. So after that I went to work on the Hill for the late Senator Arlen Specter.

[02:49] And that was a real awakening for me to see how Washington really worked. I spent almost a decade there writing about technology,

[02:58] security, and a variety of other issues. I filled in on a lot of different things, and I also worked on his campaigns.

[03:04] But then after that, 9/11 happened, and I got a call: would you like to be part of a startup? And the startup was the Department of Homeland Security.

[03:13] And I was one of the first people there and helped set up the Legislative directorate and also the Science and Technology Directorate. And from then on it's been mostly industry.

[03:23] It's all industry,

[03:24] but notably vice president of Homeland Security at Xerox and vice president of Research and Development at Rapiscan. And then I finished my full-time career, because I'm actually doing more work since I've retired, at General Dynamics Mission Systems, where I was scouting emerging technologies, particularly in cybersecurity.

[03:41] And like you said, I teach at Georgetown in the cybersecurity risk management program.

[03:46] And I had designed my own course in emerging technologies and I just wrote a book called Inside Cyber last year that's done very well. It talks about the implications of the new technologies,

[03:56] including privacy.

[03:57] So.

[03:58] And I've traveled all over the world recently and that's one of the benefits I think of speaking. You get to see the world and meet a lot of interesting people and it's great to be on the program.

[04:07] Debbie Reynolds: Oh my goodness. I'm glad I didn't try to go into all that. That detail that's like a lot. I had no idea about that backstory.

[04:15] So much is happening right now. I feel like I don't know how you feel. I feel like this is almost like when the commercial Internet started.

[04:26] Like that's my feel that I'm getting right now, like AI and all these really cool things or interesting things that are happening in technology,

[04:34] but it brings to bear a lot of the cybersecurity risk. Right.

[04:40] I feel like technology sort of cuts both ways, like a sword. So there's these great things it can do, but a lot of people don't think about the bad things that can happen.

[04:51] But I want your thoughts.

[04:53] Chuck Brooks: No, I think you gave a great analogy, the fact that the Internet was created without security in mind, and now we're moving ahead into artificial intelligence and quantum,

[05:01] which are much more sophisticated technology platforms that are also much riskier.

[05:08] So you're right, the point that there are two sides to technology, the good and the bad, is very true.

[05:14] And a good example, I think is what's happening with AI.

[05:17] We've seen the growth of AI just be exponential in the last couple years, particularly first with generative AI, when everyone was doing ChatGPT and creating things. And now you have agentic AI, which is where the agents do the work for you.

[05:31] So I think it's, you know, budding. There's a whole variety of platforms, businesses are adopting it, universities are adopting it. So everyone's being immersed in it. And the problem is while we're getting immersed in it, so are the bad guys, the hackers overseas, the,

[05:45] the professional criminal groups. And they're using these technologies for two things in particular. One is to

[05:51] find vulnerabilities, which AI is very good at doing rapidly. And the second thing is to create elaborate, targeted phishing attacks.

[06:00] It depends on the type, but generally ransomware. The main goal of most of these groups, actually not the state-sponsored ones, but

[06:08] most of them, is basically to make money, to steal money.

[06:11] And so they're able to do that with these phishing attacks, with graphics that look like they're from your bank or something you bought or your employer. And they're also able to do it with automation.

[06:22] So instead of just doing a few hundred at a time, they can do billions at a time. So it only takes one or two to make it worthwhile.

[06:29] And they're doing that. On the positive side, we use AI to help detect those threats and detect where our vulnerabilities are.

[06:37] And you know, that's a great thing, but the problem is that it's costly. You need to understand what tools to use.

[06:43] And for small and medium businesses,

[06:45] it's a dilemma because they usually don't have the cybersecurity expertise.

[06:49] They don't know what to focus on.

[06:51] And so they're mostly sitting ducks, particularly small businesses. So it's not a fair fight on defense.

[06:59] So, you know, we're seeing a rapid expansion of the attack surface too. And they're also using bots. They're creating these bots that go out and flood websites or deliver what is called polymorphic malware that can change its form,

[07:15] all kinds of interesting and scary threats.

[07:18] So we're welcome to the new era. And we haven't touched Quantum yet, but that's coming too.

[07:24] Debbie Reynolds: That's true, very true.

[07:26] I want your thoughts, if you could talk a bit about this,

[07:29] the new rage right now.

[07:31] People are rah rah about agentic AI, right?

[07:34] And so the OpenClaw or Moltbot thing that's happening right now,

[07:40] I read up on it, and it was heart-stopping, the things that I read in terms of what this thing can do. People are like, oh yeah, it can run on your machine, and you have to give it admin privileges and all this stuff.

[07:53] And I was like, oh my God, I could have just fallen over. But what is your thought about this? This is like the new hype thing that people are super excited about right now.

[08:02] Like, what cyber risks aren't people thinking about?

[08:05] Chuck Brooks: Well, there are huge risks. First of all, there are no rules or regulations with agentic AI.

[08:11] And like I said before, there's good and bad agentic AI, but the interesting stuff being developed is really agents with expertise: when you give a prompt, it'll go out and get you the information, buy things for you.

[08:24] It could be hundreds or thousands of agents working in concert to do that.

[08:28] So it's really informative, and it can really help a lot of industries. If you're in the travel industry,

[08:34] it can create a hotel and meal plan for you, everything down to the detail.

[08:38] And pretty much every industry has the capacity for these agentic benefits.

[08:42] However,

[08:43] on the other side, again, with agentic AI, they can get into your human resources

[08:50] activities,

[08:51] create poisoning all over the place, and create false agents that mess everything up, or attack agents, which they're doing with these bots. So it's really another era, another level of sophistication, that's being used by hackers to exploit the good things that we're creating right now.

[09:10] And the other thing is, it's only as good as the data, and you're the Data Diva, so you know that. If you have bad data in these agents, you're going to have bad results.

[09:21] And so there's, there's risks with that too.

[09:23] And so it's very scary. And there was a thing a couple days ago where agents talked about conspiring to basically kill all the humans. I don't know if you saw that, but again, those are programmed by humans still.

[09:37] But when they start programming themselves, agent to agent or AI to AI, it might be a different scenario. But at the moment, the talk is about how soon they'll have some sort of ability to reason and be sentient.

[09:53] And the big people say, well, it's a couple decades away,

[09:57] you know, but, but we're moving quick in that direction. I don't know if we'll get there. The human brain is still many, many times more powerful than computers. But with Quantum coming online, we'll see.

[10:07] Debbie Reynolds: I guess my concern,

[10:08] I feel like we're not yet ready as humans to face some of the challenges that we're seeing here in terms of technology. I mean, we're still seeing people who use 123456 as a password and think that they have a Nigerian uncle who gives them an inheritance.

[10:30] Right.

[10:31] So it's like where do we,

[10:33] I mean, how do we get to the level of sophistication we need to get to, to protect ourselves?

[10:39] Chuck Brooks: Yeah, well, that's a great question, and there's a very comprehensive answer to it. But one part is obviously education. We need to start teaching kids at an early age the implications of privacy and how to use their devices properly, which is a problem.

[10:53] It's probably out of hand that way. And there's evidence of that too with mental illness and all kinds of rewiring of brains, et cetera,

[11:01] and their social status based on how many likes they get on a post. But all that's secondary. I mean, I think the fact is that with AI and with IoT and with 5G and with Quantum, we really need to basically restructure our whole perspective of how we go into the next couple decades.

[11:21] Because you're exactly right, people aren't aware of that, and certainly our leaders aren't aware of it. Back in, I think it was the 1980s or 90s, they had a thing called the Office of Technology Assessment that was advising Congress, and they got rid of it. So we don't have an understanding of where it's going.

[11:35] And we're just relying on several companies to basically guide us, and some regulations through NIST and the Europeans, et cetera, for frameworks, but it's very difficult to do. So one answer, and it sort of goes with everything you're doing in cybersecurity, is try not to make yourself the low-hanging fruit.

[11:52] And you could do that with cyber hygiene. Exactly what you just said with the passwords. Don't do 1, 2, 3, 4.

[11:58] Or don't do "password" or an obvious name, or something someone could social engineer, like your dog's name or your birth date. Create passwords that are pretty sophisticated. Use a password manager if you have to.
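Chuck's advice above, sophisticated passwords rather than names or birthdays, can be sketched in a few lines. This snippet is illustrative and not from the episode; it uses Python's standard `secrets` module, which is designed for cryptographically secure randomness, so the result isn't guessable the way a pet's name is.

```python
import secrets
import string

def make_password(length: int = 16) -> str:
    """Generate a random password from letters, digits, and punctuation
    using a cryptographically secure random source."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))
```

A password manager does the same job in practice, and remembers the result for you, which is why Chuck recommends one.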

[12:10] The second thing is use multi-factor authentication. I think the real key of this era is protecting our identities, because with the spoofing out there and the techniques they can use now, anyone can steal an identity. That's how fraud is done so easily, and it's also how you can ruin reputations.

[12:28] And I myself have been victimized a few times with my social media being stolen. It's not really being stolen, but they mimic it. They take my picture, create a fake profile, and reach out to people.

[12:40] So it happens. So you have to monitor it.

[12:42] So, you know, multi-factor authentication on every device you use, where you either use face recognition or, probably better, have them text you or call you. It's definitely a prudent thing to do.

[12:52] It's really important to do that. And the other thing is, for the data that's very valuable,

[12:59] segment it. Don't keep it on your phone or the computer that you use all the time. If you have something that's really, really important, like your banking, just do it on a separate device.

[13:08] You know, a lot of people are doing that nowadays. And I think probably the most important thing is: expect to be breached.

[13:17] You know, I think everyone's vulnerable at this point. You see it every day, and it's not always your fault. If you bought something online, or you're at a bank or a restaurant, you might see it: hey, you've been pwned, your data is now on the dark web. And it happens to everybody.

[13:33] So that's why you have to keep changing passwords, by the way. So everyone's vulnerable. It could be your insurance; that's happened too, with whole insurance groups having data stolen, hospitals, anything.

[13:43] So anything is vulnerable. So you want to have a resilience plan: obviously monitor your social media, monitor your credit, and also look carefully at where you're really doing your business.

[13:58] If you're doing business over an open network, in an airport or a cafe,

[14:03] that's risky. So it's really just self-awareness and cybersecurity awareness that you need. And then the biggest threat, of course, is still phishing. It's still the number one cause.

[14:12] And anyone's vulnerable to phishing. The former CIA director was phished, you know, on his personal email. It can happen to anyone, because they're so good at it now. And,

[14:20] you know, companies, they take your letterhead, they can make it look like it's from your boss, they know who your friends are. So you really gotta look at the links and who sent it to you before you open up anything.
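One concrete version of "look at the links and who sent it to you" is checking whether a link's host actually belongs to the domain the message claims to come from. This is an illustrative sketch (the domains are made up, and real phishing defenses do much more), but it captures the lookalike trick Chuck describes:

```python
from urllib.parse import urlparse

def link_matches_sender(link: str, claimed_domain: str) -> bool:
    """True only if the link's host is the claimed domain or a real
    subdomain of it. This defeats lookalike hosts such as
    example-bank.com.evil.io, whose registered domain is evil.io."""
    host = (urlparse(link).hostname or "").lower()
    claimed = claimed_domain.lower()
    return host == claimed or host.endswith("." + claimed)
```

For example, `link_matches_sender("https://example-bank.com.evil.io/reset", "example-bank.com")` is `False`: the host merely starts with the bank's name but ends in `evil.io`.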

[14:29] And I think that's just the world we're gonna live in. I think eventually there'll be better encryption that everyone can use. It's affordable and easy to use. We'll need to do that.

[14:38] But right now, it's still the wild, wild West. There's too many products out there and solutions, and there's nothing that most people will understand right away unless they have expertise advising them.

[14:48] And that's where the dilemma is. So I think it really comes down to awareness and education.

[14:53] Debbie Reynolds: I agree 1000% about awareness and education,

[14:56] especially because these threats are changing so rapidly.

[14:59] I don't know if you remember,

[15:02] it seems like a thousand years ago, on Saturday Night Live, when they had that skit with

[15:08] the land shark. The shark would knock on someone's door and say that it was someone else, and it would trick the person into opening the door, and then the shark would eat the person.

[15:18] I feel like that's where we are with technology,

[15:21] where someone is trying to knock on your door, basically.

[15:25] Right. And so. But they're under the guise of other things. It's okay, I'm your bank, so trust me, or I'm your friend, so trust me. But it's different tricks to get you to do the same thing.

[15:37] Chuck Brooks: Yes. And it's based on human behavior, usually.

[15:40] And they're very sophisticated too. I mean, they're getting some of the best and brightest, and they're mostly overseas. There are people here too,

[15:47] but they're almost immune from prosecution and getting caught, because they're hiding in countries that don't care.

[15:53] And sometimes they give them a cut of the take, too.

[15:56] But it's a real problem. And now we're immersed in a digital world where we do everything online, all our communications, our banking, our buying,

[16:06] probably since COVID even more.

[16:08] We have to be very, very aware of the threats out there. And you just said the new ones coming out, which is mostly artificial intelligence at the moment, which is pretty scary in the ability of what they're able to do with graphics and deepfakes and even voices.

[16:23] Debbie Reynolds: Yeah, totally.

[16:24] I want your thoughts on IoT. And this is because of agentic AI and because of this discussion and something that I've talked about a lot. I've done a lot of work in that area and I have many concerns in that area.

[16:37] But my concern on IoT is that people don't see IoT as a threat, as a problem.

[16:43] And then now we're bringing in all these advanced technologies with AI and, and this,

[16:50] the thing about agents,

[16:52] the fact that your devices will be talking to each other and they'll be talking to each other about you.

[16:58] Yes, that's like my concern. But what's your thoughts?

[17:02] Chuck Brooks: I actually just wrote an article on that, and it's a huge problem for a lot of reasons. But the first is that IoT devices are built all over the world with no rules,

[17:12] so you don't necessarily know what's in them. And then they basically ship with default passwords that very rarely anyone changes.

[17:19] So they're really pretty consistent, and people can get into those pretty easily. And they do use it, particularly against businesses.
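The default-password problem Chuck raises is something you can audit for mechanically on your own device inventory. The sketch below is illustrative only: the device records and the short default-credential list are hypothetical, and real audits use far larger lists.

```python
# A few well-known factory logins; real audit lists are much larger.
DEFAULT_CREDS = {
    ("admin", "admin"),
    ("admin", "password"),
    ("root", "root"),
    ("admin", "1234"),
}

def flag_default_creds(devices: list[dict]) -> list[str]:
    """Return the names of devices still using a factory-default login."""
    return [d["name"] for d in devices
            if (d["user"], d["password"]) in DEFAULT_CREDS]
```

Anything flagged should get a unique credential before it goes on the network; otherwise it is exactly the easy entry point described here.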

[17:25] As a matter of fact, there's a famous case where a casino, probably one of the most well-fortified physical and digital entities there is, was breached because the attackers went in through a fish tank that was networked to the system.

[17:38] So anything connected can be an entry point.

[17:41] So whether it's your phone, whether it's your doorbell, whether it's your Alexa,

[17:46] everything there. And Alexa has other implications, because

[17:49] maybe it's listening to your conversations too.

[17:52] So you have all these endpoints that are basically easy access.

[17:57] So it's a real dilemma. And I don't see any single endpoint solution for cybersecurity here; it comes down to managing endpoints, orchestrating those technologies, and understanding them. That works for businesses, but for the individual it's much more difficult.

[18:10] So you could rely on some programs on your computer that prevent anything from getting into your laptop or phone. But you have to be very wary of the Internet of Things, because it's certainly a playground now for the hackers.

[18:24] Debbie Reynolds: A playground for the hackers. That's like a very apt way to put it.

[18:29] I want your thoughts. And you're probably the best person to ask this question because you understand these domains really well.

[18:35] So I feel like people confuse cybersecurity with privacy all the time.

[18:42] So they definitely have a symbiotic relationship, but they're not the same. But can you give me your take on the differences there?

[18:52] Chuck Brooks: Sure. I mean, a good example: let's say social media. You're on Facebook or WhatsApp or TikTok or Instagram.

[19:00] Often you'll do some shopping, you'll go to a site, and they may ask your permission to follow you with cookies, et cetera. That deals with your privacy, whether you have the right to control it. But not really, because most people don't look at it.

[19:13] And often a lot of these sites do it anyway. So that element of privacy may give them access to everything you've bought online,

[19:21] what you're liking online. You say, oh, I like those shoes, and all of a sudden you get 15 ads popping up about different shoes.

[19:27] Or it can even be voice-activated: you're thinking about going on a trip to a country, and all of a sudden you see all these things about that country.

[19:33] That is basically the marketing world taking advantage of your lack of protecting your privacy.

[19:40] With cybersecurity,

[19:42] the tools intersect with privacy, because they don't want identity theft, but they're basically trying to protect your data, your phone, your computer.

[19:51] So those are tools, techniques, and technologies

[19:55] that are used to bolster your defense. But privacy itself is mostly elective. You know, just like when you go into a hospital: who do you want to share your medical records with?

[20:06] You can often determine that. The problem is it's all a blur now, with everything happening and the accessibility of all these platforms; privacy is very fleeting.

[20:16] And then the same things that you thought were innocuous, saying, well, I like to go bowling or whatever, all of a sudden get into social media and into the hands of people doing social engineering, and they use those likes and dislikes, or whatever your persona is.

[20:32] And they formulate an attack plan against you. They'll send you a phishing attack with a text: hey, sorry we couldn't make it before, would you like to go bowling next week?

[20:41] You sometimes respond, and they're seeing a lot of those too.

[20:45] Or your taxes: when you

[20:48] owe money or have a big credit issue, they'll sometimes focus on that: we have a plan for you to reduce your burden, et cetera. So you have to really be careful of what you share socially and who you share it with.

[21:02] And that goes for both privacy and cybersecurity.

[21:05] Debbie Reynolds: And then that gets us to your article in Forbes. People definitely check it out. It's called why Data Privacy Is a Strategic Imperative for Every Organization.

[21:16] So let's break this down.

[21:18] Chuck Brooks: Well, okay, so there's a saying that data is the new gold, and it often is for organizations.

[21:25] It may be who their customer base is,

[21:29] who's working in human resources and doing what, or what they're looking at,

[21:33] their balances, their bank balances, or their investments. All that data is gold to someone competing with them or trying to extort them.

[21:43] So it used to be that people would take a lot of this stuff for granted. You know, okay, well, you know, I'm working here and this is what we're doing kind of thing.

[21:51] Not anymore. I think you have to look at privacy as being a component of your security posture.

[21:57] And it used to be a sort of after the fact,

[22:00] but now, when you're working in an industry particularly, it doesn't always have to be, you know, hackers. It could be, like I said, competitors, or other people.

[22:09] Or it could be giving away trade secrets inadvertently. So you have to have a strong culture nowadays, if you're a business, of knowing what's important and what should not be shared.

[22:19] And you have to educate your employees to do that too.

[22:21] And you maybe have to even use administrative capabilities to control the data so it's not leaked.

[22:29] And this happens a lot in research, and it happens a lot among universities, too. Everyone's potentially a person who could overshare or be breached with private data. So that's a component you have to look at now with cybersecurity, but also with the viability of doing business, whatever you're doing nowadays,

[22:46] you have to really keep that in mind.

[22:48] Debbie Reynolds: And I also think one of the things that makes privacy different or maybe more complex for people is that part of privacy is about context.

[22:59] So let's say, for instance, you're in a hospital,

[23:03] and so your medical record can be seen by a doctor and a nurse.

[23:08] That's the proper context, right?

[23:11] But then that same medical record maybe shouldn't go to the person who works in the cafeteria.

[23:18] So that's what makes privacy, to me, really more granular, more complex. Because there are people who need to see this, but then there are people who,

[23:30] if they see it, you're breaching the trust of a person, or you may be breaching a law or a regulation or a standard in some way. What are your thoughts, particularly in healthcare?

[23:40] Chuck Brooks: I mean, there are HIPAA regulations that have to be followed,

[23:43] but there are also people who do administrative work, who just carry paperwork around and do things in hospitals, and you're entrusting your privacy to them. So there are a lot of links in the supply chain there with privacy in any medical situation.

[23:57] I think again it goes back to people understanding: are they worried about their privacy being exploited? If they have health issues, it might affect their job. It could be the same with investments: if they're working with an investment company or bank,

[24:11] are they worried that too many people know what they're investing in or why, or, you know, who's in their household, what their debt is?

[24:19] I think you're right that privacy is not simple.

[24:22] It is very contextual. And it's different with every person too. I mean, some people don't care,

[24:27] but for most of us, the basics we do have to care. We do have to care about federal law in terms of HIPAA and other things. And we should care about our financial doings and sometimes our traveling because this is where it crosses over to cybersecurity.

[24:40] If you're being watched on social media and you say, I'm going to be gone for two weeks,

[24:47] It used to be that they'd look for your mail piling up or no lights in the house. Now they know you're not there; you're a target, and now it becomes a physical crime.

[24:56] So all this stuff can definitely be used. So you have to really be judicious in how you determine what you share.

[25:03] And privacy is so misunderstood, like you said, and so contextual. But it really is who we are, and we're losing that in our new technological world, because everyone is basically reduced to an identity being shared and talked about, a statistic for marketing or whatever.

[25:19] And what makes us human is our uniqueness and our privacy and our character, our personalities. And when that gets basically overused or breached, then we become a statistic.

[25:31] Debbie Reynolds: Now for businesses. And this is something I see a lot. And I want your thoughts. And so I look very closely whenever I see a report of a data breach or things like that.

[25:42] And I look because I want to figure out like what, what was breached, why it was breached, like what happened.

[25:49] And a lot of these stories that I see is because it's like companies retaining more data than they should,

[25:56] like maybe they're like old customer lists or stuff that didn't have a high business value that maybe wasn't protected the same way as the crown jewels of an organization.

[26:06] And so I see this happening,

[26:09] I see a crossover happening there.

[26:11] It has privacy implications and also cyber implications, which is the over-retention of data. But I want your thoughts.

[26:19] Chuck Brooks: I think it's a big problem. I mean, again, that's why segmenting the critical data away from the non-essential data is a really good practice if you're in a business.

[26:28] And you also have to realize there are other implications. If you've had employees who left but human resources didn't take away all their administrative privileges, you may be susceptible to insider threats too.

[26:41] They may have taken some of that data. I've seen that actually firsthand with a pharmaceutical company, where someone stole data while they were at the company and sent it to their next job.

[26:50] So I think you have to look at that whole picture and actually have, basically, a strategy or framework for what kind of data you have and how you value it.

[26:59] It's sort of like having a hierarchy of data: what's really important. Obviously the personnel files are, by most laws, private, and you have to protect those. But again, you're right about past customers, and people do that all the time.

[27:13] They take past customer lists or try to get them. As a leader, as the C-suite, you need to dictate, through the procedures in your company, those policies of what is protected, what data can't be removed from the company, what can't be on your iPhone or your Galaxy,

[27:28] whatever it is. You need to be really strict about those, particularly nowadays, when everything in the open is susceptible to being stolen.

[27:35] Debbie Reynolds: It's very difficult, very difficult.

[27:37] I also want to talk to you a little bit about data minimization.

[27:41] So I feel like that conversation sometimes can get lost, because right now, especially with AI looming large, companies are trying to get as much data as possible.

[27:53] But from a cyber and a privacy perspective, minimization can definitely help. And part of minimization is really figuring out what you need versus what you don't really need.

[28:04] Chuck Brooks: Yes.

[28:04] Debbie Reynolds: What's your thoughts?

[28:06] Chuck Brooks: Yeah, well, it makes sense in life too. There's an ergonomic aspect: when you're lean and mean,

[28:13] you can access the data that's really useful more quickly.

[28:16] And you're right, minimization is usually the better way, because things become obsolete. In the era we're in, things that are two years old are sometimes already obsolete. It used to be a much slower timetable, and people would go back and they'd have files and this and that, but not anymore.

[28:30] And plus they're all digital.

[28:32] So it's also a question of where you store the data. You don't have to have everything on prem.

[28:37] And if you really want to store a lot of it, you should maybe do it hybrid, or in the cloud or wherever. So that's another issue too.

[28:43] But I think the minimalist approach works well in life. Even with where you live: in a house, you can accumulate too many things and you don't appreciate them.

[28:52] Same with business. If you have too many things, you can't get to the thing you need, and you don't appreciate the things that are high value.

[28:59] Debbie Reynolds: Absolutely.

[29:00] And I want your thoughts on quantum.

[29:04] So I do a lot of presentations about quantum and emerging tech. That's my area.

[29:10] But I feel like because quantum isn't here yet, people can't feel it, touch it, see it or do anything.

[29:19] They don't realize what a big shift,

what quantum will do to the things that they're doing now and how that's going to impact them. But give me your thoughts.

[29:30] Chuck Brooks: Yeah, we are definitely in the quantum era right now. If you had said this five years ago, people would have said that's crazy, it's 25 or 30 years off. But the technology is catching up really quickly.

[29:41] And there are different types of quantum, and that's where I think people misunderstand. They're thinking of this big gate-based computer that could have a million qubits, that could basically be a super mind that could control all the data in the world.

[29:54] And we're going towards that. You know, the Googles and the IBMs are trying to build that. But the other types of quantum are algorithms and sensing devices. We use it for GPS already.

[30:04] And then there's a thing called quantum photonics, which is off the shelf right now.

[30:09] Quantum basically uses particles and entanglement so you can do many things at the same time, unlike classical ones and zeros. I won't go into the details, but it basically means it's millions and millions of times more powerful in its speed and in its collection and analysis of data.

[30:26] So it's like a super brain. But you can do limited amounts of that now with these photonic systems, which are basically based on light.

[30:34] They're more stable than the particle-based qubits being used right now. Not as powerful and not as capable, but still, it's here.

[30:43] So I think one of the big problems with quantum in our preparations is that everything that's encrypted right now will be vulnerable to being decrypted.

[30:53] So you're talking about all your financial and banking data, and all the other organizations and banks around the world. You're talking about national security secrets. You're talking about everything.

[31:02] So if that happens, which is called Q Day,

[31:05] then whoever has that capability will reign supreme. It's really a superpower. But it's not just quantum; they're going to combine quantum and artificial intelligence, so you're building the ability to pull up the data and analyze the data with AI augmentation.

[31:24] It's going to have a great impact on creating new medicines and new materials, like it is already.

[31:30] But there's the other side too: it could be very destructive for us. Just like we were talking about at the beginning, technology has two sides. But with quantum, get ready.

[31:38] And there are a lot of quantum companies out there that have received a lot of money, just like AI. Not quite as much yet, but still in the billions.

[31:45] And you're going to see a big quantum curve within five years. And with the Department of Defense, they're already saying that by 2035 you have to have quantum-resistant algorithms.

[31:55] So if you're not preparing now and you're doing business with government, or if you're in an area where you're handling sensitive data,

[32:02] you need to know this, and you need to go find the recommended algorithms from NIST and others that are already good right now. But quantum is here, it's coming, and it's going to be bigger.
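Chuck's advice to inventory your cryptography against the NIST recommendations can be sketched as a small audit helper. The status table below reflects widely published guidance (Shor's algorithm breaks RSA/ECC; Grover's roughly halves effective symmetric key strength; NIST standardized ML-KEM and ML-DSA in FIPS 203 and 204). The `audit` function and the example inventory are hypothetical illustrations, not a real assessment tool.

```python
# Hypothetical sketch: classify algorithms in a crypto inventory by their
# resistance to known quantum attacks, per published NIST guidance.
QUANTUM_STATUS = {
    "RSA-2048": "broken by Shor's algorithm -- plan migration",
    "ECDSA-P256": "broken by Shor's algorithm -- plan migration",
    "AES-128": "weakened by Grover (~64-bit effective) -- prefer AES-256",
    "AES-256": "considered quantum-resistant (~128-bit effective)",
    "ML-KEM (Kyber)": "NIST post-quantum KEM, FIPS 203",
    "ML-DSA (Dilithium)": "NIST post-quantum signature, FIPS 204",
}

def audit(inventory):
    """Return (algorithm, status) pairs; unknown entries are flagged for review."""
    return [(alg, QUANTUM_STATUS.get(alg, "unknown -- review manually"))
            for alg in inventory]

for alg, status in audit(["RSA-2048", "AES-256", "ML-KEM (Kyber)"]):
    print(f"{alg}: {status}")
```

In practice an inventory like this is the first step of the "harvest now, decrypt later" risk assessment: anything flagged as Shor-vulnerable protects data that an adversary could record today and decrypt after Q-Day.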

[32:11] Debbie Reynolds: I think people are imagining it as this super global brain computer, whereas we're talking about different flavors and different sizes and different ways in which quantum can impact different things.

[32:29] Including encryption.

[32:32] Chuck Brooks: Yes, absolutely.

[32:33] Debbie Reynolds: I want your thoughts. So if it were the world according to you, Chuck, and we did everything you said, what would be your wish for privacy anywhere in the world? Whether that be human behavior,

[32:46] regulation or technology?

[32:49] Chuck Brooks: Well, the problem with regulation,

[32:51] and there is some good regulation. Europe is a good example with GDPR, where they regulate your privacy, they keep track of anything that's being used without your permission, and they fine you.

[33:00] It works in a limited way. It won't work in the United States because a lot of the lobbies here don't want us to have privacy.

[33:06] They want to be able to use your data to sell to you, and that's how they make their money.

[33:11] So I think it would be the technology side for privacy. I think that would have to be some sort of a super encryption.

[33:18] And I think there are some things in the works that may be that way, that are not based on your numbers, that really make sure your identity is protected, what you communicate is protected, and that quantum and AI can't steal it from you.

[33:32] And I think that's where a lot of the research is going right now. And there's some really interesting stuff that I've seen initially that I think has a great amount of potential, so that you can't take a document and change it.

[33:45] If you're buying a diamond and you want to know it's real, you can't fake it, you can't switch it with a counterfeit. I think we're nearing that kind of world, and I think that's going to be the solution.

[33:59] Maybe not the total solution, but it'll be a solution, because I don't think humans have, collectively, the focus and the patience to really do what they need to do to protect themselves, whether it be privacy or cybersecurity.

[34:12] And I think they have to rely on help, because

[34:14] with the sophisticated technologies that are out there,

[34:17] they're not capable of doing it themselves. So governments need to work closely with the private sector to do that too. But watch and see what happens with new forms of protecting our identity.

[34:27] And I think, you know, in the meantime we've got to stick with multi-factor authentication and strong passwords. But until we reach those points, I think we're going to be in a very precarious situation.

[34:39] Debbie Reynolds: Yeah. I want your thoughts about the prospects of more on device computing.

[34:46] So the ability for people to not have to share everything in order to use a service or a product. I mean, do you think that could be part of a solution?

[34:57] Chuck Brooks: I think it's a good solution. I don't know how you can enforce it.

[35:01] It's really a monopoly of the big providers that control that, and they may be competing among each other, but they don't want to change the rules.

[35:09] So yeah, I think it would be good, I mean, if you had that kind of capability to control your own destiny, as you said, rather than relying on a platform that tells you where you can go and monitors everything you do.

[35:21] But right now it's a difficult scenario, because everything we do is leaving a digital footprint, and that goes to privacy, that goes to security. If you look at it, it can be frightening when you think about it.

[35:33] So I think you really have to look at things in focus, and just like you secure your house and put a bolt on it or a Ring doorbell with a camera or something, you have to start looking at doing that with your computer and your phone.

[35:45] Debbie Reynolds: Yeah, that's all great advice.

[35:48] Well, thank you so much, Chuck. It's amazing that you were on the show. So happy that we were able to chat today.

[35:54] People. Please follow Chuck. He's very prolific.

[35:57] You're everywhere all at once,

[36:01] which is really cool. And yeah, I'd love to keep in touch.

[36:05] Chuck Brooks: Absolutely. And if I come back home to Chicago, I'll definitely say hello.

[36:10] Debbie Reynolds: Excellent. All right, talk to you soon.

[36:12] Chuck Brooks: Take care.

[36:13] Debbie Reynolds: All right, bye-bye.
