E237 - Matthew Waddell, Founder, Tactically Secure
[00:00] Debbie Reynolds: The personal views expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.
[00:12] Hello, my name is Debbie Reynolds. They call me the Data Diva. This is the Data Diva Talks Privacy podcast, where we discuss data privacy issues with industry leaders around the world, with information that businesses need to know.
[00:25] Now, I have a very special guest on the show, Matthew Waddell. He is a ransomware expert and the CEO of Tactically Secure. Welcome.
[00:37] Matthew Waddell: Hi. Thank you, Debbie.
[00:38] Debbie Reynolds: Yeah, well, we met on LinkedIn, as these things typically happen, and you and I had a conversation, had a call, and the types of things that we talked about were fascinating.
[00:50] I was really fascinated by your deep knowledge in technology and risk, especially around privacy and data use in general. And I thought you'd be a great person to have on the show, because we had a great time talking.
[01:07] I feel like our talk could have been a podcast in and of itself, but yeah, why don't you introduce yourself to the audience and tell us about your trajectory in tech and how you came to be as part of Tactically Secure.
[01:20] Matthew Waddell: Sure, Debbie. Thanks. My name is Matthew Waddell. I'm a security evangelist. I help make security simple and protect people from ransomware and other serious digital attacks. Security is really my ikigai.
[01:33] It's the Japanese concept of fulfilling purpose in life. It's found at the intersection of doing what you love, being good at what you do, something that the world needs, and something that you can ultimately be rewarded for.
[01:45] And I'm so grateful to have had such an amazing career. I've worked deep inside several of the three-letter government agencies that you've heard of, and at least one that you haven't.
[01:54] I've spent a career investigating all the unique ways that governments and people break into networks. And I have a reputation for being the one they send in to keep things quiet, for sensitive incidents that would have been newsworthy.
[02:07] I have a reputation for discretion and managing sensitive incidents that require a delicate approach. Fortune sized companies have trusted me to create, repair and lead their Internet security teams and guide them through large incidents.
[02:20] Additionally, I spent four tours on the front lines of Iraq and Afghanistan. I worked as a civilian subject matter expert in forensic investigations, and I directly helped our soldiers locate sensitive or difficult targets and complete military objectives, usually while wearing a flak jacket and a helmet, trying not to get shot at.
[02:38] In my personal life, I've traveled around the world twice. I've been to over 60 countries and crossed continents by motorcycle, and I'm constantly daydreaming about my next adventure. I married my childhood sweetheart, and we're raising our adult autistic son together, who requires 24/7 care.
[02:55] And he teaches me patience beyond anything I've ever experienced personally in the government or fortune level companies.
[03:03] I love to teach and I'm offering extensive experience to help people handle the modern threats like ransomware and privacy concerns. And I do this by helping them understand the threats they face, showing them how they can prepare for them and teaching them how to navigate a difficult path.
[03:18] So thank you for having me, Debbie. I'm excited to be here.
[03:21] Debbie Reynolds: I'm excited to have you here. You definitely have a lot going on. I would love to know, because this is an international audience, and very much a business audience: what is it that businesses don't know now that they need to know about ransomware, or the types of threats they are facing today?
[03:42] Matthew Waddell: That's a great question. I believe that data privacy measures directly correlate with cybersecurity, and they must align in order to achieve the best impact. The core essence of all the privacy frameworks is helping users protect their data.
[04:01] Data privacy rights and actions need to co-align with cybersecurity. They work together, along the same path, and if they align, they can achieve the best impact.
[04:15] Ransomware is a constantly evolving threat, and the most recent thing we're seeing is attackers taking whatever data they've collected and posting it online for everyone to see. So it's directly a privacy-related act.
[04:27] If your phone company gets attacked and a ransomware group gathers your information, they will post your information online as a lever to push the company to pay the ransom, as an extortion attempt.
[04:40] There are many other avenues, but most cybersecurity work tends to be around stopping people from stealing data, and that directly correlates with privacy, in that you want to keep your data secure. I can do what I can to keep my data secure, but I have to trust other companies with my data.
[05:00] And what happens when they're compromised or what if they do a bad job of securing my data?
[05:05] Debbie Reynolds: Do you think that the onslaught of these new AI tools, which are more commercially available and very cheap and easy to use, is helping companies, or helping bad actors do more ransomware stuff?
[05:24] Matthew Waddell: AI is a tool. I understand the difference between large language models and neural networks, but I think that language evolves, and we all understand that when we just say AI, we mean any of these machine tools. Just to define that scope.
[05:39] It is a useful tool, but it can be malicious. It's like a hammer: I can use a hammer to build a house, or I can use a hammer to hurt someone.
[05:50] It depends on the hands that it's in. And threat actors absolutely have access to unfiltered AIs that are doing all sorts of things to help create better phishing emails and ways of getting into systems, because they're using it like a cybersecurity expert would: to look for vulnerabilities and for ways to compromise.
[06:11] But the good guys can use it in their defense, to help them look for vulnerabilities and for better ways of doing things. I think it is a foremost part of the industry and of the Internet right now.
[06:24] The main problem I see is that businesses, everyone, are slapping AI onto whatever process and product they have without fully testing the implementations.
[06:38] I think that a new field may evolve where it's cybersecurity experts using known prompts that will try and break an AI to kick out its background data and background information in order to discover new threats and new ways of doing things.
[06:56] So maybe a new job market we haven't invented yet?
[07:00] Debbie Reynolds: Oh, I think so, definitely. Right. As you say, it is a tool, so it can be used for good or bad purposes. So I can definitely see it being used on more of the offensive side of cyber, where you're trying to seek out different threats and things like that.
[07:15] What is happening in the world that's concerning you most right now in your area of expertise?
[07:23] Matthew Waddell: That's a good one.
[07:25] I think that a lot of businesses are putting the business first mentality, which they should. A business's lifeblood is to make money and to grow and to expand, but they're not, they're doing what, what we call in our industry, bolting on security later instead of baking it in, in the beginning.
[07:42] So think of a cake: you make a cake and you forget to put the sugar in. It's really difficult to put that in after the cake is already made.
[07:49] Security needs to be established in the foundation. When you open the physical doors to your building, to whatever restaurant you're opening, it has locks on the door. It has controls around where the money is, and controls around the resources and anything that could be stolen.
[08:08] Digital companies aren't thinking in the same way, mostly because they don't understand the threats against them. The threats are somewhat invisible; unless you're in the industry, you're not aware of them, and companies just think they can put something up there,
[08:20] get a password in place, and that's enough. I think the problem is businesses driving forward and treating security as an afterthought, instead of saying, we need to make sure this is done securely as we build it.
[08:33] It's very difficult to put it in afterwards. And I think that's one of the bigger problems we're seeing, and why we're seeing a lot of breaches and a lot of privacy-related incidents: companies are trying to bolt it on afterwards instead of baking it in as they build.
[08:49] Debbie Reynolds: Yeah, I agree with that. I want your thoughts a bit about regulation. In my view, people put too much emphasis on regulation. Some of the things that we need to be doing from a cybersecurity or privacy perspective, we should be doing because they help the business.
[09:08] Right. Not necessarily because they think they're going to get slapped on the hand by a regulation. But I want your thoughts on that.
[09:16] Matthew Waddell: I actually disagree with that, but that's what makes great conversation, right? The core similarity between the regulations, and to define that, I mean GDPR, CCPA, and the other privacy regulations, is that they're mostly centered on the area they can control, GDPR being around the UK and such.
[09:37] So if you do business and have to deal with someone in the UK, then you have to comply with the GDPR.
[09:44] Most of these regulations, essentially all of them, are frameworks based around protecting personal data, such as the right to access your data or to restrict its future processing.
[09:58] The only way that a government body can really enforce something is to say, here's the law that we need to follow, and if you break this law, here's the fine. That affects businesses most, because businesses are principally built and run around making money.
[10:14] And so it is a direct pull from that. It's really the only way that a regulating body, such as a government, can help enforce protections for its citizens and the people within its jurisdiction.
[10:26] And I think that most data regulations are actually more work for the businesses, in that they're principally individual-focused. I actually have a lot of hope for how these are coming about.
[10:38] But we're in the very early stages. They've only been around for a couple of years, as regulators learn that this is a new threat and try to adapt and find a way to counter it.
[10:50] Debbie Reynolds: So is your thought that companies wouldn't do the right things unless they had regulation?
[10:55] Matthew Waddell: I think that it would only take one or two companies that have your data deciding that business was more important than your own personal privacy and maybe their thoughts around how they own the data.
[11:08] Like, you've given us this, we own it now instead of it being, well, my Social Security number is my Social Security number and I use it with my bank. But they don't own it, they don't control it.
[11:20] And then it takes a regulating body to say, you have to protect this, to protect individuals from the bank, where maybe it's lower down on the priority list. Maybe they're not being malicious; they're just lazy.
[11:34] Because most of a business is geared toward running the business and driving it forward faster, priorities for things that slow it down get deprioritized.
[11:48] Debbie Reynolds: I guess that's true. I'm of two minds. I feel like regulation is an important part of the puzzle, but I feel like people put too much onus on regulation.
[12:03] I think there are drivers that can make it hard for businesses to do business if they don't protect data in a certain way. So it's almost like shooting themselves in the foot with the way that they handle data.
[12:20] But especially with these companies that are maybe the too-big-to-fail type of companies, maybe they would need a regulation, because they're so big and powerful that maybe they don't care as much if they lose a certain amount of customers.
[12:35] But yeah, and maybe this goes back to, I know we talked a little bit about the regulation in the EU. The US doesn't have as many regulations around cyber, privacy, or AI, not in the way it's being done in other countries.
[12:54] Do you think that's a help or a hindrance? I guess to businesses it's a little bit of both.
[13:00] Matthew Waddell: As a small business owner, I'm incorporated as an LLC in Wyoming. But let's say I want to talk to someone in London, and they give me their information and I write it down somewhere.
[13:11] Just because they're in London, I'm bound to handle their data based upon the GDPR. Then what if I talk to someone in California? Completely different regulations that I have to follow. And I'm just a guy trying to do business with an international audience, and I have to adapt to and handle whatever location they're in.
[13:33] Say that person gives me sensitive information, and I share it with someone else and violate their trust. They can prosecute me, in their location, for violating their data protection laws.
[13:51] Whereas wherever I'm located may not have the same laws. So it's a pendulum that is trying to find its middle path, and sometimes it swings too far to the control side and sometimes too far to the no-control side.
[14:06] I really think that America is trying to figure out its place in this. It's watching a lot of other countries develop these laws; you have forward-thinking states like California that go ahead and develop things, and there are the Canadian laws, and America is really trying to see what happens with these and evaluate, so that it can ultimately make a better decision.
[14:28] But they're also waiting too long. There's a certain amount of information you can collect before you need to move forward, and that's usually around 80%.
[14:38] You'll never have a hundred percent of the information you need to make the decision. So I think they should go ahead and create some data laws, but open enough that they're not a hindrance to business, because the legal obligations that a small business, a one-person consulting practice, has to follow amount to a large collection of paperwork that I have to follow, understand, and know.
[15:03] But I happen to be in this industry. I can't imagine someone non-technical who just wants to help someone online and has to obey all of these foreign laws they're not even aware of.
[15:13] They could break the law without being aware of it.
[15:17] Debbie Reynolds: What do you think about things like deepfakes, whether that be voice fakes or video deepfakes? The risk of that, I feel like people think, is a future problem.
[15:32] I think it's a today, right-now problem.
[15:35] Matthew Waddell: Oh, it's absolutely happening right now. I can't quote it for you, but if you Google it, someone in finance fell for a deepfake video of the CFO.
[15:48] I'm probably messing this all up, but they fell for it and lost millions of dollars, because someone talked to them and it was a deepfake. And there was a company recently that hired a North Korean spy who was using deepfakes to come across as a different type of person.
[16:06] This is a real thing that is happening. My recommendation for your readers and listeners: for the people that you trust, your family and your friends, inform them that if someone were to contact them and ask for money, they should really establish that this is the real person they're talking to.
[16:24] Ask questions only the real person would know, or come up with a keyword or something. Because my dad is fairly old, and he could receive robocalls or deepfake-type calls that sound like me, pulled from something like this podcast. Someone could call my dad and say, hey, I'm in jail, I need some money.
[16:43] And I would want him to verify that it was really me. This is real and absolutely happening now. It isn't the future; if anything, it's
[16:53] current and past. There are known cases of deepfake video, audio, and text, based upon gathering all of the posts that you've made and then writing something in your tone of voice and your way of speaking that is 100% convincing.
[17:10] We need secondary controls, and when I say secondary controls, I mean soft things like, hey, prove that you're really you. You know me; tell me something that only I would know.
[17:22] Kind of like tell me a childhood story or something to establish truth of identity.
[17:27] Debbie Reynolds: A lot of the talk, especially in policy and political circles, around these advancements in AI that can do deepfakes and things like that, has been, well, we need to find a way to label a video, or a voice, if it's fake.
[17:45] In a way, I feel like we need to do it the other way around: try to establish what's real, because I think the fake will outnumber the real.
[17:59] So maybe figuring out what's authentic is probably easier than trying to figure out what's fake. What do you think?
[18:05] Matthew Waddell: The genie is already out of the bottle. They want to add on some sort of fingerprint or signature that you can identify these by, but I could open a cloud environment and set up and train an LLM on my own to do these things, without any of the censorship.
[18:23] And these exist on the deep and dark web; you can pay for access, and they have no filtering. You can do whatever you want, without any of the controls that the normal AIs are trying to put in.
[18:37] So you're absolutely correct, something needs to happen. If anyone can figure that out, they're smarter than me, which most people are. But how to protect against this is important, and we should, as an industry, try to figure that out rather than bolt on a security solution that is easily removed.
[18:58] Debbie Reynolds: I don't know, when I read these cyber reports, or the annual reports that come out about cyber threats and breaches and things like that, there seem to be more and more breaches, and they're more and more expensive.
[19:14] How do we turn the tide or turn the corner on that? Do you have any thoughts?
[19:19] Matthew Waddell: Yes. Security basics.
[19:21] There are like seven key things that, if every company did them, you wouldn't hear the word breach in the news anymore. If companies followed just the core basics of cybersecurity, they wouldn't have problems with breaches anymore.
[19:36] Most users tend to be lazy, so they will use the same password everywhere. In a lot of breaches that I've handled, we were unable to determine the actual cause, but we looked and thought, oh well, he used Winter2023 as a password.
[19:50] That's a very poor password. It's very short and very easily guessed by any password-cracking software. If he had used a long and unique password for every account, the threat actor would have only been able to access one little compartment.
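The weakness of "Winter2023"-style passwords can be sketched as a toy checker. This is my own illustration, not a tool Matthew mentions; the season-plus-year pattern and the 14-character cutoff are assumptions chosen for demonstration.

```python
import re

# Toy checker for the weaknesses described above: short passwords and the
# predictable season-plus-year pattern that every cracking wordlist covers.
SEASON_YEAR = re.compile(
    r"^(spring|summer|fall|autumn|winter)[-_]?\d{2,4}[!@#$%]?$",
    re.IGNORECASE,
)

def weaknesses(password: str) -> list[str]:
    problems = []
    if len(password) < 14:  # arbitrary illustrative threshold
        problems.append("too short")
    if SEASON_YEAR.match(password):
        problems.append("season+year pattern")
    return problems

print(weaknesses("Winter2023"))                       # flags both weaknesses
print(weaknesses("long unique passphrases beat reuse"))  # flags nothing
```

A long, unique passphrase per account passes this toy check for the same reason it resists real cracking: length and unpredictability.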
[20:05] MFA: turn that on for everything.
[20:09] Keep your software up to date.
[20:13] Fairly recently, the CrowdStrike thing was a case of updating software without using proper vulnerability management. But when software updates come out, it is the owner of the software saying, we found a problem that someone can use against you.
[20:30] Here's a fix. Most software updates, 99% of them, don't break things and are a good idea. Companies should use antivirus and endpoint protection; they do two different things. Antivirus looks for known-signature malware, things that are in use out there.
[20:47] And endpoint protection uses heuristics to look at behavior: why did you just encrypt a bunch of files? Maybe I should look into that a little more deeply. Okay, maybe I should block this.
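The signature-versus-behavior contrast Matthew draws can be sketched roughly as follows. The signature set, event shapes, and threshold are all invented for illustration; real antivirus and EDR products are far more sophisticated.

```python
import hashlib
from collections import Counter

# Antivirus-style check: compare a file against known-bad signatures.
# The entry below is the SHA-256 of an empty file, standing in for a
# real signature database.
KNOWN_BAD_SHA256 = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def antivirus_flags(file_bytes: bytes) -> bool:
    return hashlib.sha256(file_bytes).hexdigest() in KNOWN_BAD_SHA256

# EDR-style heuristic: flag any process that suddenly encrypts many files,
# the "why did you just encrypt a bunch of files?" behavior described above.
def edr_flags(events: list[tuple[str, str]], threshold: int = 50) -> set[str]:
    encrypt_counts = Counter(pid for pid, action in events if action == "encrypt_file")
    return {pid for pid, count in encrypt_counts.items() if count >= threshold}
```

The point of the pairing: the first function only catches what is already known, while the second can catch novel ransomware purely from how it behaves.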
[20:58] So, training employees. User training is the number one lever that any company can pull. Let them know what ransomware looks like, let them know what a phishing email looks like, teach them how to be more secure.
[21:10] They're the front line of security. Backups, and creating an incident response plan for how to behave when there is an incident, are critical. Those are the seven things that all companies should do, and if they did them successfully across the board, there would not be a breach.
[21:28] Debbie Reynolds: It seems so simple, but why is it so difficult?
[21:31] Matthew Waddell: I think it's overlooked because it's simple. It's like your doctor telling us we need to do these things in order to be healthy individuals. And we go, that's great.
[21:43] And then we go home, smoke a cigarette, and eat a donut for breakfast. It's the simple things that need to be done. Most masters of an industry will harp on these simple basics.
[21:55] If you're an artist, you will talk about the essential basics of how to do art. If you're doing woodworking, the very basics. The advanced stuff is useful, but it kind of muddies the problem, and we start to look at those things as the solution.
[22:12] Debbie Reynolds: Yeah, I feel like a lot of times we think that technology is the solution to everything.
[22:17] And I feel as though even in a privacy and a cyber world there are a lot of like low tech, no cost things that you could do to just be safer in your environment.
[22:29] And I feel like a lot of times we skip over that and think, oh, this technology is going to save us from this horrible thing that's going to happen, where really some of those basic things, if people practice them and make good habits, will make them less of a target.
[22:48] Matthew Waddell: Yeah, I've seen so many Fortune-level companies that have million-dollar security solutions that they don't use well, that they've just thrown at the problem. Because, oh, if we just had to train our employees, that's too easy.
[23:03] It doesn't take very long.
[23:05] That can't be the good solution. Train the employees on what to look for, because that's how the threat actor is going to get in: through a regular employee.
[23:14] They're the front line of defense. They're the villagers outside of the castle, tilling the ground and doing everyday things. And they're the ones that see the horde coming over the hill and go back and tell the king, and then they close up the castle.
[23:28] It's a terrible analogy, but I'll try and use it anyway. The employees are the front line of defense. They will see the initial attack. Most incidents that I've handled, when we've traced them all the way back to the beginning, always start with a, hmm, that's interesting.
[23:45] That shouldn't happen. And if we can train the employees to look for these things, to spot the common indicators of an attack, then you will have stopped it before it's ever really gotten into the system and grown into a cancer.
[23:59] Debbie Reynolds: Yeah. I want your thoughts on insider threats. One of the things. And you're smiling. One of the things about insider threats that I tell people is that for some reason people assume that an insider threat is malicious.
[24:15] And I think probably the majority of them are not. Going back to your talk about
[24:21] the employee, they're just kind of doing their job, and maybe they see something that's kind of off. Or they may be doing something where they either haven't had proper training or don't have the proper knowledge of why it's important or why it's risky.
[24:35] And so I just want your thoughts on that because I feel like if you assume that insider threat is always malicious, I feel like you're going to miss the majority of the insider threat problems within the organization.
[24:48] Matthew Waddell: You're absolutely correct. Statistically, 30% of your workforce is an insider threat, but you're absolutely correct that it's not malicious. It could be someone who just doesn't care.
[24:59] You say, I have to use a new password for this; I'm just going to throw one of my older passwords at it. That's an insider threat, in a way.
[25:07] Having access to more than they need to do their job allows them to be leveraged by a threat actor. And then of course, you have people who legitimately want to take data from the company to use elsewhere,
[25:21] or who have some beef with someone at the company, or with the company itself, and decide to change things inside the environment in ways that will be hurtful.
[25:30] If three people are working together, statistically, one of them is going to hurt the group from a digital privacy and cybersecurity standpoint.
[25:40] Debbie Reynolds: Yeah.
[25:42] I wish that you could scream that message from the rooftops because I feel like a lot of companies.
[25:48] And to me this translates into tools as well, where some of these tools that are trying to detect behavior are looking for anomalies that may be malicious.
[26:03] But I feel like a lot of these anomalies aren't malicious.
[26:07] So that may fly under the radar. It gives people a false sense of security: oh, we're tracking everything. But you don't really understand what's actually happening within the organization.
[26:17] It may not come up on the radar of tools that are only looking for malicious threats.
[26:23] Matthew Waddell: Yeah, DLP, or data loss prevention, tools do a lot of good to help companies protect the data and the systems inside. But I've seen so many companies put all of the good DLP in, and then no one's watching it, so it wouldn't matter what you did.
[26:41] It's logging it, sure, but no one's reading the logs.
[26:46] The EDR solutions look at behavior, but that's mostly not aimed at what the person is doing; it's mostly looking for malware-type events. And so they're still learning how to behave.
[27:00] They use a primitive form of AI in the back end, and they still learn what to do and what not to do. And we do have a lot of false positives in cybersecurity.
[27:09] It is the number one thing that all cybersecurity experts would take away if they could: the amount of false positives that we have to deal with.
[27:18] Debbie Reynolds: Yeah, that's so true. I remember I was working with an organization, and they kept getting alerts on some documents that someone was using. And the alert kept saying, oh, this person is transmitting a Social Security number.
[27:32] And what it was was a document number, in a document management system, that had that same number of digits. It couldn't tell the difference between the two. Right.
[27:41] And so it was looking for the...
[27:44] Matthew Waddell: The 3-2-4-with-dashes setup, and it said, there's a Social Security number in this, and it probably just kept alerting, because there's no good way to turn that off without turning it off across a larger swath of things.
[27:58] Debbie Reynolds: Exactly.
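The mix-up they describe is easy to reproduce: a DLP rule keyed only on the 3-2-4 digit shape cannot tell a Social Security number from a document ID in the same format. The pattern below is the standard shape check; the sample values are made up.

```python
import re

# A naive DLP rule: any dashed 3-2-4 digit group is treated as an SSN.
SSN_SHAPE = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

samples = [
    "Employee SSN on file: 219-09-9999",    # genuinely sensitive
    "See document 482-17-3056 in the DMS",  # harmless ID, same shape
]
for text in samples:
    # Both lines match: the second is exactly the false positive described.
    print(bool(SSN_SHAPE.search(text)), text)
```

Shape alone carries no context, which is why tuning such a rule usually means exempting whole document classes rather than individual matches.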
[27:59] Matthew Waddell: I was smiling earlier because I've had to handle a lot of insider threats, up to the level of counterintelligence investigations, where known threats were coordinating with each other, and we worked to stop them from releasing very damaging information.
[28:15] That was the smile I had earlier.
[28:20] Debbie Reynolds: Well, I'm glad, I'm glad we have minds like yours that are thinking about these issues and are definitely on the case.
[28:27] Matthew Waddell: Well, thank you.
[28:29] Debbie Reynolds: So if it were the world according to you, and we did everything you said, what would be your wish for either cybersecurity or privacy? Whether that be in the realm of maybe regulations like new or changing regulations, technology, new or emerging technologies, or in the realm of human behavior like how people act or how people conduct themselves in business settings around data.
[28:59] Matthew Waddell: Okay, that's a good question.
[29:02] First off, I shouldn't run the world; I would probably have some very out-there beliefs. As a cybersecurity professional, I do my best to teach other people how to behave, because they don't understand the full gamut of threats that they're facing.
[29:16] But let me answer this in a different way, more in tune with your audience. This is privacy-centric, and privacy and security basically go hand in hand.
[29:27] Security keeps your stuff yours, ideally, under good circumstances. I believe that my personal privacy depends on the individual threat that I'm facing. So my defenses, how I will behave, and this applies to the whole world as to how anyone should behave, depend on the threat to my privacy.
[29:48] If your threat is more local, say I don't trust my neighbor, or I don't trust my brother-in-law because I think he's out to get me or something like that,
[29:57] then the manner in which I protect myself is much smaller, much more concentric. I will make sure he doesn't have access to my phone. I'll use a good password on my phone, or biometrics.
[30:09] I will control the access to the data, and it's a little easier from that smaller standpoint. That's the center of a concentric circle. So let's move out a little more and say that I'm a little more concerned about big data:
[30:22] Google, Apple, Microsoft, and data brokers handling my data.
[30:27] The way that I protect myself changes, because the biometrics and good password controls on my phone and personal devices don't really help in that situation. They're not coming into my home to get my data; they already have my data.
[30:43] So I have to understand that what I put out there is possibly public, even though it was sent over an encrypted, secure channel, and be careful about what I share with large data brokers, because I no longer have good control of that data.
[31:03] That's something that everyone could learn to do: not trust big data brokers. And then the last concentric circle is, how do you protect yourself against a government that really wants to see what you're doing?
[31:13] And really, you have to become like my friend Bob, who has no phone, who checks email through Tails over multiple VPN connections, and who is very hard to get a hold of if I just want to go have lunch with him.
[31:27] That's the nation-state-level attack, and it could be your own nation state. There's no really good way to block against that without just not using the devices. With your phone off in a room while you're talking, it is recording you.
[31:43] That's why you can't take the batteries out of the phones anymore.
[31:46] If your phone is sitting in a room turned off and you're having a private conversation, and a nation-state actor wants to record that conversation, they will. There are methods of doing that without any technology in the room; they can record things from a classified distance away, sitting in a van, or completely remotely.
[32:10] All of these things are impossible to defend against for a nation state. If they want to get at your privacy, if they want to know what you're doing, they can.
[32:19] Citizens need to understand that our governments are not always our friends, kind of. But it's that pendulum; it swings back and forth depending upon who's in charge, or who's watching. Who watches the watchers?
[32:34] So I went a long way and didn't really answer your question. But I think people should understand that how you protect yourself depends on the threat itself.
[32:46] You protect yourself differently from someone close to you who physically has access, from a large data broker, and from a government. The way that you handle yourself is different once you've let go of your data, your personal information, any PII, personally identifiable information, which is any two points about you.
[33:07] My name is Matthew Waddell and I am 50 years old. Those are two pieces of information, and now I've created PII that, Debbie, I need you to protect. We can't release this podcast anymore, because I've told you some PII that I don't want you to let go.
[33:22] Actually, I do; I don't care about that PII. But there are other pieces of information, like my Social Security number, or net worth, or where I'm located.
[33:31] That is critical information, and I want to protect it. So in order to do that, I won't tell you, I won't tell a data broker, I won't put it into an app.
[33:41] I won't tell my government if I don't trust them.
[33:45] Debbie Reynolds: Wow, that's really deep. We'll have to get into that more at some point. It's so funny, because I had a call with a reporter about the phones, and I basically told her the same thing.
[33:54] She was really shocked, because people are like, oh, all I have to do is turn off the phone. I was like, that doesn't work.
[34:01] Matthew Waddell: Not in the slightest. It will slow down low technical threats.
[34:06] Debbie Reynolds: Right.
[34:06] Matthew Waddell: And that's all.
[34:07] Debbie Reynolds: Exactly. Exactly. Oh, wow. Wow. Well, we definitely have to talk again. This is amazing, Matthew, if someone wants to find you, what's the best place for them to look?
[34:18] Matthew Waddell: I've spent a career being behind the scenes and being encouraged not to have social media. I do have LinkedIn; it's LinkedIn/Matthew Waddell, that's M-A-T-T-H-E-W, dash, W-A-D-D-E-L-L. But an easier way is to find TacticallySecure.com,
[34:34] which is the only website that I have at this time. But I like to help people, and I like to teach, and security really is my ikigai.
[34:44] Debbie Reynolds: Excellent. Well, thank you so much. I really appreciate you being on the show, and I look forward to us being able to chat more in the future.
[34:52] Matthew Waddell: Thank you. It was a pleasure.
[34:53] Debbie Reynolds: All right, see you soon.
[34:55] Matthew Waddell: All right, thanks.