E145 - Stephen Lawton, Founder of AFAB Consulting LLC, Cybersecurity Expert and Technology Journalist
1:00:12
SUMMARY KEYWORDS
companies, privacy laws, cyber insurance, ai, privacy, data, cyber, smb, application, security, carriers, test, laws, people, network, protect, cloud, writing, technology, identify
SPEAKERS
Debbie Reynolds, Stephen Lawton
Debbie Reynolds 00:00
Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations. Hello, my name is Debbie Reynolds; they call me "The Data Diva". This is "The Data Diva" Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world with information that businesses need to know now. I have a special guest on the show, Stephen Lawton. He is the founder of AFAB Consulting LLC, and a consultant, journalist, and freelance writer. Welcome to the show.
Stephen Lawton 00:41
Thanks so much for having me.
Debbie Reynolds 00:43
Well, I always like to start off by saying how we met. You and I have a mutual friend, a Canadian, whom we both know, and we ended up on a couple of panels together, I believe, around finance. I was very impressed with your background and everything that you do, and over months and years we've had opportunities to collaborate on different things. You're a journalist, so you also write articles for Dark Reading, and you've called me up a couple of times to ask me questions about privacy. Actually, a really funny thing is that I use HARO, Help A Reporter Out, and just by happenstance I was glancing through your listings and saw something that I thought was really interesting, and I knew I was the right person; you know, I can help you with this. You were like, oh my God, I was trying to reach you anyway. So it just worked out; serendipity, I suppose, since we were trying to find each other. But why don't you talk about your background? You have a very unique background in media, journalism, and cyber. Tell me about that.
Stephen Lawton 01:51
Thanks so much for having me on the podcast; I really enjoy our conversations. I started off as a newspaper reporter in Los Angeles back in the 70s. I was a city-side, police-beat reporter covering the Hillside Strangler and all kinds of fun stories back in the early 70s. Then I started writing about technology in the late 70s, and I've been a technology journalist since then. I've been the chief editor of several publications, including MicroTimes magazine in California and Digital News & Review. Most recently, I was the editorial director for the content lab at SC Media; I spent six years there on staff and about five years before that contributing articles on data security and cybersecurity. But I've also been on the vendor side; I was the senior director of strategic marketing at Acronis, a backup, disaster recovery, and business continuity firm, in the early 2000s. So I've been around this industry for a very long time. I think the first webcast you and I did together was on privacy laws in Canada. That's right. And we've also, I think, done one other one on cyber insurance. I've been covering the cyber insurance market for a couple of years. In full disclosure, I do have a client in that industry, Global Cyber Risk Advisors. I'm one of their consultants, and we help organizations that have either lost their cyber insurance or can't qualify for it to rehabilitate their security so they can once again go and obtain a policy. So I am on the journalistic side, as you mentioned, with Dark Reading. I occasionally write for Insider, and I write for ITPro Today, which is from the same publisher as Dark Reading. And as I say, I've been writing professionally about technology for 40-some-odd years now.
Debbie Reynolds 04:26
You actually recommended me for an Insider webinar at Business Insider that I did, which was great.
Stephen Lawton 04:33
I did?
Debbie Reynolds 04:34
That was wonderful; thank you so much for the opportunity. It's interesting, your trajectory across media and technology. You've pretty much seen it all. What is surprising you now about technology and cyber?
Stephen Lawton 04:56
Well, one of the things that you and I have both looked at for many, many years, as you know, is how AI is moving into the security area. There are some real issues with AI. It has to be trained, and it has to be taught what is and is not an issue. Like any human analyst looking at these things, it needs to eliminate bias. If you train AI to have a bias against a certain type of person, a certain language, or a certain way of presenting an opinion to the AI, you can actually damage the entire AI program so that it returns false positives, or for that matter, false negatives. AI is great, or at least it will be at some point; it has a lot of promise. It's doing a pretty good job right now of identifying potential threats as they come into a network. You have some very, very good MDR programs that, at the top of the funnel, can identify a potential threat, process it out, and keep it out of the network until a human analyst looks at it and determines whether or not it's real. The problem, of course, is that a company will do a lot of development; they might have groups that are testing software or malware to make sure that their own systems can identify it. And if the AI, and this would actually apply to an MSSP or any other group doing third-party analysis for the company, isn't informed that these tests are going on, it can actually affect the test by saying, hey, this is a problem, let's get it off the network before it does any damage, even though it's a perfectly valid run of the malware or of the new code being tested. So it's great to be able to jump in and identify a potential threat early, but on the other hand, you don't want to damage the test. Case in point, I've been testing some software here at my home, and unfortunately my security software immediately killed the test. I didn't realize I was getting back false results, because my anti-malware software, which I had forgotten to turn off before I ran the test on that machine, identified it as a threat. So I fell victim to my own advice.
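To illustrate the kind of coordination Stephen describes, here is a minimal sketch, in Python, of a detection pipeline that consults a registered test window before quarantining a sample. The names (SCHEDULED_TESTS, Detection, handle) are hypothetical and only for illustration; real MDR and EDR products handle exclusions through their own consoles and APIs.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical registry of approved test windows: (host, start, end).
# In practice this would come from a change-management or ticketing system.
SCHEDULED_TESTS = [
    ("lab-win10-03", datetime(2023, 5, 1, 14, 0, tzinfo=timezone.utc),
                     datetime(2023, 5, 1, 18, 0, tzinfo=timezone.utc)),
]

@dataclass
class Detection:
    host: str
    signature: str
    seen_at: datetime

def in_test_window(det: Detection) -> bool:
    """Return True if the detection falls inside an approved test window."""
    return any(host == det.host and start <= det.seen_at <= end
               for host, start, end in SCHEDULED_TESTS)

def handle(det: Detection) -> str:
    # Route known test activity to an analyst instead of auto-quarantining,
    # so the test data is not invalidated by the security tooling itself.
    if in_test_window(det):
        return f"flag-for-review: {det.signature} on {det.host} (approved test window)"
    return f"quarantine: {det.signature} on {det.host}"

print(handle(Detection("lab-win10-03", "EICAR-Test-File",
                       datetime(2023, 5, 1, 15, 30, tzinfo=timezone.utc))))
```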
Debbie Reynolds 08:33
Right. If you're doing something that the system sees as an anomaly when you're just testing. I used to have an issue where, if you're moving lots of files, the system thinks someone's trying to steal something. It's like, no, I'm just trying to do my job. So we need to find ways, or maybe AI can help with that: maybe it knows Stephen does these types of things, so don't bother him and don't assume something weird is happening.
Stephen Lawton 08:59
At some point, you know, if I had identified, well, in this case it wasn't AI, it was simply security software. But if I had identified that this machine was going to be running this test at this time, it might not have caused the problem. Or I could have simply turned off the software entirely, which is what I ultimately had to do to run the test. I guess it's a good thing that machine was not online at the time, because I would have been inviting the bad guys in; normally I keep the software running all the time.
Debbie Reynolds 09:42
I named you one of my top global data experts in 2023.
Stephen Lawton 09:48
You did, and I thank you. Well, you know, I've been writing about tech for a long time, and I've got a lot of friends who write about tech. A lot of the people I've known and worked with for many years started out as journalists before they became technicians. Some started out as attorneys before they started writing about legal tech; others were programmers before they started writing about programming. But what I'm seeing today is that some journalists don't have that historical basis. They look at the cloud and say, wow, the cloud is wonderful, look at all the things you can do. Well, when I started writing, we called the cloud timesharing. It was just someone else's computer; you're sharing time on that computer, you're sharing access. Now we have a lot more cool ways of getting on and off the cloud and on and off other people's computers. But many of today's writers don't have that historical reference to understand where we were, what we're doing now, what we've missed, and what we've forgotten. When you're dealing with other people's computers, your data is on somebody else's system; you don't have access to that, and you don't know what they're going to do. There are some applications, for example, that are marketed as wonderful cloud-based apps, where you send them all of your writing, your data, whatever, and it goes through all of these gyrations and comes back to you processed and cleaned or whatever. But unless you know what's happening on the other end, unless you can actually trust that application or the people running it, there could potentially be an issue. One of the things we see today, because of the great strides we've made in technology, is that data centers are actually going back on-prem; they're a lot smaller than they used to be, a lot more compact, and quite powerful. I actually wrote about that for Data Center Knowledge, which was a sister publication to Dark Reading. Companies are bringing back some of the processing for their most important data, because sending it off to the cloud sounds great, but you have to remember that a cloud provider's primary goal is to protect its investment, not yours. They're protecting their infrastructure. Unless you're doing the job of protecting your data, you don't know if that data is being shifted elsewhere. You hear all the time about S3 buckets in the Amazon cloud being attacked; it's not Amazon's fault, it's that their clients aren't doing the job of protecting their own data. So unless you are working hard to protect your data every step of the way, you don't know where it is. You don't know, for example, if your data is being shifted from one country to another. GDPR, for example, restricts personal data that originates in the EU from leaving the EU without appropriate safeguards. But unless you know the changes your cloud providers are making, data that is supposed to stay in one jurisdiction may shift to a different country. Here in the US, we have internet lines that cross the US-Canadian border, not because there's anything terrible about that, but that just happens to be the way those lines are run; they cross borders. Technically, I believe that would be a violation. In Europe, that would be a terrible violation.
GDPR certainly is much more stringent than the US or Canadian regulations at this time. And the cloud providers, by the way, won't tell you when they do updates, because they're doing updates all the time. So if they're not careful about what they change, you as an enterprise could be in violation of regulations without even knowing it.
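As a concrete illustration of the shared-responsibility point above, here is a minimal sketch, assuming the AWS boto3 SDK and a bucket you own, that checks whether a bucket blocks public access and which region it actually lives in. It is a spot check, not a compliance tool, and the bucket name is a placeholder.

```python
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")
bucket = "example-company-data"  # placeholder bucket name

# Where does the data actually live? us-east-1 is reported as None.
region = s3.get_bucket_location(Bucket=bucket).get("LocationConstraint") or "us-east-1"
print(f"{bucket} is stored in region: {region}")

# Is public access blocked? Protecting the data in the bucket is the
# customer's job, not the cloud provider's.
try:
    cfg = s3.get_public_access_block(Bucket=bucket)["PublicAccessBlockConfiguration"]
    if not all(cfg.values()):
        print("WARNING: public access is not fully blocked:", cfg)
except ClientError as err:
    if err.response["Error"]["Code"] == "NoSuchPublicAccessBlockConfiguration":
        print("WARNING: no public access block is configured at all")
    else:
        raise
```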
Debbie Reynolds 09:50
And you're amazing. For anyone who doesn't know Stephen, you definitely need to follow him and his writing. One thing I love about your writing: I have an eagle eye for tech journalists, because I look very closely to see whether they are explaining the technology correctly or getting it wrong, and you always get it right, and you're very detailed in your reporting. So I really appreciate that. What bothers you in the tech reporting you see in the media? I can tell you what bothers me. One thing I tell people about the cloud is that I think they did a great job of advertising and marketing it, right? Cloud sounds so innocent and fluffy and nice, just this thing in the cloud. It's not that it's bad, but I feel like people brought their bad habits into the cloud and thought it did magical things that it didn't do. Like, I told people, okay, I'm putting this thing into the cloud, and they're thinking, okay, well, this company is going to back this up for me, so if I make a mistake or delete it, they'll restore it for me. Like, no, unless you pay for that, they're not going to do that. But people just thought that was the way things were.
Stephen Lawton 16:27
Again, just because the data doesn't reside on-prem doesn't mean you're not responsible for it. And some companies put too much trust in their cloud provider to protect them. You have to be very careful; you have to read your service level agreements on exactly what the provider will do for security and exactly what you need to do. But always assume that you need to do everything. That's right, because it's a bad habit to get into, to just assume that your provider will protect you, because that's not their business model.
Debbie Reynolds 17:14
No, right; I think people misunderstand that. Tell me, you touched on it a bit: let's talk about what's happening with AI. First of all, AI has been around for a while. I think the change with things like ChatGPT is that AI use has broken into the consumer market, where it was probably pretty mature in the corporate market; big companies have people who build AI tools for them all the time, and they're used to this. A lot of the tools I'm seeing have probably been used and tested in these bigger companies, and now they're doing a more business-to-consumer type of offering. I think it's an important moment in technology, where tools we have been accustomed to using in corporate settings are breaking into the consumer market, and people are going crazy about it. But let me know your thoughts.
Stephen Lawton 18:25
Well, first of all, AI is sort of a nebulous term. When we talk about AI, most of the time we're talking about machine learning, which is just one component of AI; there are others, but the main one we talk about is machine learning. And as I mentioned, how you train your machine is what's really important here. So with these new applications, as a matter of fact, I just got my official invite to join Google Bard a couple of days ago, so I can test that along with ChatGPT. But it is a nascent technology; we don't yet know how it will be used in the consumer market. I saw on Twitter, maybe two weeks ago, somebody did a video saying, I'm using ChatGPT to do all of my tests for my university. It was somebody in Europe, and I'm thinking, right, you're cheating. This is not what this application is for. If you want to actually succeed in what you're going to do eventually, you need to understand that using an application to write your papers, which may or may not be accurate, by the way, is dishonest, and at some point you're going to pay the price. But just like any other tool, you get better at it as you use it, and the tool itself will improve over time. Right now, consumer machine-learning tools are very, very basic. I did a story on AI, I guess it was six months or so ago, and I was asking some of the corporations that promoted having AI tools: how much of that tool is really AI, really machine learning, and how much of it is marketing fluff? To be honest, some of these companies were saying, well, we do have some very basic machine learning, but full-blown AI? No, we aspire to that, we hope to have that someday, but right now we're at the very basic parts of it. And when you're talking about network security, when you're talking about privacy, you need more than just the very basics. You really need a very mature set of security tools, you need a very mature plan, and your analysts need to know what they're doing. Yes, you can use AI for some parts of it; as I said, MDR is a good example of where AI is indeed making an entrée into the cybersecurity market. But we're probably a good five to ten years away from having AI-based tools that are fully dependable, and I'll probably make a lot of AI software companies very mad at me for saying that. But the truth is that we're just not there yet. We're getting there, we're making progress; we can expect to see various XDR tools improve over the years, and we can expect to see better tools in monitoring and third-party risk management, where they will be able not only to look at whatever is coming in but ultimately to backtrack to the secondary and tertiary parties producing it, to make sure that vulnerabilities are not coming forward. That'll be great, but we're not there yet.
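For readers who want a concrete sense of the machine-learning piece Stephen describes, here is a minimal sketch, assuming scikit-learn and made-up feature vectors, of an unsupervised anomaly detector that scores events and queues the suspicious ones for a human analyst rather than acting on them automatically. It is illustrative only and not an MDR product.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Made-up training data: each row is a network event described by simple
# features, e.g. bytes transferred, connection count, hour of day.
normal_traffic = rng.normal(loc=[5_000, 20, 14], scale=[1_000, 5, 3], size=(500, 3))

# Fit an unsupervised anomaly detector on "normal" history only.
model = IsolationForest(contamination=0.01, random_state=0).fit(normal_traffic)

# New events: one ordinary, one large overnight transfer.
new_events = np.array([
    [5_200, 22, 15],
    [90_000, 3, 3],
])

for event, label in zip(new_events, model.predict(new_events)):
    if label == -1:
        # Flag for human review instead of blocking automatically:
        # the model cannot tell an attack from an approved test or migration.
        print("suspicious, send to analyst:", event)
    else:
        print("looks normal:", event)
```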
Debbie Reynolds 23:10
That's true. I tell people that we're in this "it sucks to be you" part of human development, where we're guinea pigs for what's happening in AI. I think we just have to know what we're dealing with and know what we can do.
Stephen Lawton 23:31
Take a look at the cyber insurance market for a moment. You've got some carriers out there who are using AI to analyze the submissions coming in from potential customers to determine whether a customer is going to be a good risk or not. That's an early use of it, and it'll get a lot better. Personally, I think of cyber insurance as basically the ultimate in compliance management, because if you lie to your cyber insurance broker or carrier, it will come back and bite you. There was a case last year where a company claimed that it had 2FA, and Travelers gave them a binder. That was in April of last year. In May, the company got hit with a cyber attack that went through its 2FA. Yes, if there had been AI analysis, it would have looked at the application and said, yep, it's there. But the 2FA was not sufficient to stop this particular attack. It may have been a couple of years earlier, when it was installed, but in 2022, when the attack occurred, it wasn't. And instead of simply denying the claim, Travelers actually sued the client for lying on its application, for attesting that it had 2FA sufficient to meet the current requirements.
Debbie Reynolds 25:23
Wow, that's a new one. I hadn't heard that one.
Stephen Lawton 25:27
That's one of the reasons self-attestation in cyber insurance is falling out of favor. It's still accepted, but it's no longer accepted the way it used to be. It's much stronger to have a third party of some sort come in, analyze your network, and say, yes, you really do have these controls; these checkboxes we've tested, and we've established that all of them are up to snuff. That will make a much stronger application when you go to your insurance provider, whether it's a broker or a carrier. Whether AI would have found it, I don't know; I can't speak for the applications the carriers use, but at least a couple now use it exclusively to analyze applications.
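To make the attestation-versus-verification point concrete, here is a minimal sketch, in Python with entirely hypothetical field names, that compares what an applicant attested to on a cyber insurance questionnaire against what an independent assessment actually found, and reports the mismatches. Real underwriting platforms are far more involved; this only illustrates the idea.

```python
# What the applicant claimed on the questionnaire (hypothetical fields).
attested = {
    "mfa_on_remote_access": True,
    "offline_backups": True,
    "edr_deployed": True,
    "patching_sla_days": 14,
}

# What a third-party assessment actually observed (hypothetical results).
verified = {
    "mfa_on_remote_access": False,   # MFA present, but not on the VPN
    "offline_backups": True,
    "edr_deployed": True,
    "patching_sla_days": 45,
}

def find_misrepresentations(attested: dict, verified: dict) -> list[str]:
    """Return the controls where the attestation does not match reality."""
    issues = []
    for control, claimed in attested.items():
        observed = verified.get(control)
        if isinstance(claimed, bool):
            if claimed and not observed:
                issues.append(f"{control}: attested yes, verified no")
        elif observed is not None and observed > claimed:
            issues.append(f"{control}: attested {claimed}, verified {observed}")
    return issues

for issue in find_misrepresentations(attested, verified):
    print("MISMATCH:", issue)
```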
Debbie Reynolds 26:32
When cyber insurance first came on the scene, a lot of companies thought they were going to be printing money because they thought the risk was going to be super low, and it just didn't turn out that way. The risk is way high; we have so many breaches. And we've seen over the last few years, as you were saying, cyber insurance companies really tightening up; they're asking a lot more questions and a lot more evidence that you're doing things. And when you talk about having third parties come in and test your stuff, that's a really tough issue, especially for small, fledgling new businesses, because some of these third-party attestations can cost thousands and thousands of dollars.
Stephen Lawton 27:24
Potentially, it could be a lot more than just thousands of dollars. But you're right, and it does become a challenge. At that point, you have to try to find somebody who will work on a sliding scale. Obviously, you would not expect an SMB to pay the same as a Fortune 500 company. But let's take a look at this for a moment. The risks are certainly different between what a Fortune 500 company would have at risk, because of its size and because of not only the number of networks it runs but the nodes and all that, and what an SMB would have. But look at Wall Street, for example: hedge funds or private equity funds are moving tremendous amounts of money through the economy through trading, yet many of those companies have fewer than 25 people. So they're definitely SMBs, but they're also huge targets for attackers because of how much money is going through. You would expect the financial services industry, even an SMB there, to invest more in its security than, say, a small manufacturing firm or a small services firm. Now, I know this isn't entirely fair, because Target has done a lot of work in the past 10 years to improve its security, but the famous 2013 Target breach went through an SMB, an HVAC vendor, and it hit the Target network; for whatever reason, Target didn't catch it in time. The problem was that the SMB was acting as a third party to Target. Target actually had several different approaches it could have used to identify vulnerabilities coming onto its network, not only the basic analysis of data crossing its lines. You would not want to keep, for example, data that is managed by an IT network on the same physical network as data from an OT network, an Operational Technology network. That was one of the problems: OT networks use much less secure infrastructure devices, routers that do not have to meet the same security levels as an IT network. So when you're managing the data flow across your network, you need to make sure that, number one, you have good network hygiene so you can look at the data and identify if there's a problem. You need to be able to look at the data that comes out of your SIEM, or whatever other data analysis you're doing, and actually be able to identify when there's a problem. Make sure that if you're running AI to analyze the data, you have a human analyst go back over the final results, to do that fine-tuning and determine, yes, this is a problem, or no, it's not a problem, and to judge where to put your resources. And also make sure you don't do silly things like run your IT and OT over the same physical network if you're not using the most secure devices on that network to protect it. As for whether an SMB should be looking at things like third-party risk, or whether an SMB should be looking at vulnerabilities, of course it should. But you wouldn't expect it to pay the same price. It's still potentially going to be a victim of ransomware, because the bad guys will go after whatever data they can get about the next fish up the line; you have the SMB feeding into another network, so obviously they're looking for credentials. You want a small company to do the best job it can to protect its vital data, but no, you're not going to be able to charge it the same rates. It wouldn't make sense.
And what they would end up doing is saying, no, we don't want to do this. The kinds of cyber insurance that they buy may well be different from the cyber insurance that a larger company, or the largest companies, buy. So you have to be able to scale the security environment, if you will, to the size of the company.
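As a small illustration of the IT/OT segregation point above, here is a sketch, in Python with made-up subnet ranges and flow records, that flags traffic crossing from an OT segment into an IT segment so an analyst can review it. Real environments would pull this from firewall or SIEM data; the addresses here are placeholders.

```python
import ipaddress

# Placeholder network zones; real values would come from your IPAM or firewall config.
OT_NET = ipaddress.ip_network("10.20.0.0/16")   # operational technology segment
IT_NET = ipaddress.ip_network("10.10.0.0/16")   # corporate IT segment

# Simplified flow records: (source IP, destination IP, destination port).
flows = [
    ("10.10.4.21", "10.10.9.7", 445),    # IT to IT
    ("10.20.3.14", "10.10.5.80", 3389),  # OT host opening RDP into IT
]

def crosses_ot_to_it(src: str, dst: str) -> bool:
    """True if a flow originates in the OT segment and lands in the IT segment."""
    return (ipaddress.ip_address(src) in OT_NET
            and ipaddress.ip_address(dst) in IT_NET)

for src, dst, port in flows:
    if crosses_ot_to_it(src, dst):
        # Cross-zone traffic is not automatically malicious, but it should be
        # rare, documented, and reviewed by a human analyst.
        print(f"REVIEW: OT host {src} reached IT host {dst} on port {port}")
```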
Debbie Reynolds 33:29
Third-party risk is a challenge because third parties are now being asked to do things they weren't asked to do in the past, and some companies don't really understand the scale of a small business. They're asking for things that only these multi-million and billion-dollar companies can do, and that's not realistic.
Stephen Lawton 33:56
Well, you can go to the board of a multinational Fortune 500 and the board members don't understand what third-party risk is. You talk to the general counsel or the chief GRC officer, and they can barely deal with their primary business partners. You start talking about secondary and tertiary parties, and on down the line, and you may as well be talking a different language; they don't get it. GCs have a lot on their minds, and the general counsel, or the outside counsel that specializes in cyber insurance, will be the one negotiating those agreements with the carriers, but they may not necessarily understand all the components. Marsh McLennan, the largest broker in the world, has a list of 12 cyber controls that you need to have in place; there are five basic ones that you need to actually have before they'll even sit down and talk to you. Third-party risk, supply chain risk, is number 12 on their list of 12. But that would be, for example, a good starting point for a company that wants to prepare itself for cyber insurance. Other carriers and brokers have their own lists; I use Marsh only because I've used that graphic in multiple stories, and I worked with them on a couple of webcasts in the past. They really know what they're doing. But there are a lot of outstanding brokers and insurance companies that understand what you need to have in place before you get a policy. I strongly suggest that anybody who's trying to get cyber insurance, if they've not been able to qualify, ask your broker, ask your carrier: what are the controls that I absolutely need to have in place first? Help me identify where I'm falling down on the job. Because if you just get back a rejection that says, sorry, your cyber controls aren't cutting it, without actually outlining which cyber controls are missing, nobody is going to be able to figure out where they need to spend their money and their efforts.
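Here is a minimal sketch of the kind of gap report Stephen is asking carriers and brokers to give applicants: compare the controls you have in place against a broker's required list and print what's missing. The control names below are generic placeholders, not Marsh McLennan's actual list.

```python
# Generic, illustrative control list; a real broker's list will differ.
REQUIRED_CONTROLS = [
    "multi-factor authentication",
    "endpoint detection and response",
    "secured and tested backups",
    "privileged access management",
    "email and web filtering",
    "patch and vulnerability management",
    "incident response plan, tested",
    "security awareness training",
    "logging and monitoring",
    "hardening of remote access",
    "end-of-life system replacement",
    "vendor / supply chain risk management",
]

# Hypothetical self-assessment of what the applicant currently has.
in_place = {
    "multi-factor authentication",
    "email and web filtering",
    "secured and tested backups",
    "security awareness training",
}

missing = [c for c in REQUIRED_CONTROLS if c not in in_place]

print(f"{len(in_place)} of {len(REQUIRED_CONTROLS)} controls in place.")
print("Gaps to close before applying:")
for control in missing:
    print(" -", control)
```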
Debbie Reynolds 36:53
Yeah, that's fair, right? You don't want companies spinning their wheels doing things that aren't really going to help them solve their problems. They need to be productive.
Stephen Lawton 37:04
So ask the questions. The first time you approach a new insurance broker or carrier, or you go in for a renewal, ask them: what is it that you're looking for, so I can make sure that when I submit my application it's more than just checkboxes? And it is more than checkboxes; ICS will tell you it's more than just checkboxes after the fiasco with Travelers. You need to make sure you have two-factor authentication, you need to have the training, and you need to have a plan for how you retire old machines. One of the biggest problems we have with companies that are exclusively in the cloud is that they'll spin up virtual machines to do whatever, and they never take them back down. They don't even know what resources they have, aside from the fact that all those VMs out there are costing them money. Every one of those VMs they spin up and don't take down is also costing them a vulnerability. So you need to make an effort to find out where your vulnerabilities are and then start working backward from those before you present to an insurance company. And again, it's good to have a third party verify what you're doing.
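As a concrete illustration of the VM-sprawl problem, here is a sketch, assuming the AWS boto3 SDK, that lists EC2 instances that have been running longer than a chosen number of days so someone can decide whether they still need to exist. The 90-day threshold is arbitrary, and equivalent checks exist for other clouds.

```python
from datetime import datetime, timedelta, timezone
import boto3

MAX_AGE_DAYS = 90  # arbitrary review threshold
cutoff = datetime.now(timezone.utc) - timedelta(days=MAX_AGE_DAYS)

ec2 = boto3.client("ec2")

# Walk every running instance and flag the ones older than the cutoff.
paginator = ec2.get_paginator("describe_instances")
for page in paginator.paginate(
    Filters=[{"Name": "instance-state-name", "Values": ["running"]}]
):
    for reservation in page["Reservations"]:
        for inst in reservation["Instances"]:
            if inst["LaunchTime"] < cutoff:
                name = next((t["Value"] for t in inst.get("Tags", [])
                             if t["Key"] == "Name"), "<untagged>")
                # Long-lived, possibly forgotten VMs cost money and widen the
                # attack surface; review whether each one is still needed.
                print(f"review: {inst['InstanceId']} ({name}) "
                      f"running since {inst['LaunchTime']:%Y-%m-%d}")
```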
Debbie Reynolds 38:50
I agree. What are your thoughts about privacy? You and I have had many chats about this. I feel like cyber and privacy have a symbiotic relationship, though they're obviously not the same. But tell me about privacy, your thought process, and its importance.
Stephen Lawton 39:11
This is kind of a banner year for privacy laws. There are five states with new privacy laws going into effect this year alone. California and Virginia are probably the two main states with new privacy laws; Connecticut has one, Utah has one, and I hate to say it, but I'm completely blanking on the fifth state. Privacy laws are certainly necessary; they're outstanding. The problem is that each one is a little different, and it's not just one privacy law per state. You'll have privacy laws that cover individuals, privacy laws that cover children, and privacy laws that cover medical data. Yes, we do have HIPAA and HITECH as national privacy laws. But with so many different privacy laws, it's very difficult for companies to know what they need to do to protect the privacy of the data they're collecting. Then you throw into the mix that if any of the people you're working with are EU citizens, all of a sudden you have to add full-blown GDPR. We have new privacy laws in India, and throughout APAC several countries have added privacy laws. So if you're international, you need to somehow normalize all of these laws. One approach, certainly, is to look at all the laws that affect you, take the most stringent parts of each law, and build a privacy policy around that, figuring that if I'm covering the most difficult parts of every law I'm dealing with, I should be okay. But that still takes a lot of time and a lot of effort. On the other hand, if we can't get a national privacy law, it's better to have something. Yes, California is probably the closest to GDPR of all of them, but still, the laws in the US are different in focus from the laws in Europe. The EU regulations are focused on protecting the privacy of the individual. The US laws are more process-oriented; they still focus on protecting the individual's data, but they don't look at the data the same way the EU laws do. The EU laws are totally focused on protecting the individual, and basically the GDPR comes out of what Europeans suffered during World War Two, with people turning in family members and informing on other people. So those laws are more focused on protecting the individual's privacy. In the US, the laws are more focused on the corporations, on keeping them from violating consumer privacy, but the focus is less on the person and more on the organization. It's just a different way of looking at the same thing.
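To illustrate the "take the most stringent parts of each law" approach Stephen outlines, here is a small sketch in Python with invented jurisdictions and requirement values. It merges per-law requirements by always keeping the strictest one (shortest deadline, union of rights). Actual legal analysis is far more nuanced; this only shows the mechanic.

```python
# Invented, simplified requirements per jurisdiction (not real legal values).
laws = {
    "jurisdiction_a": {"breach_notice_hours": 72, "min_age_consent": 16,
                       "rights": {"access", "erasure", "portability"}},
    "jurisdiction_b": {"breach_notice_hours": 45 * 24, "min_age_consent": 13,
                       "rights": {"access", "opt_out_of_sale"}},
    "jurisdiction_c": {"breach_notice_hours": 30 * 24, "min_age_consent": 13,
                       "rights": {"access", "erasure"}},
}

def strictest_policy(laws: dict) -> dict:
    """Build one internal baseline by keeping the toughest requirement of each kind."""
    return {
        # Shortest notification window wins.
        "breach_notice_hours": min(l["breach_notice_hours"] for l in laws.values()),
        # Highest age threshold for consent wins.
        "min_age_consent": max(l["min_age_consent"] for l in laws.values()),
        # Grant every right any law requires.
        "rights": set().union(*(l["rights"] for l in laws.values())),
    }

baseline = strictest_policy(laws)
print("Internal baseline policy:", baseline)
```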
Debbie Reynolds 43:30
Right, I say you have a consumer focus as opposed to a human focus, and not every human is a consumer. So there are some gaps in the way the laws are articulated in the US.
Stephen Lawton 43:46
Yeah, I think I mentioned already that insurance companies are really a compliance tool that companies can use. If you're involved, for example, in a merger or acquisition, you definitely want to find out what the target company you're acquiring told its insurance company about how it protects privacy, whether that's privacy of the network, privacy of the data, or privacy of the individual, and how it goes about doing that, because you want to make sure that everything is in sync. Certainly one of the worst things that happens in an acquisition is that companies don't ensure that their technology matches the technology of the company they're acquiring; that's usually the last thing they talk about before they sign the contract. It's like, hey, do you guys have security controls? Yeah, we've got security controls. Well, let's sign. They spend a lot of time looking at other processes in the business and how the business parts will mesh, but they don't look at how the physical security, the data security, and the application security will mesh. And you end up with situations where one company might have a large Apple base for its creative teams and the other side might be Windows-based, and when malware comes in, one application will say, oh, that's not my responsibility, because that doesn't match the machines I'm working with, I'll leave it for some other software, and the other software says, well, it's not my responsibility. You'll literally have malware enter networks because the software isn't talking to each other. After a merger, it can become quite ugly, and that's one of the reasons you do see attackers go after companies that are involved in an M&A: they become more vulnerable than they were before.
Debbie Reynolds 46:33
Right. I think it was Marriott that had a data breach because of a merger; they acquired another hotel group, and that group had a breach, so they inherited the breach, and there was a huge fallout because of that. So that's a good example.
Stephen Lawton 46:55
And that becomes, again, a real challenge of making sure that cybersecurity becomes a board-level responsibility as opposed to an IT responsibility. Because unless the board looks at cybersecurity as something just as important as operations, we're going to have issues. We've had some laws passed where boards are now becoming more responsible for understanding and implementing cyber. But until some boards start getting very large fines, or perhaps somebody actually goes to jail for not spending the time and effort to protect the company, the privacy issues, the people who are ultimately being hurt by a data breach, just become the cost of doing business, and they write it off, even though it happened because they didn't pay attention. The CIO is responsible for making sure that the trains run on time; their focus quite often is on the operations side, making sure that sales aren't hampered. So if the CISO reports to the CIO, there's a conflict of interest, because the CIO is trying to make sure that operations run, as opposed to security. A CISO who reports through risk, or directly to the board, is better. But again, until the board actually internalizes that cybersecurity is just as important as sales and operations, I think we're going to continue to see short shrift given to cyber issues.
Debbie Reynolds 49:32
Yeah, I agree. So if it were the world according to you, Stephen, and we did everything you said, what would be your wish for privacy, cybersecurity, cyber insurance, anything?
Stephen Lawton 49:50
Lordy, you really don't want that. I think what you would see is the CISO reporting directly up to the board, and the board being responsible for protecting the company; they already have a fiduciary responsibility to the company, but I think that fiduciary responsibility has to include cybersecurity. I think it would help to see less self-attestation and more verification. There's going to be a challenge there, because some companies can't necessarily afford to hire third parties, as you mentioned earlier. And quite often, a cybersecurity analysis is just a point in time. That's one of the challenges and problems we have with the PCI standard: you'll have an analysis done, and it's just a point in time; the next time somebody upgrades software or changes the environment, it no longer meets the PCI tests, because those changes may have significantly altered the network. We need to see more investment, and we are seeing more investment, in things like AI, in third-party risk, and in the analysis of companies before they get cyber insurance. But the cyber insurance industry is in chaos now. As you mentioned, you used to buy a cyber policy the way you bought a homeowner's policy or an auto policy; it was very simple. Then came the pandemic and the significantly large increase in ransomware attacks, and all of a sudden companies that were selling cyber insurance were saying, wait a second, we didn't get into this business with these expectations. So you'll find companies pulling out of the cyber insurance market; there's not as much inventory available, what inventory is available is more expensive, and it's harder to get and harder to renew, because of a lot of fear, uncertainty, and doubt among the carriers and the brokers, but particularly the carriers, who are the underwriters making that decision. We need some normalcy. We also need normalcy in the privacy laws, but in the US, for example, as long as there's so much infighting within Congress, fighting about the silly stuff, nobody is looking at some of the issues that really matter or trying to bring normalcy to the lives of the people and to business operations. They're spending too much time playing their own games of politics instead of actually trying to do the job they're there for.
Debbie Reynolds 54:18
Yeah. Can we do big things anymore? Could we build the Hoover Dam today? Could we build the Federal highway system today? That's one thing I wonder about. So that concerns me with something as big as a Federal privacy law; I wonder, can we even do this?
Stephen Lawton 54:43
Well, right now, I don't think we could. And again, it's not that we don't understand the need for it. But the Federal Highway Administration, you're talking about things that were created in the 1950s, when the parties were the loyal opposition, but they were still loyal. Now there's so much chaos in Washington, and such a lack of trust, that I don't know that we can do those kinds of big privacy bills, those important bills. Perhaps we'll be back at that point again, but I think for the next 10 years or so we're going to be trying to get over some of the silliness going on in Washington. We might be able to do some basic privacy laws, perhaps by taking what we already have in some of the states and combining them. But look at our friends up north, our Canadian friends: even they had a privacy bill, I think it was almost two years ago, that almost made it through, a national privacy bill; I want to say Bill 25, but that might be wrong. They came very close up there, and then they had a national election, everything fell apart, and the bill never actually made it through the legislature in Ottawa, and they still haven't been able to bring it back since. There are some new bills that they're trying to get through, but even there, some of the provincial privacy laws will take precedence over the national privacy bills. And they have a lot fewer provinces than we have states.
Debbie Reynolds 57:35
That's right. Yeah, that's true. Oh, my goodness. Well, thank you so much for being on the show, Stephen; this was illuminating. It's always a pleasure to collaborate with you; you're a tough cookie, and you bring a lot of knowledge and depth to all these things. So I love seeing your perspective on what you're thinking about and what you're writing about, because you really get everything about technology, and you ask all the right questions.
Stephen Lawton 58:07
Well, thank you so much, Debbie; it's always a pleasure to sit and talk with you. You're so knowledgeable about privacy; I feel like a little first-year student when I'm talking to you, because you know so much about this, and it can be intimidating at times. Oh my goodness, but I just love discussing this topic. And I invite folks to reach out to me; you can find me on LinkedIn. My email is very simple; it's my initials, sl@afab.com. I always like to talk to folks, and I do a lot of vendor-neutral stories especially, so I like to talk to folks who are not necessarily pushing a product, but an idea. How can we present what's going on in the world of privacy with new ideas? Maybe people have an idea of how things should be, or there's a better way, a different way of looking at it. I'm always open to hearing about that.
Debbie Reynolds 59:35
Yeah, that's amazing. Definitely reach out; find Stephen on LinkedIn and look at his posts and his articles. They're always illuminating. But thank you so much, and we'll definitely talk soon; we always end up running into each other.
Stephen Lawton 59:51
Oh, thank you so much again for having me. It's always a pleasure.
Debbie Reynolds 59:56
All right. Talk to you soon. Bye bye.