E286 - Bradon Rogers, Chief Customer Officer, Island
Many thanks to our Podcast Sponsor Island - "You can learn more at island.io, where Island is rethinking how enterprises secure data, gain visibility, and manage privacy—without getting in the way of how people work."
[00:00] Debbie Reynolds: The personal views expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.
[00:14] Hello, my name is Debbie Reynolds. They call me the Data Diva. This is the Data Diva Talks privacy podcast where we discuss data privacy issues with industry leaders around the world with information that businesses need to know.
[00:26] Now I have a very special guest on the show,
[00:29] Bradon Rogers. He is the chief customer officer at Island.
[00:34] Welcome.
[00:35] Bradon Rogers: Thank you Debbie, for having me. I appreciate it.
[00:37] Debbie Reynolds: Well, I am excited to have you on the show. You and I have had meetings before.
[00:42] I actually looked at your product and I was blown away. Also, I was very impressed with you and your technology background and how you look into problems that enterprises have around wrangling data.
[00:56] But you know, give me an overview of what you do at Island and what problems you guys solve.
[01:02] Bradon Rogers: Yeah, so the role that I'm in is anything that we do that's engineering-centric in the field. That's my universe. That spans customers testing Island and seeing how it applies to their own use cases, their own problems, before they've ever become a customer.
[01:15] We help them through that. I work with a team of engineering folks who help people deploy and use Island at scale in large environments with, again, complex problems they're solving, and then support of the product as well.
[01:26] So very fortunate to be deeply involved in some really, really cool transformational outcomes.
[01:33] Customers have been chasing very difficult challenges, and data protection is right at the heart of that for many of the environments I'm working with.
[01:39] Debbie Reynolds: I think that's true. Let's describe the enterprise as it is now. What is the hellscape that people are dealing with in enterprises as it relates to data?
[01:49] Bradon Rogers: It's a great way to describe it, the Hellscape. Yeah,
[01:52] we've got a user that has to access applications and resources, and for the past 20 years we've taken a series of technologies and bolted things on around their experience.
[02:04] In some environments we take traffic and steer it back through on-prem resources. We steer it through clouds to inspect content,
[02:11] looking for stuff that's sensitive, the data in this discussion here, to make sure we're protecting the right stuff. Sometimes we put an agent on the host and, again, try to assert our will for data protection there.
[02:21] But the reality is we're just bolting on experiences,
[02:26] or better yet, bolting on technologies around the user experience.
[02:29] And oftentimes those things start having a negative impact on the user experience.
[02:33] So you can imagine if we start leveraging things like virtual desktop infrastructure,
[02:38] we start giving people a virtual desktop, and we're streaming pixels to the desktop. Well, streaming pixels is not a great experience for the end user. Imagine you're a developer typing keys, trying to develop code, and every keystroke is about half a second behind everything you put in.
[02:53] That's frustrating.
[02:54] Trying to backhaul traffic, breaking and inspecting SSL, looking inside of stuff. It's complex for practitioners to assert their will. It's complex for them to stand up the stack of technologies.
[03:04] And then it also has a big, massive impact on the end user on the back end, whose job becomes harder to do at the end of the day. So all those things become real challenges in how orgs in the status quo have historically approached making sure the user is safe as they access apps.
[03:20] And even worse,
[03:21] you start thinking about the tactical areas of data protection.
[03:24] Debbie, you've dealt with this a long time, since this is a topic at hand for what you deal with every day.
[03:30] The way we've had to attack and approach protecting data is just incredibly complex, and we're constantly chasing data.
[03:38] And that sounds kind of like a weird statement, but the data is living and breathing. It evolves all day long inside of an organization.
[03:44] The semantics that people use, the nomenclature, the way they word things in paragraphs, the notes physicians enter in a field, or the data someone enters in a deal room for a merger and acquisition, all of it looks different than what we originally trained on.
[03:58] So we're constantly tweaking and tuning to chase the data. We've got to think of an alternate approach where we're not chasing the data and can start dealing with things like unstructured data very effectively.
[04:07] At the same time, give the organization the protection of the data that it needs and the privacy that it requires. And then the end users the same thing: keeping the user safe and giving them privacy when they require it too.
[04:16] Debbie Reynolds: I want your thoughts. My thought is that there are two big problems in organizations. There are a lot more than two, but I'll just mention two, right?
[04:28] One is that data is in all these little different fiefdoms, for lack of a better term, right? So they're in all these different places and then you have people in all these different silos.
[04:42] But data is like water, right? So it runs through organizations and a lot of times it ends up in places where it's not supposed to be or the people who need it can't get to it.
[04:53] But talk to me a little bit about that tension.
[04:55] Bradon Rogers: Yeah, that's a really good parallel when you describe data being like water because you know, water finds its way through the cracks. If you don't seal something up perfectly, water gets through.
[05:04] And you're right as well, it flows through the organization.
[05:07] And one of the challenges we also have too is that data differs in different parts of the organization by domain of expertise.
[05:14] And a lot of times our approach to data protection has been mechanics that are kind of a singular approach to everything all at once.
[05:22] And for the people, unfortunately, that are tasked with building policies,
[05:25] it's quite difficult because they may not be an expert in legal, and yet they've got to protect the legal data while simultaneously protecting the patient data over here, say if you're a healthcare provider.
[05:34] So it's really difficult to live in all these different places, because you often have to actually be a domain expert in the types of data you deal with so that you can build an effective policy and govern.
[05:44] And by the way, you also have to instrument your way around the data, which, to your point a minute ago, behaves like water. So I've got to instrument in a way that seals up the places where the water could get through, and that process starts impacting the end user's ability to do their job at the end of the day.
[05:59] As we start sealing things up, the user starts feeling confined and blocked, and so the workspace starts feeling negative instead of something positive that empowers them to do their job.
[06:09] We believe there's a way where an org should be able to assert its will, assert the stack of technologies it needs to assert, with policies, et cetera, while at the same time giving the users a better experience.
[06:20] These don't have to be trade-offs.
[06:22] Debbie Reynolds: Absolutely.
[06:23] Well, there was one example that you had given me that you had shown me that was mind blowing.
[06:29] And this happens a lot in companies when I'm working with them. Let's say, for instance, you had a training with your company people and you said, hey,
[06:39] when you're entering notes into the notes field, don't put personal information or people's credit card information and different things in those fields. Those things are almost impossible to really wrangle the way that you want, because a lot of systems aren't made to do that.
[07:00] So a lot of times companies are either mostly ignoring them completely and just hoping that there isn't a risk there, or they're trying to sift through as best they can to try to remove that information.
[07:11] But give me an example of how someone would attack that problem.
[07:17] Bradon Rogers: Well, a perfect example and probably a more modern example would be in the usage of AI.
[07:23] There are things that I may not want going into certain AI providers that are part of the organization's footprint.
[07:29] At the end of the day, let's assume that the organization can safely shepherd you to an acceptable AI universe.
[07:37] Well,
[07:38] you may be fine with certain data. Again, I go to healthcare a lot because there are some obvious angles inside of healthcare orgs; they have to deal with a lot of different things.
[07:46] You may want certain data going into an AI universe that can help you with a predictive healthcare process, but you may not want the PII, the PHI, going in for the specific patient into that model.
[08:00] So at the end of the day,
[08:02] that's a perfect example of what you're talking about there. I may not want certain things to go into certain fields; I may not want certain data to go to certain places at the end of the day.
[08:10] And if you think about where you have to assert your will on those things, you start asserting it in the network,
[08:16] breaking open SSL and trying to see that stuff. That's almost impossible; it's just very difficult. It's a difficult equation. Or, back to what we talked about earlier, putting agents on the machines.
[08:26] There are places where you're not getting an agent on a machine. And again, I go back to healthcare as an example: the physician's device. You don't own the device,
[08:34] the physician's practice does. The healthcare provider's got them under contract to support the hospital, but they work for their own practice. That's their device. You're not going to get your agent on their machine, so you get stuck with these difficult impasses.
[08:45] And so we certainly believe that there's actually a different way to approach it. What if we approach the problem by governing at the presentation layer of the applications?
[08:52] That lets me govern anything that goes into a field or anything that's displayed back to a user. So saying, you know what, for this tenant of an AI universe, let's not let data go from this application over into that tenant.
[09:03] But these other apps, sure, no problem.
[09:05] Or in this field, let's govern it, let's do some validation and make sure no one's plugging PHI into that field. If they do plug PHI into that field, let's redact it so that when it's displayed on the screen, there's nothing being shown; it's redacted.
[09:17] Living at the presentation layer is one of the powerful things about what we've done at Island, by governing what we'll call the last mile, leveraging the enterprise browser, which we created.
[09:26] And the enterprise browser is nothing more than a browser specifically designed for the needs of the enterprise, to give the user a very familiar experience, because they already know how to use a browser.
[09:36] So we take advantage of their know-how with a consumer browser, but at the same time put mechanics in it for the org to be able to assert policy, to assert its will as people engage applications, to give the organization the ability to make things safe,
[09:48] to give the IT part of the organization simplified delivery of applications and the stack. But most importantly, give the user a very nice experience. We don't have to retrain the user, so everybody wins in that process.
[09:59] Debbie Reynolds: Yeah, that's true. And I think the simplicity of a browser, people know what browsers are, they know how to use that. But I think the very different thing that we're talking about is a tool or platform that can actually take action.
[10:14] Right.
[10:15] Do things, or especially stop people from doing bad things.
[10:19] And I want your thoughts. I think a lot of organizations, when they deal with like things like unauthorized access or data breach,
[10:27] especially when it comes to data that's handled internally,
[10:31] a lot of times those things are not malicious. Right. Maybe it's lack of training, maybe it's someone made a mistake. And so I think trying to solve those issues or eliminate them at the start really helps companies without them trying to look for these edge cases.
[10:50] What do you think?
[10:51] Bradon Rogers: That's exactly right.
[10:52] I think one of the challenges that we have, we historically in cybersecurity and in data protection, we have a big concept of say no.
[10:59] So we go into block mode, we block everything. And so when the user can't do something, boom, they run into a big wall.
[11:05] Well, you mentioned the water example. When water hits a wall,
[11:09] it stops, but then it starts trying to seep its way around the gaps, the holes, where it can.
[11:13] And that's a lot of what end users end up doing in the environment. Let's use the AI example I talked about earlier. Everybody loves a cheat code for AI in their universe.
[11:23] Everybody's playing with it. The average end user is drawn to it,
[11:26] if nothing else for helping them write their emails. But some users are using it for more advanced types of things as well. And there is a lot of potential in AI, but you want the user to use the sanctioned AI universe.
[11:37] At the same time,
[11:38] if you didn't take the say-no approach that we've always had in cyber, what if we could say yes? What I mean by that is: let them use personal stuff.
[11:46] They can use
[11:48] personal ChatGPT or personal Claude or something like that. Let them use personal stuff. But company data won't spill beyond the boundary into those personal areas.
[11:57] So we can create a perfect line of demarcation in there where the user works seamlessly in the workspace for the work. They work in the sanctioned AI with the sanctioned apps.
[12:05] But over here in the personal space, they still have access to personal. So as a result, to your point a minute ago,
[12:10] water finding its way around the cracks: even when it finds its way around the crack at the end of the day, there's no risk that it's going to cause damage to the internal parts, because the internal parts are what's sealed off
[12:21] from the user's personal workspace over here. So it is one of those approaches. My belief is that it's a philosophical approach that's wildly different than what we've seen in cybersecurity: this concept of being able to say yes versus the say-no police.
[12:34] And then all of a sudden, users become more comfortable. They're less apt to try to find their way around the system because they can access the personal things they need.
[12:41] And so everybody's happier at the end of the day, and data becomes infinitely more protected when users are not trying to circumvent the system and are working within it very effectively.
[12:51] Debbie Reynolds: Explain to me how the boundary works. This is fascinating. Right? So a lot of times when people think about boundaries, it's like, okay, I go into an app,
[13:00] there's an access control, and certain things I can access and certain things I can't.
[13:05] But the way that you all are describing boundaries is at an enterprise level. Can you talk about that?
[13:11] Bradon Rogers: Yeah. I think of a boundary as a bit of a virtual perimeter.
[13:15] The user doesn't see the boundary. It's there, though.
[13:19] And the boundary helps define where data, the organization's application remains. And what it really does is lets the organization maintain custodianship of their data.
[13:28] It lets the user have fluid movement across the corporate applications.
[13:32] But the corporate data doesn't spill beyond to something personal. Now, historically speaking, if you go back, you know, 20, 25 years of cybersecurity,
[13:40] we've created boundaries before.
[13:42] But the edge of the boundary is usually some big block page. You can't do this.
[13:47] So we've tried to create boundaries in the traditional sense of it before.
[13:51] I'm sure you probably remember this; at least I do, with my gray hair.
[13:54] The boundary was a perimeter. You have a firewall, and everything behind the firewall is your corporate stuff. Everything outside of that was not. And then obviously we went to the SaaS universe.
[14:04] A lot of other things happened that redefined the boundary. For us, we're building just a virtual boundary, where the user gets the freedom to work the way they would naturally work, as if everything was on the Internet.
[14:15] But when they try to take data beyond the boundary, even to places where they can still access things, I can keep data from spilling over there. So, you know, you want to use personal Gmail? Great, knock yourself out.
[14:24] You can use personal Gmail all day long.
[14:27] You can use the corporate G Suite tenant too. But the corporate G Suite tenant is in the boundary, personal Gmail is outside the boundary, and data won't commingle in those particular cases.
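The boundary Bradon describes can be pictured as a simple allow/deny decision over data movement between apps. The sketch below is a hypothetical illustration of that concept, not Island's actual policy engine; the app names and the `BOUNDARY` set are invented for the example.

```python
# Hypothetical sketch of the boundary concept: corporate apps sit inside a
# virtual boundary, personal apps stay reachable outside it, and the only
# thing blocked is moving corporate data across the line.
BOUNDARY = {"corp-gsuite", "corp-emr", "sanctioned-ai"}

def transfer_allowed(source_app: str, dest_app: str) -> bool:
    """Allow a copy/paste/upload unless it would spill boundary data outside."""
    if source_app not in BOUNDARY:
        # Personal data is unrestricted; only corporate data is custodied.
        return True
    # Corporate data may only land in another boundary app.
    return dest_app in BOUNDARY

print(transfer_allowed("corp-gsuite", "sanctioned-ai"))     # True: stays inside
print(transfer_allowed("corp-gsuite", "personal-gmail"))    # False: would spill out
print(transfer_allowed("personal-gmail", "personal-gmail")) # True: personal stays free
```

The point of the model is the asymmetry: personal apps are never blocked outright, so the user keeps their "say yes" experience, while corporate data simply cannot cross the line.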
[14:36] Debbie Reynolds: And also when you're talking about the presentation layer,
[14:40] where there are things that you can present someone that is actually like redacted or grayed out, where they actually can't see the information,
[14:48] explain that part. That's fascinating.
[14:50] Bradon Rogers: Yeah, it's a very unique thing about living at physical control of the presentation layer.
[14:57] So essentially what you get to control is what happens before data hits the glass of the screen.
[15:03] So you might say, and again I'll go back to medical because I keep going there, take the electronic medical records environment.
[15:12] That field is the field that I know the physicians always type their case notes in.
[15:16] And if you think about it, case notes for a physician are a perfect example. Case notes are a very unstructured form of data.
[15:23] Because the physician's not going to type their data to conform to your DLP policies,
[15:28] you're going to have to build a DLP policy to chase what they put in that field.
[15:31] Rather than that, we said: what if we could control elements in the application, and in that case just redact the field in that application?
[15:39] Because you govern in the presentation layer, rather than presenting the actual data in that field on the screen, you present something that's redacted, so the user can see that something's there, but it's redacted and they can't read or engage it.
[15:50] Living at the physical glass of the screen is what makes that possible, and it's why you don't see that with things like putting an agent on a host or putting stuff in the network.
[16:01] In network technology, people have tried redacting data and doing things with tokenization for many years. It's exceptionally complex, it's very painful, it's hard to manage, and sometimes, most of the time, it doesn't work.
[16:11] It's just very hard to instrument your way in that direction. We take a very novel approach to instrument our way very differently on that front.
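The redaction idea above, deciding what hits the glass rather than rewriting the stored record, can be sketched in a few lines. This is a hypothetical illustration, not Island's actual mechanics; the two regex patterns are simplified stand-ins for real PHI detection.

```python
import re

# Simplified stand-ins for PHI detection; real detection is far richer.
SSN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")
MRN = re.compile(r"\bMRN[:#]?\s*\d{6,}\b", re.IGNORECASE)

def render_field(value: str, redact_phi: bool = True) -> str:
    """Decide what hits the glass: the real value, or a redacted placeholder.

    The underlying record is untouched; only the presentation changes, so the
    user can see that something is there without being able to read it.
    """
    if redact_phi and (SSN.search(value) or MRN.search(value)):
        return "[REDACTED]"
    return value

print(render_field("Follow-up visit in 2 weeks"))             # shown as-is
print(render_field("Patient MRN: 8675309, SSN 123-45-6789"))  # [REDACTED]
```

Because the check runs at display time, the unstructured case-notes field never has to conform to a DLP policy; the policy chases the presentation, not the data.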
[16:19] Debbie Reynolds: Very good.
[16:20] I had written an article recently about AI agents run amok. And so when I heard people were taking these agents and installing them on their computers and giving them admin access and stuff, I could have fainted when I heard that.
[16:39] Bradon Rogers: Your cybersecurity mind was going crazy there, I'm sure.
[16:41] Debbie Reynolds: Yes, exactly. But tell me, how do you all work with this new frontier of agents run amok in the organization?
[16:50] Bradon Rogers: Yeah. And I think it's important to remember, out of the gate, that the majority of the AI universe is starting from a consumer position. With many of your providers, the common providers, there's a lot of focus on the consumer need, and that's okay.
[17:03] There's nothing wrong with that. I mean, there are billions of users around the world. If they can encourage them to use their own AI universe,
[17:10] they can capture that usage and monetize it for consumers. But that need is different than, say, a large financial services firm's or a large healthcare provider's, et cetera.
[17:21] Like my need to use AI is going to be applied differently.
[17:24] And so we've got to think about things more in terms of how the enterprise needs to ensure
[17:30] the safety of its operation and make sure things are copacetic with the enterprise requirements. Just because you wanted to adopt AI does not mean you get to walk away from your data protection requirements,
[17:42] does not mean, as a financial services provider, you get to walk away from your fiduciary duty and the regulatory requirements you're bound by.
[17:50] It does not mean you walked away from the need to have credential management and effective access to internal private applications. All those things you built for years,
[18:00] those core requirements, they still exist, and they're probably more important in the AI generation. So for us, there should be a really clear and easy way to take any consumer AI and put Island around it to make it enterprise-grade and enterprise-ready.
[18:15] And that also falls into what you just talked about a moment ago: agentic workflows. Let agents live, at the end of the day, in boundaries that are appropriate for agents.
[18:23] Just like the boundary we talked about a moment ago: let an agent do its work with the same policies, or I should say the same type of policies, the same type of data protection, but let it live in the boundary, let it do automated work for me.
[18:36] But at the same time, let's recognize that for an agent we may want to put some additional control over it, for things like preventing prompt injection, et cetera.
[18:43] So we built mechanics to ensure people can leverage AI safely in the environments they operate within, and that includes agents. If I hopped up out of this chair, all you'd see is an empty chair.
[18:51] But behind that empty chair might be an agent sitting there doing its work.
[18:55] It should be bound and governed in a very similar fashion to make sure it's doing the things it should do and not the things it shouldn't do. And again, keep the data safe at all times in the process.
[19:04] Debbie Reynolds: I want to talk to you about something that's been around forever in the digital world and that's been security by obscurity.
[19:14] And that does not work in the AI age. Right. So you have these agents and these new tools that can go out and seek things that maybe you thought no one cared about, that no one was ever going to find.
[19:27] But how do you work with companies, for example, to define what those boundaries are, and then the granularity around what people can and can't do?
[19:40] You know, even though you're talking about a boundary, it's a lot more granular than that.
[19:45] Bradon Rogers: Yeah, there's that age-old thing, security by obscurity. I mean, it's been around for ages.
[19:49] Obfuscation can only go so far, and we've seen that in the human world for ages. Honestly, it's been debunked over and over again: if you hide something, people will find their way to it.
[20:00] It's especially hard to hide anything in the AI universe, because AI can perform discovery,
[20:06] connect tissue that the human brain otherwise couldn't connect.
[20:09] And so discarding those types of old-school legacy thought processes is exceptionally important in the AI generation. I think one of the most important things that needs to happen in the AI generation is eliminating security through obscurity, but also eliminating bolt-on security at the end of the day.
[20:26] Start really truly thinking about and applying the principles that have been discussed for years around secure by design.
[20:33] And secure by design is literally weaving security into the natural workflow of the end user.
[20:39] Security is thought of at every turn of how the user engages an application, as part of the design of the workflows of the apps. That's really important in the universe of AI, because AI is a very workflow-centric universe.
[20:51] And again, you want stuff built into the experience so that AI doesn't run amok; a bolt-on experience is not appropriate. You start thinking about all the universes where AI really doesn't have a good story that can make the organization feel comfortable, like your virtualization environments, your VDI infrastructures.
[21:06] There's not a good AI conversation in that universe; it's not an empowerment conversation you're going to have with your users around AI there. You start thinking about your bolt-on providers, your SASE providers and others.
[21:17] They're often sitting on the sideline in that conversation.
[21:19] And at the end of the day, it's really about empowering the end users, helping them do their jobs effectively, and starting to think about how we can leverage AI effectively in the environment.
[21:29] But again, it can't be bolted on; it's got to be woven into the fabric, the design of the user's flow. At the end of the day, obscurity and bolt-on stuff are just yesterday's approaches.
[21:40] One's even scarier than the other, but at the end of the day, they're both very challenging in the AI generation.
[21:45] Debbie Reynolds: Yeah, I like that analogy. Woven in. I think that's exactly right.
[21:50] Give me some examples for the privacy folks of how this tool can solve some of their privacy issues.
[21:58] Bradon Rogers: Yeah, well, there's privacy on two sides of the fence. There's the organization's privacy: the concern they have to keep their data protected and the privacy maintained for themselves, the things they hold dear, and their users.
[22:10] And then there's the user's concern over personal privacy. To me, this is one of the best parts of the story. I enjoy this conversation with privacy advocates and privacy experts, because at the end of the day there are three things that are really important about our approach.
[22:23] Number one is communication to the end user, because it's paramount that you let the user know the state. In some situations, especially where you've got privacy mandates in play, you may want to communicate to the end users to let them know: you're about to be monitored, do you consent?
[22:36] Governing at the presentation layer lets you layer content over the workflow to communicate to users. You may want to watermark content to let the user know this is something sensitive.
[22:45] You may want to put an indicator on the screen to let them know they're potentially being monitored. So communication matters a lot, especially when you get into the world of people that have the proverbial right to know.
[22:55] The second part of the conversation is that the level of audit logging should be contextually driven by the situation.
[23:03] Situationally speaking,
[23:04] a perfect example: I've got a user in a part of Europe where we've got a mandate with works councils and privacy laws that we've got to deal with. Well, maybe we want to anonymize the data for those users in certain application engagements. But for a user in the US, maybe we go differently,
[23:20] we approach it differently; maybe we go deeper, maybe we capture screen recordings, because our contractual requirements with our third parties require us to capture that. But again, laws in Europe may differ from what's happening in the US. So adherence to the law and, again, a policy-driven level of audit logging is important.
[23:34] And then lastly, it's the custodianship of the data itself. The application lives locally, the user engages the application, and it should be a relationship between the user and the app, not things sitting in the middle intercepting the data, capturing packets, and capturing information about what the user's engaging with.
[23:51] So that's really important because for us, policy lives locally on the machine, lives locally in the experience.
[23:56] We're not passing traffic back to the cloud and breaking open SSL, so there isn't a US data center somewhere in the cloud that's captured all your data, where we now have to attest to how we've treated that user's data.
[24:05] No, it's a relationship between the user and the app, where the policy lives locally, and the only thing you've got to deal with at that point is the audit data,
[24:12] making decisions about where you store that. Do you store that in the geolocation for that user by the law that's required? Or do you store that in the organization's own storage?
[24:20] Or do you not store anything, because you decided to anonymize it, because that's an audience of users you don't want an audit event for, since you have to adhere to some law?
[24:28] So it really comes down to three focal points: communication to the user, the level of logging, and where to store the data associated with the logging.
[24:36] And that really scratches the itch of privacy advocates everywhere when I deal with them.
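The context-driven logging Bradon outlines (what to log, whether to anonymize, where to store it, keyed to the user's jurisdiction) could be modeled as a small policy table. This is a hypothetical sketch only; the region names and policy values are invented for illustration and are not Island's actual configuration.

```python
# Hypothetical context-driven audit policy. The three knobs discussed are the
# logging level, anonymization, and storage location, driven by the user's
# regulatory context (e.g. EU works-council mandates vs. US contracts).
AUDIT_POLICY = {
    "eu": {"level": "events", "anonymize": True, "store_in": "eu-region"},
    "us": {"level": "screen-recording", "anonymize": False, "store_in": "org-storage"},
}

# Most conservative default for any region without an explicit policy.
DEFAULT = {"level": "none", "anonymize": True, "store_in": None}

def audit_plan(region: str) -> dict:
    """Pick the audit treatment for a user's region; unknown regions log nothing."""
    return AUDIT_POLICY.get(region, DEFAULT)

print(audit_plan("eu")["anonymize"])  # True: anonymize under EU mandates
print(audit_plan("us")["level"])      # screen-recording: contractual requirement
```

The design point is that the decision is policy data rather than code, so legal or privacy teams can tune the treatment per jurisdiction without re-engineering anything.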
[24:41] Debbie Reynolds: I love to hear that.
[24:43] I think a lot of us have been trying to do like rubber bands and paperclip type things in the background to try to get things to go together. So to me, this is like an excellent way to leverage technology to really solve some really, really hard problems.
[25:02] What are some things happening in your world that you deal with and privacy that are getting your attention?
[25:09] Bradon Rogers: Yeah, there are certainly concerns right now in the AI universe we hit on a little while ago:
[25:15] all those aforementioned concerns for organizations, both for their data and for their users. And then the users obviously have concerns around what the AI providers and others are doing with their data.
[25:25] Some users are more concerned with it than others. And there's just like the natural trade off.
[25:28] You've seen this for ages, I'm sure you've dealt with this before.
[25:31] When the user gets value from something, they're willing to trade off some of their privacy sometimes.
[25:35] And it's the same with Google Maps.
[25:38] You know, there's data that you're feeding a provider somewhere, but you get value out of the maps and they get something back in terms of the data.
[25:45] So everybody's willing to have an acceptable trade-off there. That's really arisen a lot in the conversations around AI usage for end users.
[25:53] Say you're going to plug your own personal healthcare data into a provider.
[25:56] If you get results back that you're happy with, you may be willing to have that trade-off. Organizations are less willing to have those trade-offs with consumer-based AI.
[26:04] They want to make sure that they've got controls over the usage of AI in their own environment for their specific data, that it's being leveraged by the right sanctioned providers.
[26:12] The sanctioned providers are the ones they have enterprise agreements with, so they've said, we're comfortable with how they protect the data, how they maintain privacy,
[26:20] et cetera. So shepherding users toward those sanctioned environments, the ones the organization has legal agreements with and the ones where we can assure protection for our users and our data, is really important.
[26:30] But yeah, the AI universe is definitely a really interesting area, because we all don't fully appreciate how that data is going to be used in five or ten years, all the stuff that's being learned about us now. Some orgs are stepping their way into it very delicately,
[26:43] in a lot of cases, and other orgs are jumping in with both feet. The good thing is we provide the mechanics to let them decide how far they want to go and make sure they have a clear set of controls around how AI is used in today's environment as it's arising.
[26:55] And then obviously,
[26:56] you know, over the next few weeks and months, as it continues evolving, the capabilities continue moving with that evolution.
[27:02] Because AI is like water, it's a shapeshifter.
[27:05] And, you know, we want to make sure the technology that it lives within can shift along with its movements, et cetera.
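The control Bradon describes above, letting data flow to sanctioned AI providers the organization has agreements with while governing the unsanctioned ones, amounts to an allowlist check at the point of use. A minimal hypothetical sketch (the hostnames and policy shape here are invented for illustration, not Island's actual configuration or API):

```python
# Hypothetical allowlist policy: sanctioned AI providers the org has
# enterprise agreements with, checked before corporate data leaves
# the workspace. All names are invented for illustration.
SANCTIONED_PROVIDERS = {
    "enterprise-gpt.example.com",
    "approved-ai.example.com",
}

def check_destination(hostname: str) -> str:
    """Return the policy action for sending corporate data to this host."""
    if hostname in SANCTIONED_PROVIDERS:
        return "allow"
    # Unsanctioned consumer AI: block (a real policy might warn,
    # redact, or log for audit instead).
    return "block"

print(check_destination("enterprise-gpt.example.com"))   # allow
print(check_destination("chat.personal-ai.example.com")) # block
```

In practice the decision point would sit where the user works, as discussed in the episode, rather than in a network appliance, but the core policy question is the same: is this destination one we have an agreement with?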
[27:12] Debbie Reynolds: One thing that I want to talk to you about,
[27:14] get your perspective on, and this is to me,
[27:17] one of the bigger privacy challenges that companies have an extremely hard time grappling with, and that's that privacy, a lot of times, is about context,
[27:27] right?
[27:28] The context in terms of who's looking at stuff, when they're looking at stuff, what they're doing with data.
[27:34] And then also for systems that retain data over a long period of time,
[27:40] like you were saying, in certain situations, based on how someone's using certain data, you may have a watermark, you may have some type of message that comes up. But a lot of that is about a journey, the journey of the data.
[27:53] And I think a lot of companies don't know how to manage that. But give me a little bit more information about how Island deals with those things.
[28:04] Bradon Rogers: I'm glad you called that out. You mentioned the journey of the data. We call it data lineage. The lineage of the data, where it began,
[28:10] how it evolved, how it moved through the system.
[28:13] But where was it guardrailed, too? So there was an attempt to move this data: it began over here in this system and moved from here to here to here, this file, for example.
[28:21] And then someone over here tried to upload it to a USB stick and we governed it. But having an understanding of what that flow and that lineage look like is very important for an organization to really get their hands around.
[28:31] Again, going back to the AI universe as well,
[28:34] seeing that data move around the environment and inadvertently go to a personal ChatGPT is probably something you want to know about.
[28:41] Now, can you do anything about it at that point, now that it's already gone over there?
[28:44] You can't do anything about it retroactively, other than maybe internal revisions of policy, et cetera. But it's important to understand where those things happened and to have a record of them from an audit standpoint.
[28:55] So if an auditor or someone else comes along, you can show exactly what happened, where we missed, and how we iterated and evolved so we don't have that same problem again.
[29:03] But data lineage is super, super important.
[29:05] And for us, the way that's made possible starts with the application boundary construct,
[29:11] understanding which corporate apps are sanctioned and which ones are not. And as people engage, letting data flow freely, like water again, to use your parallel.
[29:20] Water flows freely to the right places, but it doesn't flow to the wrong places.
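The data lineage idea described above, recording where a file began, each hop it took, and any governed action where a guardrail fired, can be sketched as a simple audit trail. This is a hypothetical illustration of the concept, not Island's implementation; every name here is invented:

```python
from dataclasses import dataclass, field

@dataclass
class LineageEvent:
    """One step in a file's journey: what happened, where, and whether policy allowed it."""
    file_id: str
    action: str     # e.g. "created", "moved", "upload_blocked"
    location: str   # app or destination involved
    allowed: bool   # whether policy permitted the action

@dataclass
class LineageLog:
    events: list = field(default_factory=list)

    def record(self, event: LineageEvent) -> None:
        self.events.append(event)

    def history(self, file_id: str) -> list:
        # The full journey of one file, in order, for audit review.
        return [e for e in self.events if e.file_id == file_id]

    def violations(self, file_id: str) -> list:
        # Only the governed actions, i.e. where a guardrail fired.
        return [e for e in self.history(file_id) if not e.allowed]

# The example from the conversation: a file is created in one system,
# moved to another, and a USB upload attempt is governed.
log = LineageLog()
log.record(LineageEvent("doc-42", "created", "crm-app", True))
log.record(LineageEvent("doc-42", "moved", "finance-app", True))
log.record(LineageEvent("doc-42", "upload_blocked", "usb-stick", False))

print(len(log.history("doc-42")))            # 3 events in the journey
print(log.violations("doc-42")[0].location)  # usb-stick
```

The point of the sketch is the audit story: an auditor can replay `history()` to see the full flow, and `violations()` shows exactly where the guardrail intervened.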
[29:24] Debbie Reynolds: If someone were to ask you, like, what is it that differentiates island from other tools? First of all, I've never seen anything like what you guys are showing me and I'm like blown away by it, but tell me your thoughts.
[29:39] Bradon Rogers: Well, it's funny the approach that everyone else has taken.
[29:42] I go back to some thematics I brought up prior: we built a technology we thought was creative, and then we had to find an artificial way to wedge that technology into the flow of the user.
[29:53] We did that most often with agents and with steerage of traffic. That's what we had to do: we had to put something around it. Our approach is just different, because we live at the last mile, at the actual edge.
[30:04] I'm sure you've heard this term edge before. The actual edge is where the user's fingers are on the keyboard and the things they see on the glass of the screen.
[30:10] And it gives you a truer, much better understanding of what the user's engaging with and how they're engaging it, by literally sitting in the seat of the user.
[30:18] And so that is a markedly different approach to the many, many problems orgs have than the status quo. And the status quo is many, many things.
[30:28] But the status quo that we've leveraged for years doesn't add up to a strategic outcome. The status quo helps me go build a stack of technologies to solve for BYOD, then I've got to go build a different stack of technologies to solve for a merger and acquisition scenario.
[30:41] And then I've got a different stack of technologies to solve for my average end user with some sort of SASE process. Then I've got a call center, so I've got to stand up a VDI infrastructure.
[30:50] At the end of the day for us,
[30:51] we call it tactical decisions, strategic outcomes. Because you make a tactical decision to solve a certain use case, and then you apply that same set of mechanics, that same architecture, against the next use case and the next use case.
[31:02] And its versatility is what makes it so different. We're able to solve an incredibly difficult, incredibly broad set of use cases that have been very hard for orgs, but in a very, very novel way, using super simple mechanics.
[31:16] Debbie Reynolds: And it makes me laugh too. I work with companies across all different industries, and it's so funny because people are like, well, do this technology or that technology. I'm like, all this stuff is built on the same foundations.
[31:30] You may have gotten a tool where they try to curate it for your use case, but at the gut, the tools,
[31:40] they're pretty much the same, right?
[31:42] So what you're doing is using technology that's there and trying to stretch it towards how people actually use data and information.
[31:53] Bradon Rogers: 1,000%. And the good thing is this is as pure as it gets when you talk about the concept of secure by design, because the user feels a very natural experience. They don't feel all the crazy mechanics.
[32:03] Their job becomes easier in the process. Yet somehow,
[32:07] through the magic of this technology,
[32:10] the cybersecurity team becomes better.
[32:12] They don't become worse by giving the user a better experience. Those things shouldn't have to trade off. And then all of a sudden, on the IT side of the house,
[32:18] they're also becoming better. They're able to simplify the stack and reduce the cost of how they operate.
[32:29] And so you get this win across all three of those pillars: the end user, the cybersecurity folks, and IT.
[32:29] Everybody wins without having to make a sacrifice.
[32:31] That's super rare. You don't see that very often.
[32:34] Debbie Reynolds: I agree with that. And it makes me think of what we always talk about as shadow IT, or now shadow AI. I think organizations have always been concerned about people using tools that aren't sanctioned,
[32:49] putting data into systems that are not governed, that they may not see,
[32:53] putting their personal information in places where it shouldn't be. And so what you're doing is creating a system that governs that without the person actually having to think about that.
[33:04] Right?
[33:05] Bradon Rogers: 1,000%. That's exactly the way to look at it.
[33:07] Debbie Reynolds: That's fantastic. That's fantastic.
[33:10] Bradon Rogers: I think one thing, one thing that's really important to remember is for us it's a complete platform. It's not just about the browser. The user's workspace exists in web based apps and non web based apps, it exists in virtualized apps where you may want to engage those, but also in locally installed apps.
[33:26] And it also exists with resources that need network connectivity. So it's very important that we not only stand up the pillar of the enterprise browser. That came first because the browser is the center of the universe, where the user does most of their work.
[33:41] But they may also work with a specific thick app. We want those policies to live there, and we want that thick app to have the connectivity it needs to the backend.
[33:47] But it's consideration again for the workspace of the user.
[33:51] At the end of the day, it's not a consideration for cybersecurity alone or just IT alone. It's about what the user needs to do their job effectively and smoothly, without the cumbersome stuff that they bump into every single day.
[34:02] And that's what we do at Island. It's an exciting time.
[34:05] Debbie Reynolds: Yeah, it is.
[34:06] A dear friend of mine used to say that people don't want technology, they want stuff that works.
[34:12] That's true. So being able to give them something that helps them do their job, I think, is tremendous. Also, and I want your thoughts about this:
[34:21] for a lot of companies,
[34:22] because they can't figure out how to do this,
[34:25] it creates a chilling effect where they're not sharing data or not getting value, because they don't know how to do it or they're afraid of doing it. What do you think?
[34:35] Bradon Rogers: There's certainly, many times, paralysis, or being overwhelmed.
[34:40] And if you work around cybersecurity for any period of time, it can be an overwhelming proposition.
[34:46] There's a gazillion different approaches to traditional problems that people have.
[34:51] Again, I go back to the various tactical decisions you make around traditional problems that don't drive strategic outcomes. And you wind up with this massive stack of tech that's just,
[35:01] well, a bit of a Frankenstein.
[35:02] And that for many folks can be an overwhelming experience.
[35:07] And I think a lot of orgs feel that pain today, and they're trying to find an alternate path. We hope to be a core part of the strategy helping them evolve to that alternate path.
[35:16] Debbie Reynolds: So, yeah, are there any wins or use cases that you can tell us about? Companies that maybe had barriers and were able to unlock new opportunities as a result?
[35:28] Bradon Rogers: Yeah, 1,000%. We've got a gazillion of them. I mentioned earlier this concept of tactical outcomes or using the tactical to drive the strategic outcome.
[35:37] And that's kind of the approach we take. I see organizations all the time: they find a win over here, and that leads to the next win, which leads to the next win.
[35:44] The next thing you know, they're rolling it out to everybody in the organization.
[35:47] One org, great story,
[35:49] You know, in the pharmaceutical universe,
[35:52] started off with summer interns,
[35:53] went so well they onboarded a thousand summer interns without shipping them devices, and were able to get rid of the virtualization environment. Then it went so well they said, well, what about this acquisition we're about to do?
[36:03] Could we use this to onboard the acquisitions? Yeah, 100%. Let's walk you through that. So tactical decision here led to an additional tactical decision and then continued down the path, use case after use case to the point where it becomes the technology everybody uses for accessing applications.
[36:17] Very strategic outcome for them. But I've got example after example where it started in some little corner case that was exceptionally hard.
[36:25] It starts, usually and hopefully, in the hardest use case they have, because everything after that is easier. So we go solve the hardest one,
[36:32] then attack all the easier ones afterward. And we see that march through orgs very often: solve the exceptionally difficult one, then move on to the much easier ones. And the next thing you know, everybody in the organization's got it in some fashion or form.
[36:45] And when I say got it, I mean they've got the Island platform, whether it's our full browser, our extension living in the existing browser they've got, or Island desktop living locally on the machine to protect the thick apps the user's engaging, all with the same policy over the top of it.
[36:59] But at the end of the day, I love for people to bring us their hard use cases.
[37:03] Debbie Reynolds: I agree. I like hard problems too, so I'm glad. I think we're in alignment there for sure.
[37:10] But if you had one wish, Bradon, anywhere in the world, about privacy, what would be your wish?
[37:17] Whether it be technology,
[37:19] regulation or human behavior?
[37:21] Bradon Rogers: Yeah, I think it's a wish that's probably outside of my reach or even maybe outside of technology's reach.
[37:26] But I watch practitioners every day grappling with misaligned privacy mandates and regulatory mandates. I watch it just in the States alone: a lot of people working in the universe of financial services wrestling with different parts of government that all have their own design point, but they're not all aligned.
[37:47] And so now I've got to have different teams of people, or at least I've got to approach the problem for this entity over here this way and this one over here differently.
[37:54] Yet they should all be aligned; there should be some symmetry across these. You see that especially when you get outside the country. Adhering to 10 different privacy laws simultaneously is really hard for an organization.
[38:06] And specifically on privacy, again, for the folks that deal with privacy every day, I'm sure they would love to see something that was a little more unified.
[38:14] Again, it's a very ambitious journey, because everybody in their own myopic universe, the geography, the sovereign state they live in, wants to control things and have the policies they need for their own people.
[38:24] And again, we need technology that can be versatile enough to help them meet those needs without some of the chaos they've undergone. We believe we're probably the closest thing to helping them on that front.
[38:34] But, yeah, it would be a lot easier for a lot of practitioners I work with if everybody had the same thought processes and had to live up to the same mandates.
[38:40] But, you know,
[38:42] wishful thinking, I suppose.
[38:44] Debbie Reynolds: That's a good wish.
[38:46] I agree with that. I agree with that wish.
[38:49] Oh, my God, I know. Even if we never get a federal law, and I'm not holding my breath for that,
[38:56] even if we just make the language and the definitions and the requirements similar,
[39:04] that would be so much easier.
[39:07] Bradon Rogers: Yeah. 1,000%. I watch practitioners spinning like a washing machine, always on spin cycle. It never stops.
[39:14] Debbie Reynolds: Yeah.
[39:15] Bradon Rogers: Very hard.
[39:15] Debbie Reynolds: It's true. It's true. It's very, very, very hard.
[39:18] Well, thank you so much, Bradon, for being on the show.
[39:22] I really appreciate it and definitely let people know how, if they want to learn more about Island. How do they reach out?
[39:29] Bradon Rogers: Yeah, just start with our website, www.island.io.
[39:33] Start there. There are a lot of great resources there for you to learn from, and there's a Contact Us page; we'd love to engage with you. Feel free to reach out to me on LinkedIn as well.
[39:42] I'm pretty easy to find, and you see my name here on the podcast. If you reach out to me there, I'll connect with you and help steer you to the right people in the process.
[39:50] But hopefully we get a chance to work with anybody out there that's listening and watching today. And, Debbie, thank you for having me as well.
[39:58] Debbie Reynolds: Yeah, well, I'm excited because I love to see people putting on their thinking caps and using technology in a way that can really, really solve really hard problems. So you guys are doing a great job.
[40:11] Bradon Rogers: Thank you. I appreciate that. Really do.
[40:13] Debbie Reynolds: All right, well, we'll talk to you soon.
[40:15] Bradon Rogers: Thank you all.