E3 – Allen Woods


SUMMARY KEYWORDS

people, nature, cookies, rest, data, law, code, point, posts, page, matter, edict, means, government, EU, itar, website, terms, cookie, UK

SPEAKERS

Allen Woods, Debbie Reynolds

 

Debbie Reynolds  00:01

Hello, this is Debbie Reynolds of "The Data Diva" Talks Privacy Podcast. Today, our special guest is Allen Woods, who is a genius, in my opinion. He's a self-taught computer whiz, as well as someone who's very deeply entrenched in data organization and also data privacy. He created his own website, works with structured data, and is very much an expert on the Internet, how the Internet works, and how individuals' rights need to be protected when you're dealing with websites. Very happy to talk with Allen today. He's been such a mentor and good friend to me over the years, and we welcome him. You and I have had so many conversations via LinkedIn, in posts and comments and private messages. I know sometimes I'm up late at night and you're texting: oh, you should look at this, you should look at that. I think at one point we sort of geeked out about ITAR (International Traffic in Arms Regulations). But I'd love for you to introduce yourself. I know that you're retired, and we're trying to keep you from being retired, pull you back into other things, but you have a fascinating background: military, you're interested in technology, you're very skilled in structuring information and systems. And I'm always fascinated by your work related to websites and how that impacts individuals.

 

Allen Woods  01:35

I'll give you a brief introduction for anybody that wants it; it's the same kind of thing I put up on LinkedIn. When I was 17, I was a bit of a rough at school, left school with no qualifications, joined the military, and spent the next 24 years doing what soldiers do, which is basically getting drunk all over the planet. At around about the 15-year point, I decided to retrain in something after many years of operational service, and that happened to be IT. I did the classic fund-yourself-through-college thing over the next seven years, and also spent the last two years of my service in a Military Small Systems Group. That's basically where I learned my trade. In terms of training and qualification, I mean, I was training for seven years, on and off, in IT. By the time I left the army, I was described as the backbone of my corps' programming effort by my then boss. I then studied for longer, ended up being a chartered member of the British Computer Society, and spent the rest of the time working for the Ministry of Defense on all kinds of things. I think it was partly because I retrained and qualified in IT at the point at which PCs started to proliferate, and I could make the things do things that other people couldn't. My last job was riding shotgun on the user assurance testing for the Voyager program, which is the tankers. I wrote the FEOD's first working health and safety information system, which is how I got on to ITAR, because that brought me into all kinds of things to do with hazardous materials. And just about everything inside of defense, quite obviously, is about information management, security like you wouldn't believe, and so on and so forth. I retired two years ago now. But when the GDPR came out, I took a look at it, got to Article 17, and had the biggest smile on my face. It's just not real.
Article 17 is the right to be forgotten. It struck me that even inside the wire, given the information management capabilities which the military has, which are extensive, there were things in Article 17 by itself that I knew were extremely difficult to pull off, document redaction and so on being just one of them. So I decided to post just an old geek's witterings, for want of a better phrase. I posted regularly, once a week at about the same time, and by the time I left, I was getting 5,000 views, so it must have been going down well somewhere. I was publishing stuff on the basis that one of the difficulties with all of this is that it is hideously complicated; there's no way around it. So I decided to try and post by explanation and demonstration. The posts I would put up would be documents with slide decks, with a video over the top, so hopefully it suited everyone's learning styles. It was an interesting thing to do. What it also did was teach me a lot. And one of the things that I'm quite happy saying is that the law is flawed in so many ways. It is very much behind the curve, technologically speaking. But it's the best they've got at the moment. And that's me in a nutshell. Well, now I'm just an old guy sitting in a cupboard, doing Waldorf impressions, with a dog. Yeah, you'll hear him, surely. He's a little tiny terrier; we try to pass him off as a toy Alsatian.

 

Debbie Reynolds  05:32

Oh, my goodness. Well, I would love to hear your thoughts about Schrems II. We were talking about it, actually joking, before we even started recording; we just got into the thick of things. But I would love to hear your thoughts about that decision and its impact.

 

Allen Woods  05:51

I think it is probably a very good dissertation on the law and the nature of the conflict between EU law and the way US law to do with data transfer is structured. But I also think it was technologically illiterate. And the reason is that it was, in effect, a no-warning edict. When I read it, and it was another jaw-dropping moment, you go back and read it again. What it did for me was this: I used to run a website which contained all of the stuff I put up, and it did things like show browser fingerprints, and the judgment brought about a need to review it. I was wondering how I could fund the website and keep it going on a voluntary basis, and one of the things I decided to do was look into putting up one of these donate buttons. I put the code up and then thought, well, best check it, see what it's doing. And that took me into a world of complexity; it's just not real. There was nobody doing anything illegal. But what the Schrems II judgment meant was that any transaction data going into the US just wasn't viable, given that the judgment was, in effect, an edict. Now, again, it's a matter of the way the world is. I think it's been astounding over the last five years the way the world has become even more connected than it was, say, ten years ago. We've got a vacuum cleaner, a robot vac; I've watched it go around the room, and it's mapping my house. So now I trap it between four chairs and let it bash its head against them, sooner than let it continue. The fact of the matter is that you can make as many judgments like that as you like; implementing them is another thing entirely. And especially as an edict with no warning, it just made no sense to me. None whatsoever. And it goes from the big things, like where cables come from the US to Europe, undersea cables and all the rest of it; the vast majority of them rock up in the UK.
And again, because of other things like Brexit and what have you, that's going to present, to my mind, even more transfer problems. The law just isn't geared to cope with it, and furthermore, people can't cope with it and comply with the law as it stands, I don't think, without a whole raft of other associated tasks.

 

Debbie Reynolds  08:31

I feel like there's always this tension between law and technology. Obviously, technology moves farther ahead than law. But I think it's a danger when laws are being passed about technology without an understanding of what it takes to actually accomplish what's being asked.

 

Allen Woods  08:54

That's right. I mean, it is a matter of just the scale of the business going through London. In terms of money, it's in the trillions of dollars. And it's all very well, people making a decision to say you must stop transferring data because of FISA or whatever. It is a matter of how do you stop it. Because nobody in their right mind is going to stop that volume of data going through anywhere, really. So the judges were no doubt right in what they said, but the way they went about it, as an edict, just wasn't feasible.

 

Debbie Reynolds  09:39

Yeah, I totally agree. I would love your thoughts about Brexit. I just posted a video last night about Brexit.

 

Allen Woods  09:48

Well, we are never going to get adequacy.

 

Debbie Reynolds  09:51

You don't think so?

 

Allen Woods  09:52

No, I don't at all, in much the same way and for much the same kind of reason. There will never be an agreement between the US government and the European governments about the nature of the legal protections being offered. And if we just take the US for a minute, much of the law that the EU objects to stems, I believe, straight out of 9/11.

 

Debbie Reynolds  10:17

Oh, absolutely.

 

Allen Woods  10:18

And there is no way on this planet that any US President or Congress is going to change that at the behest of a foreign government. The UK has similar laws in place, the Investigatory Powers Act and what have you, and the UK Data Protection Act contains some rather thought-provoking clauses, I think is the best way to describe it, which I can't see the EU agreeing with. And the same situation applies. There's a bit of brinksmanship going on, but there also has to be a bit of practical reality.

 

Debbie Reynolds  10:51

Yeah, that's true.

 

Allen Woods  10:53

But as adequacy stands under the GDPR, I don't think it's possible or feasible.

 

Debbie Reynolds  10:59

Wow, I was hoping that it would be, but I knew. Maybe we could talk a little bit about the CLOUD Act too; I did a video about that I'm posting. As you said, the laws of the US related to surveillance, like the Patriot Act, are over and above the consumer protection laws we have in the US. And when we talk about privacy laws in the US, we're mostly talking about consumer laws. So the US is more consumer-based versus human-based in the UK and the EU, and that's very different. Not every human is a consumer, so we have more gaps in our laws related to privacy than you all have. But then there's this issue, and I feel like it's happening all around the world, where people are kind of taking sides on surveillance: either we want a lot of it, or we don't want any of it. That's the sticking point, a point that a lot of people aren't going to come off of one way or another.

 

Allen Woods  12:09

There are several things I would ask people to consider about that. EU law, as I understand it, has as its foundation document the European Convention on Human Rights, or one of its parts. Article 15 in the ECHR, which very rarely gets a mention, basically gives the state or government a cop-out from everything else that's produced. And it does that by saying that when there is an existential threat to the state, the state can do whatever it needs to protect itself. Now, to all intents and purposes, that's the way I see the Patriot Act, CLOUD, FISA, and all the rest. The government has a duty to protect its citizens. You can see the abrogation of the remainder of the rights to do with data protection in the emergency powers acts to do with COVID, in pretty much every country, because basically the GDPR has been ignored by governments for all sound reasons. And there has been an overemphasis, as far as I'm concerned, on the importance of the GDPR without putting it into the context of the rest. That, too, is something I find curious. And I go back to the stuff I was doing at the MOD and its systems. At one stage we were doing some work to do with something that the military does that's generally lethal; it doesn't matter. But basically, I had to ask: how long does this safety documentation need to be on the system? And the reason was, you could see that the size and footprint of a document file was growing, from around about 360 kilobytes up to the biggest one at 75 megabytes, because everyone's been helpful and put images and videos and all the rest into what is nominally a Word file.

 

Debbie Reynolds  14:07

Right.

 

Allen Woods  14:08

Now I asked the question, and the response was, well, we've got to keep them for as long as the life of a human being. The immediate response to that is, well, how long is that? And it worked out to the suggested working age, from 18 to 65, which meant, in principle, something like 50 years these things have to be hanging around. Now, the fallout from that is sizing and all that kind of stuff. You can't make these laws to do with tech without understanding the nature of the implications for things way beyond processing, way beyond the idea of a person. Because this stuff has to sit there; it just does. And that becomes more and more complicated the more stuff you store.

 

Debbie Reynolds  15:02

And the longer you store it, too, because you run into legacy issues, legacy systems.

 

Allen Woods  15:08

Yes. In the case of this system, the intention was, for hazardous materials, which was several million items from the NATO stock catalog, each with a health and safety sheet or a safety data sheet, each with multiple versions or more, that the growth in size, from kilobytes of plain ASCII text right the way through to stuff carrying videos and all the rest, meant terabytes of storage provision needed to be made just for these forms. The thing about government is, if it makes rules, then the first thing it's got to do is abide by them. And that's true in all government organizations; it doesn't matter which government it is, it really doesn't. There are probably server rooms like you wouldn't believe, three or four stories down, holding stuff.

 

Debbie Reynolds  15:59

Yeah, totally, totally true. There are two things that you and I always talk about that I love to talk about. One is cookies. For me, I feel like there are these people who latch on to cookies in a way that's kind of unreal. A lot of the legislation that people are passing about cookies is, to me, fixating on this one particular way that data is deposited. But while you're doing that, other ways are being created that are probably much worse. So I feel like, while people are talking about cookies, by the time these cases even get finished, we won't even be using them anymore. We're sort of missing the point. We're trying to say, let's fight cookies, let's come up with some type of law about cookies, but I've done a video on beacons and other things, so something totally different will be created in two or three years, and then people are going to try to chase that with the law. What are your thoughts?

 

Allen Woods  17:04

No, I am bemused by the whole cookie saga, for the same kind of reason as you're describing. There are several things that people don't seem to get about cookies. First of all, they can't track. And if I may, we'll come back to why I don't think Google is tracking either, nor is Facebook, nor is anyone else; there is a massive collection exercise, though. But cookies themselves are small binary-value files, except that they're not anymore, because each browser treats them differently. They are not there to track. They perform the same kind of function as a Facebook beacon, for instance, which people have leaped onto.
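To make the point concrete, here is a minimal sketch (not from the episode, added purely as illustration) showing that a Cookie request header is nothing more than named values the browser echoes back; there is no tracking logic in the format itself:

```javascript
// A Cookie request header is just "name=value" pairs separated by "; ".
// Parsing it shows there is no magic in the format.
function parseCookieHeader(header) {
  const cookies = {};
  for (const pair of header.split("; ")) {
    const eq = pair.indexOf("=");
    if (eq > 0) {
      cookies[pair.slice(0, eq)] = pair.slice(eq + 1);
    }
  }
  return cookies;
}

// Example values a site might set: a consent flag and a session id.
const jar = parseCookieHeader("consent=yes; session_id=abc123");
console.log(jar.consent);    // "yes"
console.log(jar.session_id); // "abc123"
```

Whether a cookie value ends up correlated with a person depends entirely on what the server stores against it, not on the cookie mechanism itself.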

 

Debbie Reynolds  17:43

Exactly.

 

Allen Woods  17:45

And you're quite right. There are other, more sophisticated tools and techniques in place that render cookies, to my mind, totally redundant. I don't understand the fascination at all.

 

Debbie Reynolds  17:58

Yeah, me either. I really don't.

 

Allen Woods  18:00

I published a DPIA on the donate button, and that, to my mind, illustrates why cookies aren't important. They're just not. There is another website analytics tool that everyone seems to use which, when you look at it, employs a variety of techniques to capture information about an end-user device. There is a bit of code in it, and when I worked out what it was doing, I was in awe of the coders who had written it, because it was very, very smart. It had to do with the idea of page rendering, and the sequence of page rendering from the point of view of the requesting device. What this bit of code did is what's called asynchronous comms. In other words, a page can send a message back to a host server or something without the page refreshing or anything like that; it's all done in the background. The acronym used for it is AJAX. Now, this particular bit of code was taking a 2,000-character fingerprint of every single visitor, and on every single page you use this analytics tool on, it would take another fingerprint. There isn't a cookie in sight; during the placement of the query in that code, nothing happens with cookies at all. I'll send you the stuff later on. The more I see of it, the more I think, well, nobody who knows what they're doing is using cookies anyway.

 

Debbie Reynolds  19:48

No. Why would they?

 

Allen Woods  19:50

Not for tracking anyway.

 

Debbie Reynolds  19:52

Absolutely not. I totally agree. Now, there's another thing that you talk about a lot, and I would love for you to expound upon it: you advocate that people, even people who think they understand privacy from a legal perspective, need to understand code, how to read code.

 

Allen Woods  20:12

Yeah, that's right. It's not a matter of being coding-proficient. I mean, I've been coding for 30 years, and there are people, I think, who would leave me for dead. It's not about that. It's about understanding, certainly when it comes to a website, which is by far the most popular means of connecting to the web nowadays, things like page rendering and calls to computers and data networks external to your own site, and knowing what's going on. And above all, it's about understanding what's being delivered inside a client machine. Now, that doesn't mean specifically being able to write code yourself, but it does mean that you understand what's going on. Take a library which is quite popular, an open library called jQuery. I use it; it saves so much time, it's just not real. Now, if you know how to look in a page, you can see that jQuery is called using a single include statement; that's only one line. But depending on where you request the library from, what happens is the request is passed to that domain, let's say it's on GitHub or somewhere, and it supplies the code jQuery consists of, which is some 11,000 lines. It's not a small thing by any stretch of the imagination. Now, it's open source, but open source is not without responsibility and liability. What that means, then, is that when you drop code into a client machine, as the site owner/controller, you are responsible for understanding what that code actually does. You then extend that by the various add-ins, the jQuery UI and so on, and what you find is that the average web page, which may contain, say, a couple of thousand lines of HTML and a mix of JavaScript, actually becomes a very complex piece of software in its own right. So it's not so much being able to write this stuff.
I mean, once you start coding, it's like a disease; it rots your head, you get fixated on solving a problem. It's just not real. But it is about understanding a bit more of what people like me get up to. There are more instances; let's take the British Airways data breach, which has finally come to fruition.
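As a hedged illustration of "knowing how to look in a page": a single include line such as `<script src="https://code.jquery.com/jquery-3.6.0.min.js"></script>` pulls roughly 11,000 lines of third-party code into the page. A crude scan (regex-based, illustrative only, not robust HTML parsing) can list the external hosts a page calls code from:

```javascript
// List the external hosts a page requests script code from.
// A quick way to see how much of a page's behavior is supplied
// by machines the site owner does not control.
function externalScriptHosts(html, ownHost) {
  const hosts = new Set();
  const re = /<script[^>]+src=["']https?:\/\/([^/"']+)/gi;
  let m;
  while ((m = re.exec(html)) !== null) {
    if (m[1] !== ownHost) hosts.add(m[1]);
  }
  return [...hosts];
}

const page = `
  <script src="https://code.jquery.com/jquery-3.6.0.min.js"></script>
  <script src="https://example.com/app.js"></script>`;
console.log(externalScriptHosts(page, "example.com"));
// → ["code.jquery.com"]
```

Every host in that list is a machine the page surrenders control to, however briefly, which is exactly the responsibility point being made.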

 

Debbie Reynolds  22:47

Oh, yeah, that's the good stuff.

 

Allen Woods  22:49

That was a classic skimming attack, under the general banner. What skimming means is that somebody got some code into somewhere in the BA world, and it didn't go rooting through the servers, from what I gather, but actually latched on to end-user machines. Now, unless you know the nature of page rendering, at least in passing, and you know the importance of testing what goes on to a client-side machine, then the level of business risk you are carrying is just unreal. And there's the nature of the legal risk, too. I know the ePrivacy Directive from the EU is shortly to be shelved for something else, because it's too complicated to write, but as the document stands, page three, para 24, describes the idea of a sphere of privacy, or a privacy sphere. What that actually sets out is a boundary around an end-user device that you should not cross without consent, for just about anything. And every single geek I've shown it to has looked at it, read it, and had an oh-my-god moment, because of what's going on in the pages they build. So it's not so much about becoming a coder. As far as I'm concerned, the number of verbs that people need to know about and be able to detect isn't actually that many, given the richness of JavaScript, or PHP, or all the rest. But there is a need to understand the kind of things that are going on when you call code from another machine anywhere in the world, and what that machine delivers or may deliver, because basically what you're doing is surrendering control to the other machine. It may only be for nanoseconds, but there's an awful lot you can get into a client-side device in that very short period of time. It's that that I wish people would take more notice of. And if I may mention a name, there's a guy called David Knickerbocker.
He's doing some, I think, really, really clever work on graph theory, and I'll lapse into geekspeak for a little while, but it's about a different way of looking at data on an ontological basis. One of the things Amazon has sussed is the nature of the ontological relationships between things. And if you don't know the code that is being dropped, then people like me can do all sorts with it. One of the reasons why the BA hack was so spectacular, I think, was that it got into machines at the client side without most of the clients even knowing; you're just accepting this is BA, and it must be right. But it clearly wasn't. And it comes down to not testing websites and code delivery into a client device, which is relatively straightforward to do. But you do need to be more technically savvy than most people seem to be.

 

Debbie Reynolds  26:06

Yeah, I totally agree. I would love for you to talk about terms and conditions. No one wants to read these things, right? They're 80 pages long. I think it's a psychological trick: they know that people aren't going to read 80 pages; they just want to get on with whatever it is they want to do. I do read the terms and conditions, and it is very boring, very legalistic. But it's important, because you really don't have any clue what it is exactly that these tools are doing unless you read them.

 

Allen Woods  26:47

My introduction to T&Cs was a task I did to do with a government outsourcing project. My job on that was to take what were called the top 20 programs, applications; and the reason they were called the top 20 was that they each had an annual maintenance fee of around about six or seven figures. It was a due diligence exercise, and we had to go through them line by line; that was the job. It was one of the most boring, horrendous things I have ever done in my life, and I reckon the people who draft these things have the patience of a saint. But we had to go through line by line, and as we gained more knowledge, more and more we noticed things that were actually legally very clever, that played on a number of internal procedural disjoints where, again, nobody had realized the nature of the risk. One of them was the acquisition process for large-scale programs. Basically, what I'm getting at is that the person who requests a complex piece of software to be built will hand it over to an acquisition manager, in other words, somebody who knows how to source and buy something, who will then buy it and give it to somebody who is the integration tester and what have you. Then it'll eventually get to the point where it will be installed, and the guy who's installing it will take a CD, or whatever medium it is, and install everything. Now, that's quite common. And what the majors do is tailor the terms and conditions to the nature of the installations being carried out and the changing market conditions. That's fine, but what it means is that far too often, more stuff has been installed than has been paid for. And again, it becomes a matter of the terms and conditions being written to protect the intellectual property rights and so on of the people who are writing the software.
Fair enough; it's a big investment, and they want to limit their legal liabilities, but at the same time they transfer significant liabilities to the end-user. So it's a responsibility to check this stuff, on the basis of nothing else than caveat emptor. But as you say, people don't, and the impact of that is a business risk I would choose not to ignore. Now, if I can't write software, then I'll buy it, and I'll read the terms and conditions first before I consider buying. I mean, at one stage we had a visit from one of the heads of the shed at one of the majors, and he brought with him a young technician. What we did was take the young technician into the mess, get him legless, and then grill him about what it actually meant. No names, no pack drill, but what they had done was take into account the idea of parallel processing, multi-threaded processing, and all the rest of it, and adjust their license terms to suit. And that's fine, providing you are aware, and far too many aren't.

 

Debbie Reynolds  30:27

Right. Exactly.

 

Allen Woods  30:28

Because what they were doing was taking advantage of, or, sorry, exploiting the sophistication of processing that was becoming possible, regardless of whether or not you actually use the stuff. And again, there's nothing illegal going on; it's just good business from that perspective. But it is a matter of caveat emptor with a vengeance. Most recently, in fact in the last couple of days, I found something, again, tracking down a cookie and what was being dropped. I followed the code, and I now know roughly what the cookie does, what places the cookie, what the after-effect of placing the cookie is, and, perhaps more importantly, the nature of the licensing that's behind the hosting mechanism. As far as I'm concerned, that means that pretty much the ownership of the whole of the sites that use this capability rests with the people providing the hosting. Again, that's fine, but there is a matter of ownership in there which, if not being abused, has definitely been played fast and loose with, as far as I'm concerned.

 

Debbie Reynolds  31:46

I agree with that. I agree with that. I think, especially with all these new technologies coming out, they're almost assuming that you're a cyber expert, which you would have to be to understand what's going on and how to really protect yourself. It's like giving a piece of steak to a baby.

 

Allen Woods  32:06

Yes, it is. I decided that when I go up to my fridge, and it tells me I can't have any ice cream, me and it are gonna fall out over an axe.

 

Debbie Reynolds  32:14

That's right.

 

Allen Woods  32:16

Because that's what's happening. You buy a fridge now, and it's getting to the stage where you must have spare IP capacity on your router in order to get the bloody thing to work. But they're doing that for a reason.

 

Debbie Reynolds  32:32

Yeah, exactly.

 

Allen Woods  32:33

I'd read somewhere that there has been a battle over the last few years to get inside the house, your domestic residence, and all the rest of it. That's largely been sorted out. And I think...

 

Debbie Reynolds  32:44

I think we lost.

 

Allen Woods  32:50

Like I say, if the fridge says no, you can't have any ice cream, it's getting it; there's no messing about. But it's a matter of the nature of data gathering. People use the word tracking; I don't like it, because it implies tracking of individuals, and I don't think they're doing that. Or rather, some are, but the vast majority aren't. Jeff Bezos came out with a quote about a shirt. What he said was, I'm not particularly interested in one person buying a shirt, but if I can get an idea of who the 500,000 people who might like that shirt might be, then that becomes extremely useful. And I think the better way to look at it is that there is a massive collection and collation exercise from which you can draw fairly accurate inferences about the way people are going to be living their lives, for want of a better phrase, not only in detail but in the nature of the things you are interested in. And from their point of view, it's absolutely the right thing to do, especially while it's feasible.

 

Debbie Reynolds  33:56

Right. And you talked about this documentary called "Margin for Error." We were talking about the skimming of tweets, or skimming in general, and I think it illustrates that point about inference really well.

 

Allen Woods  34:11

Yes. If you cast your mind back to just after the GDPR came onto the statute book, the UK ICO raided Cambridge Analytica. The ins and outs of the story are no doubt well publicized. But one of the things that legislation like the GDPR hinges on is the idea of the person as the thing, and PII, personally identifiable information, and all that kind of thing. Well, if you have access to enough data, and this comes back to the word ontology and the nature of the relationships between things that you can exploit, especially if you have composite data sets that are huge, above the terabyte level, then you can discover things about socio-economic groups. And you don't need to target individuals to do that. What you do is identify the socio-economic groups and then just wait until they crop up on your radar and put the right stuff in front of them. You don't need to hunt them down; they'll turn up.

 

Debbie Reynolds  35:21

Right.

 

Allen Woods  35:22

Now, this Margin for Error is a slightly different take, as I see it, on the CA thing, in that what they do is take tweets and use them for polling purposes, in the same way as MORI or Gallup do, with the aim of trying to identify voting intent or anything else; voting just happens to be the thing that's mentioned in the documentary. One of the things they say in the program is that they do not target individuals. What they are looking at are common words or phrases that score as of interest to their customers. So eco-friendly words and phrases would tend to indicate that somebody is interested in greenhouse gases, that kind of thing. Now, there is lots and lots of work going on under the banner of natural language processing, machine learning, and so on, which means that if you can get access to lots and lots and lots of data, you might not get it right the first time, and you might not get it right the second, but it's just the natural maturing of processing capability, because you will get it right.
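The group-level keyword scoring described here can be sketched in a few lines; the phrase list and sample posts below are invented purely for illustration:

```javascript
// Sketch of group-level scoring: count how often phrases of interest
// appear in a batch of posts. No individual is targeted; the output is
// an aggregate score for the whole sample.
function scorePosts(posts, phrases) {
  let score = 0;
  for (const post of posts) {
    const text = post.toLowerCase();
    for (const phrase of phrases) {
      if (text.includes(phrase)) score += 1;
    }
  }
  return score / posts.length; // average phrase hits per post
}

const ecoPhrases = ["eco-friendly", "renewable", "carbon footprint"];
const sample = [
  "Loving my new eco-friendly kettle",
  "Switched to a renewable energy tariff",
  "Match was great last night",
];
console.log(scorePosts(sample, ecoPhrases)); // 2 hits over 3 posts, about 0.67
```

Real systems replace the crude substring match with NLP models, but the shape is the same: aggregate signals over a population, not surveillance of a named person.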

 

Debbie Reynolds  36:41

Yeah.

 

Allen Woods  36:42

Now, that becomes something where, if you can say that in such-and-such a state in the US the vast majority are going to vote Republican, and you can do it accurately, then that's extremely valuable to politicians and a whole raft of other people besides.

 

Debbie Reynolds  36:57

Yes, it's true.

 

Allen Woods  36:58

And that's the kind of thing that I think Margin for Error is about. It's not targeting individuals; it is identifying socio-economic groupings, and trends amongst those groupings in what they write and what they do and what they say. Again, it's just a different way of looking at vast quantities of data, but it's a valid way to do it, and it's actually extremely useful if you have a mind to do that.

 

Debbie Reynolds  37:26

Right. I agree. Well, we're at the end of our show today. This has been fascinating. You know, you and I could talk like this for hours.

 

Allen Woods  37:39

I'm gonna send you some stuff to look at. I've mentioned this before the video, and it's about this cookie thing that I discovered the other day.

 

Debbie Reynolds  37:47

Yeah, I would love to read it.

 

Allen Woods  37:49

And it ties in T and C's and all of those things. It is a massively changing world.

 

Debbie Reynolds  37:55

Yeah, it is. Totally, I totally agree. Well, thank you so much for being on the show. I really appreciate it. This was so much fun, and I know that the listeners will really enjoy just listening to us talk. This is basically how we talk, right?

 

Allen Woods  38:11

It is, it is. And to be honest, there are a few people whose posts I regularly look at, and you're one of them.

 

Debbie Reynolds  38:20

Oh, thank you so much.

 

Allen Woods  38:22

Let's be creepy for the month. Just wait until you get the bill.

 

Debbie Reynolds  38:25

Oh, that's hilarious. That's so funny. Well, thank you so much for being on the show. And I look forward to us talking and looking at the documents you're gonna send. Okay?

 

Allen Woods  38:34

No worries. Watch your inbox.

 

Debbie Reynolds  38:36

Okay. Talk to you soon.
