E235 - Carly Kind, Privacy Commissioner, Office of the Australian Information Commissioner, Australia
[00:00] Debbie Reynolds: The personal views expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.
[00:13] Hello, my name is Debbie Reynolds. They call me the Data Diva. This is the Data Diva Talks Privacy podcast where we discuss data privacy issues with industry leaders around the world with information that businesses need to know.
[00:26] Now I have a very special guest all the way from Australia,
the Privacy Commissioner at the Office of the Australian Information Commissioner. Welcome, Carly Kind.
[00:37] Carly Kind: Thank you so much for having me, Debbie.
[00:39] Debbie Reynolds: Well, it's a pleasure to have you here. I've followed your work for a number of years, and I was really excited when I saw that you had taken over this position in Australia.
[00:48] You've already made some pretty significant moves that people are watching, especially people like me, around the world. But why don't you tell me a bit about your journey, your career, and how you came to be the Australian Privacy Commissioner.
[01:01] Carly Kind: Great, thank you. So I started out my career studying law and then practicing in particular in criminal defense law. And that really opened my eyes to the way in which law is a kind of tool of social justice and can advance but also undermine people's rights.
[01:18] So I got really interested in issues of social justice and human rights. And when I was studying law, it was in the early 2000s, and I became very engaged both as a student of law and as a student of politics in issues around the war on terror, national security,
[01:35] and in particular the use of technology to enable state surveillance in the context of the war on terror. So that was really my first kind of trigger to get interested in privacy as an issue, but also more generally around human rights.
[01:50] I moved to Europe because I felt that pursuing a career in these issues of human rights policy would be more challenging in Australia than in places like Europe for a few reasons.
[02:02] But one of the key ones was that Australia still doesn't have a national human rights law or Bill of Rights or anything like that. So there's a lot less litigation in the courts around human rights issues squarely in that space.
[02:15] So I moved to Europe, I got an internship with the UN, and then I spent a number of years working around the UN, in particular with special rapporteurs, which are basically independent experts that advise the UN Human Rights Council on human rights related issues.
[02:32] And then I became more and more interested in privacy. We in particular were looking at the privacy of vulnerable communities and the privacy of people living in poverty, and how at that stage, in the early 2000s, emerging technological systems, in particular in the welfare space, were being used to kind of surveil and track people who were the recipients of welfare benefits.
[02:55] And so there was a kind of privacy threat, I suppose. And then I was fortunate to get a job at an NGO called Privacy International, which is really where I was fully indoctrinated into the world of privacy.
[03:05] And that was about 15 years ago. And since then, I've worked with a range of different charities, human rights organizations like Amnesty and Human Rights Watch, advising them on privacy, technology,
[03:16] free expression, and all of the human rights issues that intersect with technology and the Internet, essentially. In 2019, I took up the role of inaugural director of the Ada Lovelace Institute, which had just been established.
[03:30] Ada is a research institute based in London that is dedicated to ensuring that data and AI work for people and society. And so I spent about five years working on AI governance and researching the impacts of AI.
[03:43] And then I saw an opportunity to come back to Australia, where I hadn't lived for 16 years, and also to try my hand at regulating; I had been very much in the not-for-profit, charity, international organization space.
[03:57] So this was a great opportunity to see what it's like to work in government and what it's like to really use the laws on the books to control and to shape the direction of technology and how it affects people's rights.
[04:08] So it was a very attractive position for me, and I threw my hat in the ring, and here I am a year later, having started last February.
[04:20] Debbie Reynolds: Well, you're doing a tremendous job and we're applauding your work and we're watching you very closely in the things you're doing.
[04:27] I want your thoughts about this. So when we talk about human rights and we talk about privacy, a lot of times people are on different sides of the fence. One side is like, you know, if you enforce privacy, you're going to stop innovation, you're going to have companies lose money.
[04:44] Then the other side is like, the things you're doing with data can hurt people. You're not really thinking about it. And so I guess on one side, some people think of privacy as a right and some think of it as a privilege.
[04:57] So I want your thoughts on that.
[04:58] Carly Kind: Oh, that's a really complicated question. I might need a coffee before I get into that one. I suppose my starting principle is that we as an international community have articulated a set of rights that we think are important to preserving liberal democracy and the dignity of humans.
[05:16] And privacy is a key one. There are only about 20 of them, and privacy is one of them. So it is a really key feature of how we've come to understand the framework that should govern our societies, particularly in the aftermath of the atrocities of World War II.
[05:31] And the right to privacy was very much informed by what happened in Nazi Germany during the Holocaust, and the notion that no human should have their body or their family or their correspondence subject to the scrutiny and intrusion of another, and particularly of a state.
[05:49] And you know, that is my starting point, that the principle is key to who we are as humans. Really, privacy at heart is about autonomy. And autonomy is really key to who we are as individuals and really key to our human dignity.
[06:04] And that is played out in all sorts of different ways. As to whether there's utility in having the right to privacy, I suppose, to go back to your question, human rights are not necessarily about utility; they're about, you know, things like dignity.
[06:18] But I also see that there's a lot of utility in protecting the right to privacy.
[06:23] And I would say that for a few reasons. I think we know that societies in which individuals are free to be who they want to be, and an element of that is being able to think, communicate, and speak without being afraid that they're being watched, without conforming to some ideal articulated by another,
[06:44] those societies are more creative and vibrant and free, and are going to lead to more innovation, more ideas, more progress.
[06:53] So I think at a high level, actually, a society where individuals have a good degree of freedom and human autonomy is also consistent with innovation.
[07:03] I also think that when it comes to technology in particular,
[07:08] privacy has been shown to be good for business. It's actually about being responsible with the data that you have, the data that you collect, and thinking through the risks that might flow from that and mitigating those risks.
[07:24] And if you do that, you're minimizing your risk as an organization. You're also able to build trust. And building trust is really key for individual buy-in, for government buy-in, for B2B services.
[07:36] We know that kind of trust and confidence in technology tools is a really key element to ensure their adoption and their success. And there's nothing easy about technological uptake or digitization.
[07:47] It is a messy, messy human project. And privacy is a really great building block to encourage that. So I don't really buy into the notion that the two should be put into some kind of head to head clash, but rather that privacy can absolutely be consistent with and even support innovation.
[08:07] I would just caveat that with saying that I think as ever in all fields of regulation, we can err too far on the side of more rules, more layers of process.
[08:19] And we should really be reflective of that, as legislators, as policymakers, and in my case, as a regulator who has to apply the law. What we want to achieve is the right outcome.
[08:29] We don't necessarily need to lean into the process just for the sake of it. And perhaps in the data protection space, that balance hasn't always been gotten right. And that's something that we could work on more going forward.
[08:42] Particularly if by getting the balance wrong, we undermine the entire project of privacy. That would be a really bad outcome, I think.
[08:49] Debbie Reynolds: Absolutely. I agree. It is about trust. I think companies understand trust when it comes to maybe a product or service that they use, but maybe they're learning more that part of that trust also is how they handle people's personal data.
[09:03] And there could be dire consequences for that from a business perspective, and from a human perspective as well. I'm just curious: as someone who's worked in the private sector and done other types of policy work outside of regulating,
[09:20] What has been surprising for you as you stepped into your regulator role?
[09:25] Carly Kind: Lots of things,
[09:28] I suppose.
[09:30] I've certainly had my eyes opened to the difference between having a law on the books and really ensuring compliance with that law, and all of the tools that are in my toolkit that can help me achieve compliance with the law.
[09:52] And there are many tools and none of them is perfect. We don't live in a world in which just because something is a law, it means all businesses or all people comply with it.
[10:02] And you know, if we did, we would have no crime, no people, you know, getting parking tickets, all sorts of things. So how do we get businesses to comply with laws that are on the books when it comes to privacy?
[10:13] I think one of the big surprises to me, as I said, is that there are a few different tools. You know, some of them are more effective than others and some of them are easier to achieve.
[10:22] A key point is that some of them are cheaper than others as well. So one of the really effective ways is to go out and talk about what the law demands and to educate the regulated community.
[10:33] Because actually, people and businesses have vastly different understandings of the law. They have really different sized legal teams. You know, small businesses won't have an in-house lawyer, they won't have an in-house team, but big businesses will have whole teams; they'll have chief privacy officers.
[10:48] And so how do we make sure there's a level playing field there? How do we fill the gap for those that don't have the internal resources to do the legal interpretation themselves?
[11:00] Then there's also enforcement. I have a range of different enforcement tools available to me, and potentially the most impactful of those is taking an entity to court. But taking an entity to court for non-compliance with the Privacy Act is a very involved and difficult thing.
[11:15] It requires a lot of work to put together a case, many thousands of documents exchanged via discovery. It also requires, you know, meeting a particular set of legal standards.
[11:28] We are a little in the hands of the judges in how they interpret the law. We are also in a space in which there hasn't been a lot of jurisprudence in Australia around the Privacy Act.
[11:38] So it's a little bit unknown how these judges, who've never really had to interpret this law before, will do that when we end up in court.
[11:46] I think the big surprise to me is just the range of different factors that we have to take into consideration: the high degree of uncertainty, the resource intensity of taking an entity to court.
[11:56] So every time there's something in the news, and this is every day, you know. Last week it was Amazon's changes to policies around Alexa disclosing data back to the cloud, for example.
[12:06] Every time there's something in the news, we get all of these incoming requests: what is the OAIC going to do about it? But the idea of taking on every single one of those matters in a very adversarial enforcement action is pretty fanciful.
[12:20] You know, we're a small regulator. We can't take on every issue that comes before our door, particularly in a world in which there is a new privacy issue every day.
[12:29] So we need to think about how we use those different tools and what is most effective and when. So that's been the big way in which my eyes have been opened.
[12:37] And part of the challenge, but also part of the fun, because it is quite a creative process to think about what's the most strategic move to take in each individual case.
[12:47] Debbie Reynolds: Right. Regulators can't take on every case, but figuring out the best places to use your resources and kind of sending the right message to the business community and to people as well, I think, is a big deal.
[13:02] So what's happening in the world right now that's concerning you most as it relates to privacy or data protection?
[13:10] Carly Kind: Where do I start? Well, I know you want to talk about AI, so I'd start with AI.
[13:17] And I think we're really still feeling our way as privacy regulators about how to talk about AI and the risks that it poses, and potentially the opportunities that it creates around privacy.
[13:29] But I do have concerns about the way in which the, well, the only word that is coming to mind is hype, and it's not necessarily the word I want to use.
[13:33] But I suppose I'm trying to encapsulate how the excitement, the investment, the momentum around AI is incentivizing the scraping of personal information online, the retention of personal information, and the reuse of personal information for all sorts of purposes that weren't originally envisaged by companies when they collected personal data.
[13:59] And that really challenges the way in which we think about privacy protection. Because, you know, the Australian framework is different to the EU framework, is different to the various sectoral laws in the US, HIPAA and COPPA and others.
[14:13] But they're all united by the same principles, which were articulated by the OECD in the 80s. And one of those is purpose limitation: that you collect a piece of information, you use it for the purpose for which you collected it, and you limit that use.
[14:25] And what we're seeing with AI is really an expansion of that use, a challenging or a pushing of the boundaries of purpose limitation.
[14:33] And I think that that is important for a couple of reasons. One is that individuals already feel very out of control of their personal information. They feel like they don't have any agency, that there's nothing they can do about all the ways in which their data is being used and shared,
[14:45] etc. And AI is looming on the horizon, and this is kind of exacerbating this feeling of, I'm not in control.
[14:53] There are also very specific privacy risks, including the disclosure of personal information through generative models, etc. But moreover, I have concerns about the leveraging of access to personal information by dominant companies in the data-driven business model space to establish dominance and power in this new AI ecosystem as well.
[15:16] And how that might shape that ecosystem and create unfair advantage, and may create kind of unequal dynamics. And really, if we're talking about the infrastructure of the future, which many people are calling AI, it's being built off, you know, really problematic business models and a problematic data economy.
[15:35] And I think that that link is really challenging. So that's something I'm thinking about a lot. At a more practical level, the kinds of issues that are coming to our office, and that we see as emerging concerns, all relate to new technologies in some form or another.
[15:49] So biometric technologies. Facial recognition has of course been in the spotlight a lot in the last few years, and our office has done quite a lot on that. But other biometric technologies are beginning to emerge.
[16:00] Not only things like facial recognition, but also,
[16:03] you know, keystroke recognition, eye movement recognition, these kinds of biometrics that are emerging in different kinds of technology. Then there are data-exploitative practices in the online ecosystem, particularly when it comes to location data. We're seeing lots and lots of apps, for example, unnecessarily collecting location data.
[16:23] And I think that this is a big issue of concern for me. Other kinds of tracking technologies, like pixels,
[16:30] are certainly at the top of our list of things to scrutinize. And then we're also interested in specific sectors where there are, you know, real asymmetries of power when it comes to personal information.
[16:42] One of those is data brokers; they obviously have a huge amount of information about individuals. Credit is another. But also real estate. Here in Australia, and I'm sure it's mirrored there in the US and elsewhere, real estate agents collect a huge amount of data on individuals.
[16:57] In the context of the housing crisis, where people are having to compete for very few rental tenancies at high cost, they're having to submit just the most extraordinary amounts of personal information.
[17:09] You know, pay slips, bank account statements, identity documents. And the kind of governance and use of that data is really of concern for us as well as its security too.
[17:19] So that's the kind of flavor of some of the things we're thinking about at the moment.
[17:23] Debbie Reynolds: Yeah, that's a big list.
[17:25] I agree. The more emerging technologies we bring into play, the more they bring up new risks and new use cases that we hadn't really thought about or contemplated.
[17:37] So it is a very busy, busy time for you, I'm sure. I want to talk a little bit about the Online Safety Amendment, the social media minimum age bill. I'll call it the bell that was rung that was heard around the world.
[17:51] It's very important, and I think 50 years from now people will talk about why this was important. But I want you to tell me a bit about, for you, why this particular law or bill was important and what's happened since then.
[18:07] Carly Kind: So the social media age assurance bill essentially says that children under the age of 16 must be banned from social media platforms. It was legislated last year by the government here in Australia.
[18:21] And it responds obviously and quite clearly to, you know, real concerns that parents, teachers and kids have about how social media is shaping children's lives in a negative way.
[18:34] And I think it's come at a time when this conversation is clearly in the public domain. We had Jonathan Haidt's book The Anxious Generation come out last year, and that really was the backdrop to some of the conversation here in Australia and I'm sure there in the U.S.
[18:47] I know there have been some efforts in the U.S. to advance a similar approach. I don't know if you've seen the Netflix series Adolescence that everybody is also talking about here now as well.
[18:55] And I think it's quite interesting how we've seen this coalescence of social as well as legal factors. When the bill was adopted last year, I expressed some reservations about the approach that the bill takes, namely, I suppose, from two aspects.
[19:11] One is that it will require and incentivize the collection of a lot of personal information, not only on children, but of course on everybody who uses social media, because platforms will have to age-assure all users.
[19:25] And I have concerns about the risk that that creates, not only directly in the collection of personal information, but in how it will inure people to having to assure their identity online all the time.
[19:38] And that raises risks around scams, for example. Will people really be able to know when a request for identity assurance is a legitimate one versus an illegitimate one? So I think there are a lot of privacy risks with the approach.
[19:52] But my broader concern is that for me, the premise that social media is inherently such an unsafe place that we must ban children from it really deserves some scrutiny. I think these technologies are what the companies have made them, but they could make them something else.
[20:10] And we can use legislation, regulation and other tools to reshape the platforms.
[20:16] And I think we can see, you know, very demonstrably, not only that efforts to make platforms safer yield dividends, but the inverse of that. When, for example, Elon Musk took ownership of X, he removed the trust and safety team from X.
[20:34] And over the course of a year or two, you just saw the degradation of the platform in terms of not only the quality, but really the safety, you know, more outrageous content.
[20:43] As a user, I experienced that: more nude imagery, other things as well. So, you know, we know that trust and safety efforts do work.
[20:53] And my concern is that, having ostensibly removed children from the platforms, and I'm not sure that actually the ban will be 100% effective, arguably, almost certainly it won't be,
[21:06] there will no longer be any investment in making these platforms safe places, either for children or for everybody.
[21:12] So I think there are some problematic premises there. It also comes at a time when we've just been empowered as an office to develop a children's online privacy code that mirrors those developed in the UK as well as California and other places.
[21:28] And we've yet to do that work here in Australia to really articulate what children's privacy rights should look like online.
[21:35] So I'm worried that we've preempted that work by now setting the ban in place. Nevertheless, the ban is here. It comes into force at the end of this year. We are one of the regulators that will be overseeing its application.
[21:49] And I absolutely am open to being wrong about the positions I articulated last year. And it may be that I am. Certainly one of the developments that has occurred since then has been a real shift in how the tech companies speak about adherence to norms and rights issues.
[22:05] And that's really come against the backdrop of the changing U.S. administration.
[22:10] And so it may be that they are no longer able to be engaged in a conversation about how to improve trust and safety on their platform. And perhaps the ban, at the end of the day, is the only solution in a world in which those platforms are not interested in coming to the table about how to improve the experience for children and for other vulnerable people.
[22:35] So I suppose we will see, when the law comes into effect at the end of this year, how effective it is, how necessary it is, and what the potential unintended consequences may be.
[22:45] Debbie Reynolds: I agree, and I love that you have a very nuanced view of this. So that tells me, which I always knew, that you have a very keen eye to all the different angles that happen here.
[22:58] And I'm with you. So I agree that,
[23:02] you know, when we do a lot of things in age assurance, what we're really doing is sort of asking for more information on people. Right. And so we were saying companies maybe aren't doing the best job, or could be doing much better, at protecting data, but then giving them more data that they then have to protect kind of creates a bit of a snowball.
[23:22] I talk to parents, I talk to companies. I think a lot of companies don't understand what this means for them. So it's going to change a lot of the Internet in terms of age assurance and other things that they have to do going forward.
[23:35] Carly Kind: I think that's right. And, I don't know, Debbie, if you'll indulge me for a minute, one of the other kind of top-level concerns I have is, to your point just then, that the nature of the Internet is changing very quickly right now. On the one hand, this space that you used to be able
[23:50] to be anonymous on in some form or another, or at least pseudonymous, is going to become a place where we are identified almost all of the time. The other big change that's happening, of course, is the introduction of generative AI in search.
[24:04] And I think that's also really changing how we experience the Internet. It's a much more intermediated place now. It's a much more identifiable place. I think it's turning into something fundamentally different, and perhaps as it should.
[24:18] It's a technology, it changes, it responds to how humans use it, I suppose. But I'm not sure if we've completely grappled with those changes that are about to happen.
[24:27] Debbie Reynolds: Yeah, I'm sure we are both going to be watching really closely to see what happens in that space.
[24:35] I always like to ask, in a perfect world, if it were a world according to you, Carly, and we did everything you said, what would be your wish for privacy or data protection anywhere in the world, whether that be regulation,
[24:50] technology or human behavior?
[24:53] Carly Kind: Good question. I suppose I would say two things. On human behavior, I think we would all benefit from reconnecting with why privacy is important as an ideal, as a human right, and stepping back from our preoccupation with online privacy, which is important and a component of that, but really kind of reconnecting with the original notion and trying to build some public acceptance that it's a value that we
[25:23] agree on. And that's a very complicated thing to do. Obviously it will change across countries and cultures, but from the research I've done in this space, there is some notion of privacy at the heart of most cultures, including Eastern cultures, contrary to what has been argued in other places.
[25:40] So I think it would be good for us to reconnect with that and for us to understand why it's an important right to begin with.
[25:47] But on the technological front, if I could kind of click my fingers and change one thing overnight, it would be to rebalance power in this kind of digital ecosystem
[26:02] by removing the dominance of many of the digital platforms. Because I think that their excessive power,
[26:10] fiscal power, data power, infrastructural power, is shaping the development of our societies in so many ways. And I think that does not lend itself to privacy-preserving or privacy-protecting innovation.
[26:27] I think it lends itself to the entrenchment of those actors. And I think we do need a rebalancing. We need a more competitive marketplace. We need public infrastructure to rival private infrastructure.
[26:41] And we need options as consumers and as Internet users, genuine options. I think one of the things we see too often is people don't have any agency. They care about privacy,
[26:53] but they don't feel that they can do anything. So we have this notion of the privacy paradox, where people say they care about privacy, but then they go on Facebook.
[27:01] And I always say, I think you can't interpret their actions as some kind of endorsement of the ecosystem that they're working within. They feel that they have no choice, either because their community's there, their friends are there, their services are there.
[27:14] They have no real power to change the privacy settings beyond a few tweaks at the edges.
[27:20] What I think is really interesting is when you give people real choices. And, you know, we don't have a lot of good data on this, but one example I like to point to is when Apple turned on the do-not-track feature: something like 94% of people chose to stop tracking.
[27:34] So when people are given real choices around privacy, they choose privacy, but they're not often given real choices. So it looks as if they don't care, but I think that that's not correct at all.
[27:43] So it's really about how you rebuild that agency, and I think a big part of that has to be choice and competition in the digital platform space.
[27:51] Debbie Reynolds: Very good. Well, thank you so much. This has been tremendous. I really appreciate you coming on the show today. We really applaud your work, and we're all watching you really closely, all over the world.
[28:02] So thank you.
[28:03] Carly Kind: Thanks so much for having me, Debbie. I really appreciate it.
[28:06] Debbie Reynolds: You're welcome. We'll talk soon.