E256 - Jennifer Wondracek, Director of the Law Library and Professor of Legal Research and Writing, Capital University Law School, ABA Women of Legal Tech Honoree
[00:00] Debbie Reynolds: The personal views expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.
[00:14] Hello, my name is Debbie Reynolds. They call me the Data Diva. This is the Data Diva Talks Privacy podcast where we discuss data privacy issues with industry leaders around the world with information that businesses need to know.
[00:27] Now, I have a very, very special guest on the show,
[00:30] Jennifer Wondracek. She is the Director of the Law Library and Professor of Legal Research and Writing at Capital University Law School in Ohio. Welcome.
[00:41] Jennifer Wondracek: Thank you, Debbie.
[00:43] Debbie Reynolds: Well, it's a pleasure to have you on the show. I noticed you from your posting on LinkedIn about a lot of legal tech things. Also, you write about AI as well,
[00:56] and your writing is very crisp, very clear, very to the point, as I would expect from someone who's a library scientist.
[01:05] We have a couple things in common, actually. So one is that I've been an adjunct professor at law school. I was an adjunct professor at Cleveland-Marshall, and I've been an adjunct at Georgetown University Law School as well.
[01:19] And you and I both have been American Bar Association Women of Legal Tech honorees.
[01:27] Jennifer Wondracek: Very nice. Well, it's fabulous to meet another Women of Legal Tech honoree already.
[01:32] Debbie Reynolds: Yeah, yeah. And actually my tech career started in library science, so I was a librarian back in the day. You're probably too young to remember when libraries had card catalogs, but I was turning those cards.
[01:46] Jennifer Wondracek: I have actually helped convert two libraries from card catalog to digital. So I can remember.
[01:53] Debbie Reynolds: Well, that was my.
[01:54] Jennifer Wondracek: I look younger than I am.
[01:56] Debbie Reynolds: Well, that was my start in technology, and so I've kind of taken it in a lot of different directions. But I would love for you to tell your background and your journey.
[02:04] I'm just fascinated. So I love data people and I love librarians because you guys are just super smart and very detailed. So tell me, tell me, what's your journey?
[02:14] Jennifer Wondracek: Oh, well, I actually grew up in Northeast Ohio, in a small town right up by Lake Erie.
[02:21] And we got snowed in a lot,
[02:23] which gave me a lot of time to spend in libraries with books and with tech.
[02:27] I actually got my first computer when I was five,
[02:31] so I started with the Commodore 64.
[02:34] And I knew all along that I wanted to be a lawyer or work in the law. My mother kept the paper I wrote at six years old saying, I want to be a lawyer, and then my sixth-grade journal saying, I want to be a lawyer, all the way through.
[02:49] So law school was always in the plan.
[02:53] So I actually graduated from UNC Chapel Hill in 2003 with my law degree and I worked for North Carolina Prisoner Legal Services.
[03:03] So my clients were the 36,000 inmates in North Carolina.
[03:07] I got married to a video game designer, however, and his job moved every two years.
[03:14] So we went from North Carolina to California. We almost went to Arizona. We went to Florida.
[03:24] So at some point I'm like, I cannot take the bar in all 50 states.
[03:28] So I had a really great mentor in law school. Her name was Laura Gasaway and she was a copyright expert who testified before Congress.
[03:37] And so when I was thinking about what I wanted to be when I grew up, I decided I wanted to be Lolly. So I went to library school with the intention of becoming a law librarian, which I did.
[03:48] I started my law library career at Barry University's law school in Orlando in 2006,
[03:54] and I have been working my way through libraries ever since.
[04:00] Debbie Reynolds: That's a cool story.
[04:01] Wow, that's amazing. And you know, I remember back in the day when law libraries were just books, so it was like nothing electronic. And so now it's like a totally different ball game for sure.
[04:12] Jennifer Wondracek: Yes. I think I was one of the last generations to learn in the books.
[04:17] So I have actually Shepardized in print, and I would never wish that on my worst enemy.
[04:23] So Lexis and Westlaw with their citators are amazing,
[04:27] amazing inventions.
[04:29] Debbie Reynolds: They are. There's one thing that you wrote about recently.
[04:34] First of all, please follow Jennifer on LinkedIn. You have so many really cool things that you talk about,
[04:41] But one of the things that you talked about is all the rage in the legal profession.
[04:47] I have a corpus of friends in all different types of industries. So I have my legal friends, and the thing that's hot with my legal friends is people using AI to cite cases in lawsuits.
[05:02] But you posted something about this. Tell me a little bit about it.
[05:06] Jennifer Wondracek: So I recently wrote a blog post for ailawlibrarians.com about the very first known court case to use hallucinated cites.
[05:17] And it appears that the judge took a proposed order and didn't check it and used it.
[05:24] So the appellate court was actually angrier at the attorney than at the court for using the citations. And it's the nature of the beast when you're using LLMs, or large language models, to generate text: it's a probability machine.
[05:40] And so it will do everything in its power to make you happy, and it will give you cases that don't exist, which is what happened in this case, we believe. We've seen over 200 instances of filings turned in by lawyers and pro se litigants with cases that don't exist.
[06:02] So it's becoming rather an epidemic of late, and we've seen some pretty big names get hit with this.
[06:11] The MyPillow inventor is one of the most recent ones.
[06:16] We had Morgan & Morgan get hit with this in a case against Walmart. We've seen Latham & Watkins actually turn in hallucinated cites in a case that they were litigating for Anthropic, the maker of Claude.
[06:34] Debbie Reynolds: Oh, my gosh.
[06:35] Jennifer Wondracek: Yeah, it's crazy.
[06:38] Debbie Reynolds: Well, I have a lot to say on this topic, so I'm glad we're talking about it. First of all, this is how I feel.
[06:44] So as someone who does research all the time, you know, I read cases constantly. If you're in a certain area of law, let's say you're an antitrust lawyer,
[06:55] you probably should already know the basic cases that are out there. So if there are 30 new cases that you never heard of,
[07:03] that should be like a red flag to you, because you're looking for, you know, maybe you have some Sherman Act precedent, some other stuff, but you don't have 30 new things you never heard of in a case.
[07:14] But what are your thoughts?
[07:16] Jennifer Wondracek: I agree. Going back to the very first case, the Mata case that hit the New York Times front page,
[07:23] the gentleman said that he couldn't find the cases, one of which was an Eleventh Circuit opinion.
[07:29] And I'm sorry, I think every Eleventh Circuit opinion is at least mentioned somewhere online.
[07:34] So the fact that you can't find it should have been a red flag, even if you didn't have access through a research database. There's also Google Scholar and things like that to find cases.
[07:43] So I think there are a lot of red flags and we're seeing practitioners who are too busy to spot them.
[07:52] Debbie Reynolds: Like, to me, this is the argument for not firing your support staff,
[07:57] your research people, and actually hiring them back so they can check these mystery citations.
[08:04] Jennifer Wondracek: I agree. Run it through a brief analyzer, run it through Lexis and Westlaw, do a find and pull. There are all kinds of ways.
[08:13] Debbie Reynolds: I mean, copy and paste it into Google. There should be some references that show up or whatever.
[08:18] Jennifer Wondracek: Yes, it should be easy to spot these. And it's a little scary that it's not.
[08:25] We even actually have at least two new products that are designed specifically to spot AI-created citations.
[08:32] So hopefully people will get the message. But I don't know,
[08:37] there's a great quote that I'm going to paraphrase, because I don't have that good a memory, but it's out of a Second Circuit case called Park v. Kim that says something to the effect of: at the very least,
[08:49] Rule 11 requires that you have read the cases to verify that they exist and that they say what you claim they do.
[08:57] I'm like, go, Second Circuit. Your librarians have been telling you that for years.
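(For readers who want a head start on that kind of checking: below is a minimal sketch, not something discussed on the show, that pulls the citation strings out of a brief with eyecite, the Free Law Project's open-source citation extractor, so a human can find-and-pull each one in Lexis or Westlaw. Extraction is not verification; as Park v. Kim says, you still have to read the cases.)

```python
# A rough first-pass screen, not verification: it only lists what the
# brief claims to cite so a human can pull and read each case.
# pip install eyecite  (open-source citation extractor, Free Law Project)
from eyecite import get_citations

def list_citations(brief_text: str) -> list[str]:
    """Return each distinct citation string as it appears in the brief."""
    seen: set[str] = set()
    found: list[str] = []
    for cite in get_citations(brief_text):
        raw = cite.matched_text()  # the citation exactly as written
        if raw not in seen:
            seen.add(raw)
            found.append(raw)
    return found

if __name__ == "__main__":
    sample = (
        "See Mata v. Avianca, Inc., 678 F. Supp. 3d 443 (S.D.N.Y. 2023); "
        "Park v. Kim, 91 F.4th 610 (2d Cir. 2024)."
    )
    for c in list_citations(sample):
        print("pull and read:", c)
```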
[09:02] Debbie Reynolds: That's true. That's true. I was doing some research on a case, and it was like, oh, you should cite this for that and this for that. And as I was looking at the case, I'm like, this doesn't match up with what I'm trying to say.
[09:14] And when I typed that in, it was like, oh, yeah, that's right. I'm like, what? Oh my God. So you definitely do have to research.
[09:22] Jennifer Wondracek: Yes, well. Back in the early 2010s,
[09:26] I met someone whose hobby was tracking false citations through cases to try to figure out who actually miscited originally and who was just copying it forward.
[09:42] So we see that a lot. And you know,
[09:44] that was, what, about 12 years before ChatGPT came out.
[09:52] So we've always had some miscitations, but the numbers that we're seeing are just wild.
[09:58] Debbie Reynolds: That's true. Oh my goodness. Oh, my goodness. Yeah, that's true. Because I think if you've seen a case cited in another case, you may just want to copy and paste that without looking at it.
[10:09] And they may be wrong too.
[10:11] Yeah.
[10:11] Jennifer Wondracek: Yes.
[10:12] Debbie Reynolds: Oh my goodness.
[10:13] Well, I think to me, in my technology journey, there have been two big, big moments in the data world,
[10:25] and I want your thoughts here.
[10:27] You may add more, but to me the first was when we moved from paper to digital.
[10:34] Right. And now we're moving from just regular Internet-connected data everywhere to AI. Those are, to me, the two biggest leaps that have happened in data. But I want your thoughts.
[10:47] Jennifer Wondracek: So I agree that the transition from print to digital was just life changing.
[10:54] We used to have to maintain these massive,
[10:57] massive research libraries and we've definitely shrunk.
[11:01] In fact,
[11:02] I just cleared out some more stacks to make a collaborative space for my students.
[11:08] Generative AI is definitely changing things. I think people forget though, that we've had traditional AI built in for a very long time.
[11:16] Talking to our Lexis rep, Lexis integrated AI into their search function back in 1997.
[11:22] So AI has been around for a while, but we are definitely seeing a huge switch right now.
[11:29] And I don't know.
[11:31] With all of the hallucination problems, I'm not sure how it's going to go, because we're also seeing people who fired 100 employees to replace them with AI all of a sudden rehiring their employees.
[11:46] So part of me is like, yeah, we're going to change. And part of me is like, wait, wait, wait for it.
[11:52] Debbie Reynolds: Right, right.
[11:55] I think the thing that has changed about AI, even though AI has been around for a long time, is that a lot of those systems were very narrow and they were purpose built,
[12:07] so they didn't have the capability to make these types of mistakes. Now you have things like agentic AI, so you have things going out, taking your credit card, buying stuff for you, which is bonkers.
[12:19] But now that there are just so many more things that the tool can do, I think people
[12:25] feel like, oh, this is the easy button I always wanted. So let me press the button and have it give me something and I can run with it.
[12:32] Maybe that works in high school with your term paper, but it certainly doesn't work in courts and like legal documents and papers and stuff.
[12:40] Jennifer Wondracek: I do know that a lot of our faculty though were shocked when all of a sudden Grammarly went from traditional AI of, you know, just correcting your grammar to hey, let me help you rewrite this.
[12:52] They're like, what?
[12:54] I'm like, it's okay, you're gonna make it through.
[12:59] Debbie Reynolds: Right?
[13:00] Yeah. I've used Grammarly for years, and I'm not fond of those features where they try to rewrite, because it's like, I don't sound like that. I don't talk like that.
[13:09] You just want a bit more than the regular Microsoft Word spell check, but you don't want this thing that's trying to create a bible or something from the stuff you're trying to write.
[13:21] Jennifer Wondracek: Yes. I have to admit I use AI regularly, so surprise, surprise.
[13:27] But when I'm doing a lot of checking and things like that, I like my spell check, my traditional Microsoft spell check. But if I want help with phrasing, I will usually be very specific, like, I need you to help me with this sentence.
[13:43] I do not need it to revise.
[13:45] And honestly, I think attorneys letting it revise is part of the problem.
[13:52] Latham & Watkins were actually claiming that their citations were correct until they let Claude proofread, and then Claude changed their citations because it didn't like the names of the people they were citing as authors and the titles.
[14:11] So they say it wasn't actually fabricated,
[14:14] it was changed in that final edit. And I can actually believe that, because if you've ever let an AI system near statutory law,
[14:24] it will change it whenever it can; it absolutely hates statutory law.
[14:30] Even if you tell it not to, it will try to reword things without telling you.
[14:38] So that revision feature is actually a little bit dangerous for lawyers.
[14:45] Debbie Reynolds: Definitely, don't let it. Go back into Westlaw, go back into Lexis, search the Internet, just to double-check to make sure these cases really exist.
[14:54] You're an educator.
[14:56] There's a big battle going on with education.
[14:58] I've talked to a lot of universities about this.
[15:02] So on one extreme end, you have someone,
[15:06] not really a Luddite, but someone in education who's afraid of AI and wants to lock the door and not let anybody in. And then you have this other faction on the other side of the spectrum where they're like, yeah, this is an amazing tool.
[15:21] We all need to learn it. And so I want your thoughts on where you fall on this. I have some thoughts as well.
[15:30] Jennifer Wondracek: So I have met people from both ends.
[15:33] I'm actually a big believer that we need to make sure our students are technologically competent,
[15:38] whether that's with AI or Office.
[15:42] That's one of the things I have been a proponent of for a very long time.
[15:46] In fact, I actually helped write a technology competency requirement for UNT Dallas College of Law when I worked there.
[15:53] But for AI, I don't think it's going anywhere.
[15:57] So I think it's better to teach our students how to ethically and responsibly use AI than it is to just let them loose and let them try to make up their own minds without
[16:15] those caveats of, hey, don't forget, you have to protect your client data, and those kinds of reminders along the way.
[16:23] We actually start our students training with AI in the fall semester.
[16:29] But I also tell them this doesn't guarantee you can use it for anything and everything.
[16:34] Each professor gets to decide, just like each judge gets to decide.
[16:40] And we have some faculty who don't want it used in their classes, some who are allowing its use for research, like the Lexis and Westlaw tools,
[16:51] some who are opening up even further.
[16:54] So it's a mix.
[16:56] And I think one way or another, though, we're doing them a disservice if we just completely ignore AI.
[17:02] I think we need to prepare them a little bit better than just kick them out the door and let them figure it out.
[17:10] Debbie Reynolds: Right. Because when they leave law school, they're going to be hit with computers and programs that are using AI, so if you don't incorporate that into their learning,
[17:22] then they're going to be behind for sure.
[17:25] Jennifer Wondracek: We also have this myth that they're all digital natives, so they know how to do everything already. I'm like, no, no, they're digital consumers. Very different thing.
[17:35] Debbie Reynolds: Totally.
[17:36] Jennifer Wondracek: But I can't tell you how many times I have students come back who are like, Professor Wondracek, they decided I'm going on the technology committee, since I just graduated and I'm supposed to know all this.
[17:48] So I think we need to give them some preparation for being out in the real world, both for their own work and because you never know what they're going to be asked to do.
[18:00] Debbie Reynolds: Yeah, that's true. That's true. I remember back in the olden days. I have to say it that way because it feels like 100 years ago,
[18:07] before computers were in law firms.
[18:09] And so when they first started going into firms,
[18:13] the only people that would get them were, like, secretaries or
[18:17] a document processor. They don't even have document processors anymore, I don't think. But it was people who did documents. And a lot of the lawyers used to turn their noses up
[18:26] at having a computer.
[18:27] It was a badge of honor that they didn't have a computer in their office, and they did stuff by paper. And then once everyone else started doing stuff digitally, it was just impossible to do stuff on paper anymore.
[18:39] So those document areas had all these people who were typing documents, and it became a situation where everybody had to learn how to type documents. Now everyone has a computer, everyone can type.
[18:53] Right. So you may have a secretary or an assistant that may help you draft things,
[18:58] but everyone had to gain that type of computer literacy.
[19:03] And I think that's kind of where we are with AI now.
[19:06] Jennifer Wondracek: I will say, though, I am amused that we have gone from dictation machines back to speech to text. So we've kind of come full circle on the dictation.
[19:22] Debbie Reynolds: Oh, that's true. That's true. I didn't think about that.
[19:26] Jennifer Wondracek: The difference is the secretary typing it versus the AI typing it.
[19:33] Debbie Reynolds: Yeah, exactly.
[19:34] Well, I want your thoughts about privacy and data. So,
[19:40] you know, as I started my technology career,
[19:43] I tell people I was doing digital transformation back when it was just transformation, so there was no digital in it. And so a lot of the changes that have happened, and a lot of the things that we see in the world today around privacy, have been created because things are digital,
[20:04] because they are more accessible. But I want your thoughts.
[20:08] Jennifer Wondracek: Honestly, with privacy, I'm a little scared at how comfortable people are giving it up to companies these days. I'm very grateful that I grew up before social media, when my mother was like, go outside and play, versus the get-off-the-computer-and-look-at-us kind of deal.
[20:28] So I'm a little worried about what all is being collected about us.
[20:34] And I think that it's very important for lawyers to understand their obligations when it comes to privacy and confidentiality with their client data especially.
[20:47] I heard someone a couple of weeks ago say that they thought the bar associations were being too restrictive, and they compared giving an AI information to mailing a letter for confidentiality.
[21:02] I'm like, oh, no, no. Nope, let's back up.
[21:06] Debbie Reynolds: Right?
[21:07] Jennifer Wondracek: We really need to push Rule 1.1, Comment 8: you need to understand how the technology works before you use it, because there are risks that people with those views aren't understanding.
[21:22] Debbie Reynolds: Well, yeah. And you know,
[21:24] I remember when PACER put a button on their site to say, hey, did you redact the personal information or sensitive information?
[21:34] Because, I mean, I've seen people actually file things with people's Social Security numbers, all types of sensitive information that should not be in a court filing. Right. So, yeah.
[21:45] Jennifer Wondracek: So I teach law practice technology, in addition to research and AI and the law. And in one of the first classes, I'm like, let's talk about why we're here.
[21:55] And I pull up what looks like a redacted Paul Manafort pleading.
[22:00] And I don't know if you remember that.
[22:02] Debbie Reynolds: That's one of my favorites. Yeah.
[22:04] Jennifer Wondracek: An enterprising reporter did a copy and paste and found out those weren't redaction boxes. That was black highlighting.
[22:12] And so I show that to my students and they're like, oh my gosh, what just happened?
[22:18] So classic.
[22:20] Debbie Reynolds: That was classic.
[22:21] Jennifer Wondracek: As part of the class, I then make them redact the document properly.
[22:30] Debbie Reynolds: Yeah, I remember that one. We all had a good laugh about that one, because have we not talked about this for like 20 years? And then they went and did this. Oh, my goodness, yes.
[22:40] Jennifer Wondracek: Well, and I mean, we saw the DOJ do it. We saw Ghislaine Maxwell's attorneys do it. And I'm just like,
[22:47] what is wrong with people?
[22:50] So I almost miss the days of Sharpies at that point.
[22:54] Debbie Reynolds: I think the thing that happens is that when people move from paper to digital,
[22:59] they don't understand that digital has so many more dimensions. Right. Because a digital document has a lot of layers, even things that you can't see, and people don't realize that. They think, okay, when they did the highlight, this is fine, I'll print it out, I'll email it.
[23:17] Actually, they would have been better off if they had printed it out and just scanned it back in, as opposed to highlighting it and sending it with the text still searchable underneath.
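(Reader's aside: the difference between black highlighting and true redaction is whether the underlying text is actually deleted from the file. Here is a minimal sketch of true redaction, assuming the PyMuPDF library; the file name and the sample Social Security number are placeholders. Always verify the result by trying to copy text back out of the saved PDF.)

```python
# pip install pymupdf
import fitz  # PyMuPDF

SENSITIVE = "123-45-6789"  # placeholder string to redact

doc = fitz.open("pleading.pdf")  # placeholder file name
for page in doc:
    # Locate every occurrence of the sensitive text on the page.
    for rect in page.search_for(SENSITIVE):
        # Queue a redaction: a solid black box over the match.
        page.add_redact_annot(rect, fill=(0, 0, 0))
    # This step actually deletes the text under the boxes:
    # the part that black highlighting skips.
    page.apply_redactions()

doc.set_metadata({})  # scrub document metadata while we're at it
doc.save("pleading_redacted.pdf", garbage=4, deflate=True)
```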
[23:31] Jennifer Wondracek: Well, and we're seeing an odd resurgence of that with AI. Have you seen the prompt injection stories about the academic papers right now? No? So there's a group of scientific papers that are actually being pulled off of arXiv now because the authors put instructions in white font in their papers,
[23:53] things like "only positive reviews" and "ignore all previous instructions," because they think that for these peer-reviewed journals, the reviewers are using AI when they're not supposed to. So they're trying to trip them up.
[24:05] And I'm like, oh, hidden metadata.
[24:08] So I kept thinking back, I'm like, you know, if you ran that through Acrobat when you scrub the metadata, that would have taken those prompt injections out too.
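(Reader's aside: a rough sketch of how one might screen a PDF for that white-font trick, again assuming PyMuPDF; it flags text spans whose fill color is pure white. A real screening tool would also compare colors against the page background and catch near-white shades; this only shows the idea, and the file name is a placeholder.)

```python
# pip install pymupdf
import fitz  # PyMuPDF

WHITE = 0xFFFFFF  # the sRGB integer PyMuPDF reports for pure white text

doc = fitz.open("paper.pdf")  # placeholder file name
for page in doc:
    # The "dict" layout exposes every text span with its font and fill color.
    for block in page.get_text("dict")["blocks"]:
        for line in block.get("lines", []):  # image blocks have no lines
            for span in line["spans"]:
                if span["color"] == WHITE and span["text"].strip():
                    print(f"page {page.number + 1}: possible hidden text:",
                          repr(span["text"]))
```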
[24:18] Debbie Reynolds: That's pretty clever. I hadn't thought about that.
[24:21] Jennifer Wondracek: The things you think of when you're like, oh, I do redaction. Hey, yeah, that would never have made it past redaction and metadata scrubbing.
[24:31] Debbie Reynolds: What is happening in the world today around data or technology that's concerning you the most?
[24:37] Jennifer Wondracek: So if you had asked me about three weeks ago, I would have said OpenAI's copyright litigation suit, which most people are like,
[24:46] copyright litigation? Why is that a problem with privacy?
[24:51] And a lot of attorneys, even to this day, don't realize that OpenAI was not cooperating well with others.
[25:01] So the judge put in an order to preserve every single conversation that OpenAI had. Didn't matter whether it was ChatGPT, Enterprise,
[25:12] or an API call.
[25:13] All of them had to be preserved.
[25:16] And those of us who actually were watching that are like, wait, wait, wait.
[25:21] Because Lexis uses OpenAI, amongst other systems, and so do Westlaw and Fastcase. They all use OpenAI technology, and they had set up these zero-retention policies. But the judge apparently either didn't care or didn't understand; it was just, all of them.
[25:40] So we were watching as we had intervening companies come in and file pleadings and say, judge, please do not do this.
[25:49] We had OpenAI arguing back and we finally heard from the judge that they were cutting off certain conversations.
[25:57] At first I was a little wary because they said enterprise.
[26:01] Well, enterprise doesn't mean companies. Enterprise, in tech language, means
[26:06] you have implemented it across an enterprise. So it's like the ChatGPT Edu licenses,
[26:13] or if you have a license directly with OpenAI to bring ChatGPT into your company.
[26:20] It didn't cover the API calls,
[26:23] which is still how Lexis and Westlaw and things like that connect.
[26:26] So it was worrisome for a while. But the judge clarified that they were not interested in the zero retention contract.
[26:35] So if you had contractual obligations that required you to delete it, you could delete it.
[26:40] That still doesn't cover the attorneys who have Pro accounts, or Team accounts, and think that's enough.
[26:48] There's still a lot to worry about there.
[26:52] And I always tell my students, I'm like, I'm not trying to turn you into an IT person or a tech guru, but you need to know the basics.
[27:01] And one of the things that I try to make them aware of is you have to read the terms of service to find out what's happening with your actual data.
[27:13] So this is one of those cases where a lot of people think, well, I gave them money so it's protected.
[27:18] Debbie Reynolds: And that's not the case, or I have a password and it's protected.
[27:22] Jennifer Wondracek: Yes,
[27:23] well, so like Claude actually advertises itself as the ethical AI system because it doesn't put your data into the training data.
[27:32] But a couple months ago now, they released their Economic Index, and with that they released a paper where they had taken four million Free and Pro account prompts and analyzed them.
[27:51] And I'm like, oh,
[27:53] how many attorneys thought, since it was the ethical AI,
[27:57] they could put their confidential data in that prompt? And all of a sudden it's gone.
[28:04] Even if they anonymized it from you as the prompter,
[28:08] did you anonymize your client data when you put it in there?
[28:12] So people are just making assumptions, and that's a little concerning,
[28:17] but we are seeing some things go in the right direction.
[28:20] On LinkedIn, I recently posted a discovery order I spotted where there's actually a generative AI clause in it that says you cannot put any of the discovery data into a general AI system,
[28:39] which appears to mean ChatGPT, the commercial OpenAI models that don't have those data privacy protections, because it's a third-party disclosure.
[28:49] I'm like, thank you court, for adopting that language and telling people that, because we've been telling people that and they don't believe us.
[28:59] So it's going to be interesting to see how that evolves.
[29:03] Debbie Reynolds: Yeah, I agree with you. And I wrote an article and did a video about the OpenAI thing, because I think a lot of people were watching it from the sidelines thinking, oh, this is just a copyright case.
[29:13] I'm like, no, this is a privacy issue.
[29:15] So when the judge came out with that order, well, two things. One is like,
[29:21] wow, them having to keep everything, that's major.
[29:24] First of all, it's a major expense, but it's kind of hard for OpenAI to cry
[29:30] about being too poor to keep all this stuff. Right. Because of all the money and attention that they have.
[29:35] But I'm exactly with you on that. So there's just a lot of gray area there. And the reason why I wrote that article is because I do a lot of keynote speaking with companies, and I've been telling them for years,
[29:48] I'm like, don't put your confidential information in there. Don't put your private information in there, even though you think it would be easy and fast and it's an extra step that you have to take to get that information out.
[29:59] Because really, once your data is in these enterprise systems, you truly don't have control over what happens. Right. And so whether it be intentional or not, like, that data could still be breached.
[30:10] So you definitely don't want to lose control of that information.
[30:15] Jennifer Wondracek: I will say some of us had a hypothetical conversation after we started watching this, because, as you know, lawyers are held to reasonable efforts to protect client data.
[30:29] And I was thinking, with the Lexis and Westlaw protections, it sounds reasonable to trust your data to them.
[30:38] They're sending it out to OpenAI, but it's a zero-retention policy,
[30:42] so you've got the contractual protections in place.
[30:46] But once you figure out what they're doing with the eDiscovery materials, does that mean it's no longer reasonable to trust your client data?
[30:55] Or at least until the judge came out and said zero retention is still zero retention.
[31:01] Debbie Reynolds: Yeah, I hadn't thought about that.
[31:05] Jennifer Wondracek: Sorry, this is the weird thing that we get into in Discord conversations: is it reasonable efforts once you know about the copyright litigation?
[31:15] Debbie Reynolds: I still say the people who heeded my advice and didn't put their confidential information in can breathe easier, a sigh of relief, because they don't have to worry about this data turning up somewhere else.
[31:27] Yep.
[31:29] Jennifer Wondracek: No, I wholeheartedly agree.
[31:31] Particularly if it's a consumer AI, I am not putting anything confidential or even proprietary in there. But I could see trying to rely on Lexis and Westlaw's AI restrictions, because they actually have pretty good terms of service that really mimic the attorney's obligations.
[31:53] And that was not in their hands at one point.
[31:58] Now my hope is they switch to a different API.
[32:02] I did try to question them on that, and I got a "we'll get back to you."
[32:10] So nobody was putting that in writing,
[32:12] but I do know that all of the big ones are multi-model, so they're not just using OpenAI; they're using OpenAI,
[32:21] Llama,
[32:23] Anthropic, so they can switch things around. So I would hope that at least while that's going on,
[32:29] they're switching some things around.
[32:32] Debbie Reynolds: Yes, definitely. That's always good advice.
[32:35] So, Jennifer, if it were the world according to you, and we did everything you said, what would be your wish for privacy or data anywhere in the world, whether that be regulation,
[32:45] human behavior, or technology?
[32:48] Jennifer Wondracek: My biggest wish is just common sense: reading things before you give data away.
[32:57] I don't think it's realistic that we're going to have worldwide regulations,
[33:03] although the EU is trying.
[33:06] But I think that realistically we just need people to pay attention.
[33:11] I think most of our data woes and privacy woes would disappear if people paid attention, and, if they didn't understand something, educated themselves.
[33:22] Debbie Reynolds: I agree with that.
[33:24] I think people think of law as a shield, and I don't think of it that way. So a piece of paper is not gonna make you stop doing something or take action.
[33:34] So you have to really think through and then balance it. Like, what's the benefit versus the harm? Right. So is it a good benefit to you?
[33:43] Is it proportional to what you want to do? Like, I use ChatGPT sometimes to format documents:
[33:52] why can't I get my outline right, why won't this bullet disappear, and stuff like that. So just figure out what's the best use, balance it, and think through what you're trying to do.
[34:03] Think about not only the positive but also the negative impacts, depending on what you do.
[34:12] Jennifer Wondracek: I wholeheartedly agree with that.
[34:15] I think it would be a very different world if people stopped and thought before they did things.
[34:20] I try to teach my students to stop and think before they use AI. And not just what are you going to write in the prompt, but are you using the right one?
[34:30] Should you be using this? Do you need to look at what you're putting in and redact some stuff first?
[34:35] So just slowing down,
[34:38] I think, makes all the difference in the world, even if it's just that you take five minutes to stop,
[34:43] think things through. Are you using the right system? Are you asking the right questions?
[34:48] What am I doing with my data? Things like that, I think, would be very beneficial.
[34:52] Debbie Reynolds: I agree with that. As a last plug, I just wanted to throw in: when people were saying, what do you think about plagiarism when students use AI, it's like, I'm not worried about that, because the thing that I'm asking them for is their thought.
[35:05] I'm trying to ask them a question about how they think.
[35:09] And so if you use AI, it's not going to give you that type of an answer. Right. It'll be all very much the same,
[35:14] very surface.
[35:15] Jennifer Wondracek: So, yeah,
[35:18] I definitely point that out to my colleagues if they ask. And honestly,
[35:23] a lot of people want to rely on the AI detectors: let's see if it's AI-written. And I'm like, no,
[35:28] let's back that down. There's such bias in those detectors that I'd rather have them plagiarize than use the detector and accuse them of doing something that they didn't do.
[35:41] Debbie Reynolds: Oh, totally. Well, I had a plagiarism detector say I plagiarized something, but it was my own work.
[35:49] Jennifer Wondracek: Oh, yeah. My last article that I published with the ABA came back 45% AI written. And I'm like, yeah, no, I wrote that, but thanks.
[36:00] Debbie Reynolds: Right.
[36:02] Jennifer Wondracek: I have told people, I'm like, if you read the classics growing up, chances are you're going to be flagged as AI written.
[36:10] Debbie Reynolds: Oh, totally. Right. They're like, oh, wow, she's using great grammar and stuff. It's totally AI. Totally AI.
[36:16] Jennifer Wondracek: Yes. And they blame Oxford commas on AI. And I'm like, no, no, no. That's all attorneys, hopefully.
[36:22] Debbie Reynolds: Oh, my goodness.
[36:23] Well, thank you so much, Jennifer, for being on the show. People, please definitely follow Jennifer on LinkedIn. She always has some really great articles. You always post a lot of good content and documents, links to documents that you can follow if you want to go down the rabbit hole with all of us
[36:41] crazy data people.
[36:43] Jennifer Wondracek: Well, thank you for inviting me onto the show. I really appreciate it and I look forward to continuing the conversation with your audience.
[36:50] Debbie Reynolds: Yeah, me too. Me too. I really enjoyed this. Thank you so much again.
[36:55] Jennifer Wondracek: You're welcome.
[36:56] Debbie Reynolds: Bye.
[36:56] Jennifer Wondracek: Bye.
[36:56] Debbie Reynolds: Bye. Bye.