AI for legal departments: Managing eDiscovery and data retention risks with Microsoft Copilot
Tech Law Talks - A podcast by Reed Smith
Anthony Diana and Therese Craparo are joined by John Collins from Lighthouse to provide an overview of some of the challenges and strategies around data retention and eDiscovery with Microsoft's AI tool, Copilot. This episode explores Copilot's functionality within M365 applications and the complexities of preserving, collecting and producing Copilot data for legal purposes. The panelists cover practical tips on managing Copilot data, including considerations for a defensible legal hold process and the potential relevance of Copilot interactions in litigation.

Transcript:

Intro: Hello, and welcome to Tech Law Talks, a podcast brought to you by Reed Smith's Emerging Technologies Group. In each episode of this podcast, we will discuss cutting-edge issues on technology, data, and the law. We will provide practical observations on a wide variety of technology and data topics to give you quick and actionable tips to address the issues you are dealing with every day.

Anthony: Hello, this is Anthony Diana, a partner in the Emerging Technologies Group at Reed Smith, and welcome to the latest Tech Law Talks podcast. This episode is part of our ongoing podcast series with Lighthouse on Microsoft M365 Copilot and what legal departments should know about this generative AI tool in M365. Today, we'll be focused on data retention and eDiscovery issues and risks with Copilot. I am joined today by Therese Craparo of Reed Smith and John Collins of Lighthouse. Welcome, guys. So, John, before we start, let's get some background on Copilot. We've done a few podcasts already introducing everyone to Copilot, so if you could just give us a background on what Copilot is generally in M365.

John: Sure. So the Copilot we're talking about today is Copilot for Microsoft 365. It's the experience that's built into tools like Word, Excel, PowerPoint, Teams, and Teams meetings. Basically, Microsoft is running a proprietary version of ChatGPT and provides it to each of their subscribers that gets Copilot. Then, as the business people are using these different tools, they can use Copilot to help generate new content, summarize meetings, and create PowerPoints. And it's generating a lot of information, as we're going to be talking about.

Anthony: And I think one of the interesting things that we've emphasized in the other podcasts is that each M365 application is slightly different. Copilot for Word is different from Copilot for Exchange, and they act differently, and you really have to understand the differences, which we talked about generally. So let's talk generally about the issue, which is retention and storage. John, why don't you give us a primer on where the data is generally stored when you're doing a prompt and response and getting information from Copilot?

John: So the good news here is that when you're asking Copilot to do something, or chatting with Copilot in one of the areas where you can chat with it, the prompts and responses, that back and forth, are put into a hidden folder in the user's mailbox. The user doesn't see it in their Outlook, but the prompts and responses are there, and that's where Microsoft is storing them. There are also files that get referenced that are stored in OneDrive and SharePoint, which we may talk about further. But in terms of the back and forth, those are stored in the Exchange mailbox.
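[Editor's note: To make John's point about the hidden mailbox folder concrete, below is a minimal Python sketch of how an eDiscovery team might search for Copilot prompt-and-response items through the Microsoft Graph eDiscovery (Purview) API. The tenant credentials and case name are placeholders, and the ItemClass query used to target Copilot interactions is an assumption to be verified against current Microsoft documentation; this is an illustration, not the panelists' workflow.]

```python
# Hypothetical sketch (not from the podcast): searching for Copilot
# prompt/response items via the Microsoft Graph eDiscovery API.
# Requires an Entra ID app registration granted eDiscovery.ReadWrite.All.
import msal
import requests

TENANT_ID = "<tenant-id>"          # placeholder
CLIENT_ID = "<client-id>"          # placeholder
CLIENT_SECRET = "<client-secret>"  # placeholder

# App-only authentication against Microsoft Graph.
app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    client_credential=CLIENT_SECRET,
)
token = app.acquire_token_for_client(
    scopes=["https://graph.microsoft.com/.default"]
)
headers = {"Authorization": f"Bearer {token['access_token']}"}
GRAPH = "https://graph.microsoft.com/v1.0"

# 1. Create an eDiscovery case for the matter.
case = requests.post(
    f"{GRAPH}/security/cases/ediscoveryCases",
    headers=headers,
    json={"displayName": "Copilot data validation (demo)"},
).json()

# 2. Define a search for the Copilot back-and-forth stored in the
#    hidden mailbox folder. The ItemClass prefix below is an assumption;
#    confirm the exact value Microsoft uses for Copilot interactions.
search = requests.post(
    f"{GRAPH}/security/cases/ediscoveryCases/{case['id']}/searches",
    headers=headers,
    json={
        "displayName": "Copilot prompts and responses",
        "contentQuery": "ItemClass:IPM.SkypeTeams.Message.Copilot*",
    },
).json()
print("Search created:", search["id"])
```

From here, the search would typically be run and its results estimated or added to a review set in Purview. The broader point is the one John makes: because the prompts and responses live in the Exchange mailbox, they are reachable through the same compliance tooling used for email and Teams.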
Anthony: That's helpful. So, Therese, I know we've been working with some clients on trying to figure this out and doing testing and validation, and we've come across some exceptions. Do you want to talk about that process?

Therese: I think that's one of the most important things when we're talking about really any aspect of Copilot or, frankly, any new technology, right? It's constantly developing and changing, and so you need to be testing and validating to make sure you're understanding how it's working. So as you said, Anthony, you know