AI explained: AI and recent HHS activity with HIPAA considerations

Tech Law Talks - A podcast by Reed Smith

Reed Smith partners share insights about U.S. Department of Health and Human Services initiatives to stave off misuse of AI in the health care space. Wendell Bartnick and Vicki Tankle discuss a recent executive order that directs HHS to regulate AI's impact on health care data privacy and security and investigate whether AI is contributing to medical errors. They explain how HHS collaborates with non-federal authorities to expand AI-related protections, and how the agency is working to ensure that AI outputs are not discriminatory. Stay tuned as we explore the implications of these regulations and discuss the potential benefits and risks of AI in healthcare.

Transcript:

Intro: Hello, and welcome to Tech Law Talks, a podcast brought to you by Reed Smith's Emerging Technologies Group. In each episode of this podcast, we will discuss cutting-edge issues on technology, data, and the law. We will provide practical observations on a wide variety of technology and data topics to give you quick and actionable tips to address the issues you are dealing with every day.

Wendell: Welcome to our new series on AI. Over the coming months, we'll explore the key challenges and opportunities within the rapidly evolving AI landscape. Today, we will focus on AI in healthcare. My name is Wendell Bartnick. I'm a partner in Reed Smith's Houston office. I have a degree in computer science and focused on AI during my studies. Now, I'm a tech and data lawyer representing clients in healthcare, including providers, payers, life sciences, digital health, and tech clients. My practice is a natural fit given all the innovation in this industry. I'm joined by my partner, Vicki Tankle.

Vicki: Hi, everyone. I'm Vicki Tankle, and I'm a digital health and health privacy lawyer based in Reed Smith's Philadelphia office.
I've spent the last decade or so helping health industry clients, including healthcare providers, pharmaceutical and medical device manufacturers, health plans, and technology companies, navigate the synergies between healthcare and technology, and advising on the unique regulatory risks that are created when technology and innovation far outpace our legal and regulatory frameworks. We're oftentimes left managing risks in the gray, which as of today, July 30th, 2024, is where we are with AI and healthcare.

So when we think about the use of AI in healthcare today, there's a wide variety of AI tools that support the health industry. And among those tools, a broad spectrum of uses of health information, including protected health information, or PHI, regulated by HIPAA, both to improve existing AI tools and to develop new ones. If we think about the spectrum as measuring the value or importance of the PHI, the individual identifiers themselves, to the AI model, it may be easier to understand the far ends of the spectrum and the risks at each end. Regulators and the industry have generally categorized uses of PHI in AI into two buckets, low risk and high risk. But the middle is more difficult, and that's where there can be greater risk, because it's where we find the use or value of PHI in the AI model to be potentially debatable.

So at one end of the spectrum, for example, the lower-risk end, there are AI tools such as natural language processors, where individually identifiable health information is not central to the AI model. Instead, for this example, it's the handwritten notes of the healthcare professional that the AI model learns from. And with more data and more notes, the better the tool's recognition of the letters themselves, not the words the letters form, such as a patient's name, diagnosis, or lab results, and the better the tool operates.
Then at the other end of the spectrum, the higher-risk end, there are AI tools such as patient-facing next-best-action tools that are based on an individual patient's medical history, their reported symptoms, their providers, their prescribed medications, p
