Health Privacy vs. Generative AI

April 18, 2024

Privacy Plus+

Privacy, Technology and Perspective

This week, let's highlight a concerning trend we're seeing that pits health privacy against generative artificial intelligence (AI). We’re covering procurement, contracting, data security and privacy, and restrictions on data use, so read on.

Background

Organizations across industries are rushing to invest in generative AI and the efficiencies it can create. But many are unprepared to deal with the issues AI raises, including establishing AI governance and contracting appropriately with vendors who offer AI “solutions” (often integrated with, or even white-labeled and repackaged from, third-party platforms and tools).

The New Problem

Generative AI tools are being marketed prolifically, but our focus here is on how these tools are marketed for use in healthcare. In that setting, common generative AI applications include tools advertised as assisting doctors with note-taking and with responding to patient messages.

While generative AI tools may deliver efficiencies and reduce administrative burdens, they also risk compromising sensitive data, including electronic protected health information (ePHI) protected under the Health Insurance Portability and Accountability Act (HIPAA). ePHI is high-risk data because its protection is required under that law, and because any loss of its confidentiality, integrity, or availability could have significant adverse impacts on individuals, namely, patients.

Our Thoughts

Too often, we see a chasmic disconnect between a product’s marketing materials and its actual contract terms. Public fascination with generative AI seems to have prompted a fear of missing out (FOMO), which has propelled some organizations to contract for AI services outside of normal procurement processes. Those processes should include diligence and the drafting of contract terms that address issues like data privacy and security, including restrictions on vendors’ data use.

In the healthcare setting, this issue is amplified because the HIPAA Security Rule provides detailed technical, physical, and administrative requirements that “covered entities” and their “business associates” must follow to protect the security of ePHI. In this context, AI service providers are business associates because they perform functions that involve the use or disclosure of ePHI on behalf of HIPAA-covered entities.

Yet some of the new crop of AI services being marketed to doctors for “enhancing physician-patient communications” do not include business associate agreements, which HIPAA requires of business associates. Moreover, the terms of service underlying these AI services typically allow unfettered access to, and use of, any data input into the services – not just by the service provider, but also by third parties. This presents a profound problem. Certainly, we doubt that any patient would agree to this kind of disclosure of their ePHI.

Privacy-enhancing technologies, such as cryptographic techniques like homomorphic encryption, can protect the security and privacy of sensitive information when it is processed through AI-based services. If AI services are marketed with assurances of privacy and security, then those marketed security specifications should appear in the contract or terms of service, along with data-use restrictions (though most often, they don’t).
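To make the homomorphic-encryption point concrete, here is a toy sketch of a Paillier-style additively homomorphic scheme. The parameters are deliberately tiny and insecure, purely for illustration (real deployments use vetted cryptographic libraries and keys of 2048+ bits): multiplying two ciphertexts produces a ciphertext of the sum, so a service could, in principle, compute over encrypted patient values without ever decrypting them.

```python
import math
import random

# Toy Paillier parameters -- illustrative only, NOT secure.
P, Q = 17, 19
N = P * Q                               # public modulus
N2 = N * N
G = N + 1                               # standard generator choice
LAM = math.lcm(P - 1, Q - 1)            # private key component
MU = pow((pow(G, LAM, N2) - 1) // N, -1, N)  # private key component

def encrypt(m: int) -> int:
    """Encrypt plaintext m (0 <= m < N) under the public key (N, G)."""
    r = random.randrange(2, N)
    while math.gcd(r, N) != 1:          # r must be coprime to N
        r = random.randrange(2, N)
    return (pow(G, m, N2) * pow(r, N, N2)) % N2

def decrypt(c: int) -> int:
    """Decrypt ciphertext c with the private key (LAM, MU)."""
    return ((pow(c, LAM, N2) - 1) // N) * MU % N

# Additive homomorphism: multiplying ciphertexts adds the plaintexts,
# so the holder of only ciphertexts can compute an encrypted sum.
c1, c2 = encrypt(5), encrypt(7)
total = decrypt((c1 * c2) % N2)
print(total)  # 12
```

The point of the sketch is architectural, not cryptographic: with schemes like this, a vendor's service can perform useful computation while seeing only ciphertext, which is exactly the kind of technical control that marketing materials promise and contracts should actually require.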

Further, providers of AI services marketed to doctors should offer business associate agreements as required under HIPAA. If they do not, those covered by HIPAA cannot contract for the services without violating the law.

So: if your organization has not yet adopted guidelines for the responsible use of generative AI, or sorted through the procurement issues associated with those technologies, now is the time. Also, if you individually are using generative AI services, consider opting out of sharing your data for model training when possible. For instructions on how to opt out, you can listen to this recent podcast by Wired Security:

https://podcasts.apple.com/us/podcast/wired-security/id1211754233?i=1000652477599

--- 

Hosch & Morris, PLLC is a boutique law firm dedicated to data privacy and protection, cybersecurity, the Internet, and technology. Open the Future℠.
