
Google spent the past year training an AI-powered healthcare program using personal information from one of the largest hospital systems in the U.S. Patients had no idea — until last week.

What happened: The tech giant gave the Ascension hospital network access to Project Nightingale, a system for managing healthcare information. In exchange, Ascension gave Google access to the medical records of up to 50 million patients, according to an exposé in the Wall Street Journal. The effort triggered an investigation by U.S. privacy regulators.

How it works: Google designed Project Nightingale as a machine learning tool for matching patient information with healthcare decisions. Once trained, it would suggest treatment options or additional tests and flag patients for special care based on their histories.

  • The system would perform administrative tasks, such as reassigning doctors based on changes in the patient’s condition or special needs. It would also enforce policies to prevent unlawful prescriptions and suggest ways to generate more income from patients.
  • Ascension gave Google personal information (including patient names and addresses along with the names of patients’ family members) as well as medical records such as lab results, diagnoses, prescriptions, and hospitalizations.
  • Ascension didn’t inform patients or doctors that it was sharing data with Google.
  • At least 150 Google employees had access to the data.

The controversy: The U.S. Health Insurance Portability and Accountability Act of 1996 (HIPAA) protects patient data from being used or shared for purposes unrelated to healthcare. The law allows providers to share data with business partners without telling patients, as long as the goal is better care. Google and Ascension said Project Nightingale is intended for that purpose. However, regulators at the Department of Health and Human Services are concerned that the companies aren’t properly protecting the data, and their investigation is ongoing.

We’re thinking: We need Google, Ascension, and other organizations to keep innovating in healthcare. But we also need rules that are crystal clear about allowable uses of sensitive health data. When HIPAA was passed, public information about how AI works was far less available and data sharing among companies was much less common. An update is long overdue.
