GOOGLE DENIES USING PRIVATE HEALTH DATA FOR RESEARCH


GOOGLE IS ONCE AGAIN ACCUSED OF THROWING ETHICS OUT THE WINDOW

Google has partnered with Ascension, the second-largest health system in the US. The project, dubbed “Project Nightingale,” gives Google access to the health information of roughly 50 million American patients. The Wall Street Journal reports that Ascension did not tell doctors or patients that it was sharing their data, which includes sensitive information such as names, diagnoses, and lab results.

The story has caught the attention of the Office for Civil Rights in the Department of Health and Human Services, which will seek more information about this mass collection of individuals’ medical records to ensure that nothing illegal is taking place.

WHAT IS GOOGLE WORKING ON?

An overview of Project Nightingale.

Google is using this data to build a system that Ascension can use to predict the outcomes and risks of particular procedures and medications. That is why personal health information is being uploaded to a network accessible to both Ascension and Google staff.
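To make the idea concrete, here is a minimal sketch of what an outcome-risk predictor of this kind might look like, trained on entirely synthetic records. The feature names, label, and model choice are my own assumptions for illustration; Google has not published how its system actually works.

# Hypothetical sketch of a procedure-risk predictor on synthetic data.
# Features, labels, and model choice are illustrative assumptions only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 1_000

# Made-up patient features: age, a lab value, prior-procedure count.
X = np.column_stack([
    rng.integers(20, 90, n).astype(float),  # age in years
    rng.normal(5.5, 1.2, n),                # synthetic lab result
    rng.integers(0, 5, n).astype(float),    # prior procedures
])
# Synthetic label: 1 = complication occurred after the procedure.
y = (0.04 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 1, n) > 6.0).astype(int)

model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X, y)

# Estimated complication risk for one new (synthetic) patient.
risk = model.predict_proba([[72.0, 6.1, 2.0]])[0, 1]
print(f"Predicted complication risk: {risk:.0%}")

The point of the sketch is simply that a model like this is only as good, and only as private, as the records it is trained on, which is exactly why access to 50 million patients’ data matters.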

Google maintains that Project Nightingale complies with all federal regulations. The company also says it is not using the data to train its own systems, only the AI-powered system it is building for Ascension, and that it is not combining patient data for use across its other healthcare partners.

In a recent FAQ, Google wrote: “We are building tools that a single customer can use with their own patients’ data. The data is siloed, access controlled, and auditable. We do not combine data across partners, and we would not be allowed to under our agreements or the law.” A Google spokesperson said the company is happy to cooperate with the federal inquiry and believes its work with Ascension adheres to industry-wide regulations, including HIPAA, and comes with strict guidance on data privacy, security, and usage.
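The “siloed, access controlled, and auditable” language describes a common engineering pattern. Here is a minimal sketch of that pattern; the class, method, and field names are my own invention, not anything Google has published.

# Hypothetical sketch of per-partner data siloing with access control
# and an audit trail. All names here are illustrative assumptions.
from datetime import datetime, timezone

class SiloedStore:
    def __init__(self):
        self._silos = {}     # partner name -> {record_id: record}
        self.audit_log = []  # every read attempt is logged

    def put(self, partner, record_id, record):
        self._silos.setdefault(partner, {})[record_id] = record

    def get(self, user, user_partner, silo, record_id):
        allowed = user_partner == silo  # no cross-partner reads
        self.audit_log.append({
            "time": datetime.now(timezone.utc).isoformat(),
            "user": user,
            "silo": silo,
            "record": record_id,
            "allowed": allowed,
        })
        if not allowed:
            raise PermissionError(f"{user} may not read the {silo} silo")
        return self._silos[silo][record_id]

store = SiloedStore()
store.put("ascension", "pt-001", {"diagnosis": "…", "labs": "…"})
store.get("clinician_a", "ascension", "ascension", "pt-001")  # permitted, logged
# store.get("analyst_b", "otherco", "ascension", "pt-001")    # raises PermissionError

Note that in this pattern the silo boundary is enforced by policy in code, so the guarantee is only as strong as the organization’s discipline in maintaining it.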

THE IMPLICATIONS OF PROJECTS LIKE THIS

This highlights the serious privacy concerns that large tech companies have dismissed in the past. As we have established before on this blog, data is more valuable than gold in today’s world, and these companies will use whatever means necessary to get their hands on it.

Every time a large tech company starts a project like this using our personal data, it justifies the move the same way: it touts the potential benefits of the predictive system and claims those benefits justify the gathering of personal information. I am inclined to argue that they do not. In the current climate of AI development, personal privacy is more important than ever. Let’s look at what they believe they will accomplish with this new predictive system.

MODERN MEDICINE IS INHERENTLY FLAWED

The main idea is to create a system that can predict the outcomes and risks of medical procedures and medications. We have argued before on this blog that the major flaw in modern healthcare is that it seeks to “treat” conditions rather than cure them, for the purpose of making money. Until that flaw is addressed, any increase in our ability to predict the outcome of these procedures can do no more than reduce the number of tragedies caused by botched procedures. AI is not really needed to accomplish that, and even if it is used for that purpose, it does not justify the mass gathering of our data without consent.

Modern medications are plagued by a similar problem. Using AI to predict the effects of a medication on patients after that medication has already been approved by the FDA is putting the cart before the horse. Any predictive analysis of a medication’s effects should take place during human trials, not afterwards.

All of this suggests that the benefits of Project Nightingale do not justify the invasion of privacy, and that Google likely has other plans for the data it gathers, despite its claims to the contrary. What is most unfortunate is that the Health Insurance Portability and Accountability Act (HIPAA) generally allows hospitals to share data with business partners, without telling patients or giving them a chance to opt out, as long as the information is used to help the hospital provide healthcare. That is a very open-ended standard, and it allows large tech companies to do essentially whatever they want with our data.
