Big Tech is coming for your health data. Here's how you can protect your information.

A new national health data system launched by the Trump administration will dramatically expand companies' access to patients' medical records.

The system, announced by the White House on July 30, is a digital “ecosystem” in which patients can enroll to store their health data and make it accessible to different apps and providers. This would allow not only doctors but also apps such as Apple Health to collect information from prescriptions, test results and even fitness data and store it all in one place.

The system will be maintained and led by the Centers for Medicare and Medicaid Services (CMS). More than 60 companies, ranging from tech giants such as Google and Amazon to health care companies such as CVS Health and UnitedHealth Group, have already pledged to participate by developing infrastructure and apps for the initiative or by providing patient data. OpenAI, the artificial intelligence company behind ChatGPT, is also slated to contribute, although it is unclear exactly how each company will participate in the new system.

The administration claims this initiative will improve patients' access to health records and promote innovation.

“We are tearing down digital walls, returning power to patients and rebuilding a health system that serves people,” Secretary of Health and Human Services Robert F. Kennedy Jr. said in a July 30 statement.

But the administration's simultaneous cuts to federal health research funding will undermine those goals. The cuts limit scientists' ability to conduct large-scale clinical trials and develop new medical treatments, and reduce support for fundamental public health initiatives such as vaccines.

There is also another risk to this tech-forward approach to health: data privacy. As an advocate and privacy researcher who helps push for consumer-oriented privacy policy at the American Civil Liberties Union (ACLU), here are the risks I see in the Trump administration's new health data system, and some concrete ways to keep health information safe.

Big Tech wants consumer health data

Big Tech has a financial interest in obtaining consumer health data, because that information feeds the advertising that drives these companies' business models.

Insights into users' reproductive health, medical conditions or mental health are already inferred from their browsing habits, location data and app use, and are used to target people with ads for expensive medications or fertility treatments. Some advertisers even peddle unproven wellness products to chronically ill people.

In 2021, users of the period-tracking app Flo sued the company after a Wall Street Journal report revealed that Flo had embedded Facebook software that shared sensitive personal information, such as whether a user was on their period or had indicated an intention to become pregnant, with Facebook. The FTC also filed a complaint against Flo, which was settled in 2021.

In August 2025, a California jury found Meta, Facebook's parent company, liable for violating the state's privacy laws because it secretly and deliberately collected users' sensitive reproductive health information from Flo without their consent.

Likewise, Amazon's 2023 acquisition of the primary care chain One Medical has fueled new concerns about Big Tech's access to sensitive personal information.

Amazon One Medical pushes patients to sign an agreement giving Amazon access to their complete patient file. And it is not clear that this information always stays private. A 2024 wrongful death lawsuit alleged that nine Amazon/One Medical employees accessed the medical records of a deceased patient after his death attracted media attention.

A legal loophole

The Flo and Amazon examples highlight the dangers of trusting companies with sensitive health information, especially when their core business models depend on monetizing consumer data.

The White House's new health care data program would give technology companies unprecedented access to personal health data without offering meaningful new protections for consumers. And HIPAA, the federal Health Insurance Portability and Accountability Act that protects health information shared with doctors, hospitals and insurers, does not apply when the same information is collected by third-party apps, technology companies or other entities that are not considered health care providers.

This means that if you enter sensitive data into a fertility tracker or buy vitamins through Amazon, those records are not covered by HIPAA protections, even though they can reveal intimate details about your health.

This legal loophole gives technology companies the opportunity to monetize sensitive medical information without sufficient oversight, whether by using it to target advertising or to feed artificial intelligence systems that develop new health tools.

Companies could analyze health information to target ads, for example promoting weight-loss products to someone with obesity-related conditions, exploiting people's health struggles and reinforcing stigma for profit. Or companies could sell health-related insights to insurers, which could adjust premiums or coverage decisions based on predicted risks, disadvantaging the patients who need coverage most.

Widening existing disparities

Beyond the risks to individuals, creating a vast health data “ecosystem” threatens to deepen systemic harms. Data from the program could, for example, further normalize and perpetuate discrimination against communities that have historically been targets of surveillance, such as Black and brown women.

Here is how that could happen: if technology companies use data from these communities to train their artificial intelligence systems, those data sets would be skewed by disparities in access to care, quality of treatment and historical medical bias, because the same communities that have been disproportionately surveilled have also historically been mistreated by the medical system.

As a result, predictive algorithms trained on biased data may misdiagnose, deprioritize or overlook marginalized patients, effectively reinforcing long-standing inequities.

Users already see targeted ads based on their personal medical information. In 2022, a Wired report found that a writer who used third-party pregnancy-tracking apps quickly saw her data turned into targeted advertising and disinformation. Within minutes of signing up, she received emails from WebMD and Pottery Barn Kids, as well as ads for expensive, discretionary postpartum services such as cord blood banking.

Likewise, in April 2024 the Federal Trade Commission fined a telehealth company called Cerebral more than $7 million for using its patients' sensitive personal health information for third-party advertising purposes.

The Trump administration's initiative threatens to dramatically accelerate these harms, scaling up risks that will fall disproportionately on the most marginalized.

And consumer health information does not necessarily stay within companies; it can find its way into the courts.

Large technology companies have complied with subpoenas and government demands for user information in criminal and civil investigations, often with little regard for consumer rights or civil liberties. In states where abortion is illegal, law enforcement can use menstrual tracking data, pharmacy records or digital communications between patients and providers to build criminal cases.

For example, Facebook messages obtained from Meta were key to prosecuting a Nebraska mother for helping her daughter seek an abortion in 2022.

Even when data is anonymized, collecting so much information about so many people's reproductive health, and making it broadly available, opens the door to abuse.

More broadly, this data could be used as evidence to support restrictive policies, such as bans on medication abortion. It could also bolster challenges to contraceptive access by giving legislators and interest groups statistical cover to claim that these services are overused, unsafe or morally objectionable.

LGBTQ+ people could face devastating consequences if sensitive health details, such as their HIV status or history of receiving gender-affirming care, were exposed. That information has historically led to discrimination in employment, housing and even access to care.

The point is: in the hands of a hostile government, what starts as “neutral” medical data can quickly become a weapon.

6 steps to protect your information

Here are some practical steps to protect your health data:

  • Keep private health information off social media: Direct messages on social media platforms often collect details that can be shared with law enforcement. Don't say anything on Facebook or TikTok that you wouldn't want shared widely.
  • Use encrypted communication: Sensitive medical conversations, if they must happen online, should take place on secure, encrypted platforms. Signal, an open-source platform, is safer than WhatsApp, which is owned by Meta. iMessage is end-to-end encrypted only if both users have iPhones.
  • Exercise your existing data rights: Twenty states have comprehensive consumer privacy laws that allow people to access, delete or transfer their browsing history, Social Security numbers and other personal information. In six other states, users have narrower privacy rights, such as opting out of having their data sold to data brokers or advertisers. Consumers living in states with privacy laws can exercise their rights by submitting data access requests.
  • Read the terms of service: Patients should ask how health care providers and pharmacies may share their data, and should not agree to app permissions indiscriminately. Opt out of targeted advertising or data sales where possible, and read the fine print of apps and third-party providers.
  • Limit the use of third-party apps: Perform a digital hygiene check of the health apps you use regularly. This can include reviewing all app permissions on your phone, using services such as DeleteMe to remove your information from data brokers, reading apps' privacy policies (online privacy policy checklists can help), and considering alternatives to the apps you use once you understand where your data can go.
  • Consider offline health tracking: For tracking menstrual cycles or keeping pregnancy notes, traditional paper journals or encrypted digital files can offer more protection.

Individual caution is not a sufficient substitute for systemic safeguards. But until lawmakers pass comprehensive federal privacy legislation and limit the commercial sale of health information, we are each responsible for keeping our own data safe.
