
Using ChatGPT Health? Read this first.

Step aside, WebMD — health advice has become the most common way people use ChatGPT.

The chatbot's parent company, OpenAI, reported that 40 million people query ChatGPT daily to decode convoluted medical bills, appeal unfair insurance claims, or manage their own treatment. According to a February Gallup poll, nearly 16 percent of U.S. adults already use AI or social media to find medical information.

Meanwhile, Americans owe over $220 billion in medical debt, according to 2024 figures. The country's health workforce currently faces widespread shortages, with high turnover rates for first-year nurses and a need for 114,000 more physicians by 2028 to meet demand. Around half of Americans reported struggling to afford healthcare last year, as the federal government narrowed Affordable Care Act subsidies.

In the eyes of many, the healthcare system is broken.

At the same time, widespread AI adoption has been touted as a solution for an overburdened medical system. Narrowly designed, clinical-grade AI, trained for specific tasks, could revolutionize imaging, patient charting, and insurance processing. But AI developers aren't stopping there; they want AI in the patient's hands, too.

In January, OpenAI launched ChatGPT Health, the company's free, consumer-facing solution for those seeking health guidance — and anyone willing to upload their medical histories for the chatbot to digest.

Digital doctor or privacy nightmare?

ChatGPT Health, which incentivizes users to upload their personal medical records for tailored medical assistance, was announced on Jan. 7, promising to "securely" link your health information with ChatGPT’s brain. In the months since, other tech companies have followed suit, including the recently announced Amazon Health AI assistant and Microsoft Copilot Health.

Not everyone sees Health GPTs and other AI-related health tools as a net positive.

"Generative AI chatbot products starting to spin off into these healthcare-adjacent submarkets is deeply concerning," Melodi Dinçer, senior staff attorney for the Tech Justice Law Project, told Mashable.

In the hours following ChatGPT Health's launch, Dinçer published a scathing statement characterizing OpenAI's release as a strategic business move to access more personal data while jeopardizing the privacy of struggling Americans. The Tech Justice Law Project is currently representing individuals suing OpenAI over mental health concerns with ChatGPT. 

You're creating a larger ecosystem in this non-HIPAA covered space.
- Andrew Crawford, Center for Democracy and Technology

Other privacy watchdogs said their alarm bells went off, too.

"We don't have a comprehensive federal privacy law in the United States," explained Andrew Crawford, senior policy counsel for the Center for Democracy and Technology's Data and Privacy Project. At least, he said, none that puts real limits on how companies handle consumer data, especially sensitive data sets.

Tech companies, including Meta and OpenAI, have lobbied to keep robust privacy laws off the books, and government officials like Secretary of State Marco Rubio have pushed for less regulation of American tech companies.

In the absence of federal regulation on data, Americans are provided a patchwork of state-level laws and industry-specific regulations, including protections under the Health Insurance Portability and Accountability Act, or HIPAA.


A new Mashable series, AI + Health, will examine how artificial intelligence is changing the medical and health landscape. We'll explore how to use AI to decipher your blood work, how to effectively prompt chatbots with health questions, how two women are using AI to detect a dangerous form of heart disease, and much more.


Passed in 1996, HIPAA established a federal standard for protecting patient medical data and related identifying information in cases where data is shared with or without patient consent. Its Privacy Rule has also become a benchmark for assessing a medical product's privacy standards.

HIPAA, however, isn't a failsafe. Its protections aren't attached to data itself, explained Crawford, but to the institutions that process and store it. Consumer data is shielded only when it's in the hands of an institution bound by HIPAA laws, not when it exists in other marketplaces or is stored elsewhere online. 

Institutions bound by HIPAA laws are known as covered entities. This includes health insurance companies, HMOs, company health plans, and other coverage providers like Medicaid and Medicare; most (but not all) care providers like doctors, dentists, psychologists, nursing homes, and even chiropractors; and, finally, clearinghouses, or businesses that process and transmit health data. Anyone that does business with one of those entities, like a lawyer or billing company, is also under HIPAA's oversight. 

Oura, Apple, Strava: Personal wellness apps and ChatGPT Health 

Most popular health apps are not covered by HIPAA, according to the HIPAA Journal. Not your Oura ring, Apple Health app, or running buddy Strava. When you share your data with something like ChatGPT Health, even if you use it to inform your conversations with a covered entity later, that information is not legally bound by anything outside of the company's privacy policy.

But many, like OpenAI, promise that data is being treated carefully.

We are buying into this idea that something so complex as health can be reduced to numbers on a screen.
- Melodi Dinçer, Tech Justice Law Project

Covered entities are barred by law from using your data for things like targeted advertising or behavioral profiling without authorization. But any other company that gets hold of your medical information can do whatever it pleases, in accordance with its own privacy policy, Crawford said.

Lily Li, a data privacy and AI risk management attorney and founder of Metaverse Law, explained that company privacy policies often include reasonable security protocols and opt-out features, but aren't required to include HIPAA safeguards like specific authorization, time limits on data storage, or disclosure obligations.

Take the case of DNA testing company 23andMe, which, upon filing for bankruptcy, announced it would sell itself and its library of DNA samples to a buyer users had never consented to share their data with. Medical information, Dinçer explained, is one of the most valuable markets for data brokers online.

Many AI companies have erected walls between versions of their products that comply with laws like HIPAA and those that don't, including the "enterprise level" products touted by OpenAI and its competitors. These aren't the same products being marketed to the general public. For example, one day after launching ChatGPT Health, OpenAI released ChatGPT for Healthcare, a HIPAA-compliant version for health professionals not to be confused with the consumer product. That same week, Anthropic announced the HIPAA-compliant Claude for Healthcare.

Much like ChatGPT Health, Microsoft's Copilot Health is not HIPAA compliant but guided by internal privacy policies. The company explains, "data in Copilot Health is protected with industry leading safeguards, including encryption at rest and in transit, strict access controls, and the ability to manage and delete your information when you choose."

Amazon Health AI, on the other hand, is automatically looped into HIPAA compliance as an offering underneath Amazon One Medical.

For the average consumer, the situation gets confusing fast.

This muddled privacy grey area is where fitness and wellness apps have thrived, hinging their marketplace clearance on the distinction between a product that seeks to provide treatment and one that operates merely as a health "assistant." It's why you will almost always see a note emblazoned across the app: Consult with your doctor.

Now enter LLM products, which not only gather data from users' chats, but also emphatically encourage uploading your personal medical records and linking third-party apps — like MyFitnessPal, Weight Watchers, or Apple Health and its wearables — to get the "best" results from your chatbot. Many of these fitness apps have previously come under fire for tracking users without consent and illegally collecting data. 

Copilot Health, for example, is compatible with more than 50 wearable wellness devices, Microsoft says, including Oura rings and Fitbit watches. Amazon initially incentivized Amazon One Medical users to upload their personal medical information by offering early Health AI access to those who consented. "You do not have to allow One Medical to access your health records to use Health AI. However, to ensure the best experience, we are prioritizing early access to Health AI to those who do," wrote Amazon in early versions of the product's FAQ.

"You're creating a larger ecosystem in this non-HIPAA covered space, where health data is being shared and used by lots of companies," Crawford said. "That's going to create large troves of sensitive health data that all these companies will be in possession of."

Opting out vs. opting in

Dinçer also flagged that ChatGPT Health isn't being piloted in the European Union or the UK, places with more robust consumer data privacy laws and, specifically, requirements that data collection be opt-in.

Most U.S. law is an opt-out system, Dinçer explained, which places the onus on users to be aware of privacy laws and pay attention to the minutiae of a non-HIPAA product's terms of service. Often U.S. consumers are up against intentionally deceptive design, like confusing language and complicated interfaces referred to as dark patterns, that make rules on data storage difficult to parse. 

"We see these endemic, horrible practices around actually safeguarding our personal information when in the hands of these kinds of companies," Dinçer said. "There's no indication to me that that's suddenly going to change just because the technology looks a little different or you're disclosing it to something that feels like an intelligent conversation partner."

Over the years, state laws have started to catch up, Li said. California recently expanded its Confidentiality of Medical Information Act (CMIA), outlining unlawful uses of sensitive data and requiring a patient's written authorization to disclose medical information. Washington state passed the My Health My Data Act in 2023, considered one of the strongest consumer data privacy laws in the country.

Even so, there are exceptions across state and federal laws. 

One day before ChatGPT Health launched, the FDA announced it would be limiting its regulation of wearable technology and associated software designed to foster "healthy lifestyles." These technologies and others like fitness trackers are considered "low-risk non-medical devices," and as long as they don't make any diagnosis or treatment claims, they fall outside the FDA's strict oversight.

Two weeks after the ChatGPT Health announcement, OpenAI announced it was in the early design stages of its first AI wearable device.

Medical "partners" in the era of AI

A recent report by healthcare research nonprofit ECRI argued that AI chatbots are the "most significant health technology hazard" heading into 2026, citing risks of AI models perpetuating bias and exacerbating existing health disparities.

Similarly, many experts warn that LLMs aren't yet robust enough to effectively curb misinformation. A recent Guardian investigation found that Google's AI overviews often spat out inaccurate, gender-biased medical answers and could pose a public health risk. A study published in Nature Medicine in February found that ChatGPT Health failed to effectively triage medical emergencies and make appropriate care recommendations when compared to real-world physicians.

And the expansion of tech companies into the medical sphere poses additional concerns about the law. Will companies like OpenAI be subject to further inquiry from law enforcement requesting personal health or chat log data? What would that mean for people with stigmatized health conditions or precarious legal statuses, including people seeking reproductive healthcare, abortions, and gender-affirming care?

"We're already conditioned to think it's OK or normal to go to the internet with our health inquiries, sharing really intimate information online and with commercial products," Dinçer said. "We are buying into this idea that something so complex as health can be reduced to numbers on a screen."

________________________________________________________________________________________________________

The information contained in this article is for educational and informational purposes only and is not intended as health or medical advice. Always consult a physician or other qualified healthcare provider regarding any questions you may have about a medical condition or health objectives.

Disclosure: Ziff Davis, Mashable’s parent company, previously filed a lawsuit against OpenAI, alleging it infringed Ziff Davis copyrights in training and operating its AI systems.
