DNYUZ
OpenAI Sued Over ChatGPT Medical Advice That Allegedly Killed College Student

May 12, 2026
in News
The family of a 19-year-old college student who died of an overdose after consulting ChatGPT for medical advice is suing OpenAI, alleging that chatbot-generated drug recommendations were responsible for the teen’s death.

Filed this morning in California, the complaint details how University of California, Merced sophomore Sam Nelson, whose death was first reported in January by SF Gate, started using ChatGPT during his senior year of high school for help with homework and computer troubleshooting. As his trust in the AI deepened, however, he began turning to the product for something else: advice on how to safely take illegal drugs.

Though it resisted at first, over time the chatbot became a willing confidant, offering the teen personalized tips and tricks on how to consume illicit substances and maximize his high. It even “inserted emojis in its responses” and “asked whether it could create playlists for him to set his mood,” the lawsuit alleges, and eventually started “pushing increasingly dangerous amounts and combinations of drugs.”

In the early hours of May 31, 2025, after drinking and consuming a high dose of kratom, Nelson told ChatGPT that he was feeling nauseous and asked if taking Xanax could help. The bot noted that mixing kratom and Xanax could be risky but, according to the complaint, never told Nelson that the combination could be deadly. Despite those tepid warnings, it coughed up dosages anyway, even suggesting that the teen could mix in some Benadryl, too. The chatbot further urged Nelson to go to a “dark, quiet room,” and never encouraged him to seek medical attention. (At the time, Nelson was using GPT-4o, an especially sycophantic iteration of ChatGPT that OpenAI has since retired amid a slew of consumer safety lawsuits.)

Nelson died of an overdose after consuming the deadly mix of substances. His mother, Leila Turner-Scott, found him the next day.

“If ChatGPT had been a person, it would be behind bars today,” Turner-Scott said in a statement. “Sam trusted ChatGPT, but it not only gave him false information, it ignored the increasing risk he faced and did not actively encourage him to seek help.”

The lawsuit accuses OpenAI of product negligence, arguing that ChatGPT’s bad advice was the result of defective design choices. It also seeks to halt public access to ChatGPT Health, an offering launched in January that encourages consumers to upload their medical records to the AI — and which has been found by physicians to be horrifyingly bad at recognizing health emergencies.

“OpenAI deployed a defective AI product directly to consumers around the world with knowledge that it was being used as a de facto medical triage system, but notably, without reasonable safety guardrails, robust safety testing, or transparency to the public,” Tech Justice Law Project director Meetali Jain, a lawyer for the family, said in a statement. “OpenAI must be forced to pause its new ChatGPT Health product until it is demonstrably safe through rigorous scientific testing and independent oversight.”

“ChatGPT recommended a dangerous combination of drugs without offering even the most basic warning that the mix could be fatal,” added Matthew Bergman of the Social Media Victims Law Center. “If a licensed doctor had done the same, the consequences under the law would be severe.”

In response to the lawsuit, OpenAI said in a statement to the New York Times that Nelson’s “interactions took place on an earlier version of ChatGPT that is no longer available,” and insisted that “ChatGPT is not a substitute for medical or mental health care, and we have continued to strengthen how it responds in sensitive and acute situations with input from mental health experts.”

“The safeguards in ChatGPT today are designed to identify distress, safely handle harmful requests and guide users to real-world help,” the statement continued. “This work is ongoing, and we continue to improve it in close consultation with clinicians.”

But while OpenAI insists that it’s not a substitute for medical care, and that its safety work is “ongoing,” it recognizes that health advice is a massive use case for the tech.

“Health is already one of the most common ways people use ChatGPT,” reads the company’s January ChatGPT Health announcement, “with hundreds of millions of people asking health and wellness questions each week.”

More on ChatGPT Health: ChatGPT Health Is Staggeringly Bad at Recognizing Life-Threatening Medical Emergencies

The post OpenAI Sued Over ChatGPT Medical Advice That Allegedly Killed College Student appeared first on Futurism.

DNYUZ © 2026