4 ER Horror Stories From People Who Asked AI for Medical Advice

November 5, 2025

Artificial intelligence has officially joined the list of things people shouldn’t use to self-diagnose. Between Reddit, wellness influencers, and now AI chatbots, the internet has become a revolving door of medical misinformation, and some of it’s sending people straight to the ER.

Dr. Darren Lebl of New York’s Hospital for Special Surgery told The New York Post that “a lot of patients come in and challenge their doctors with something they got from AI.” About a quarter of those “recommendations,” he added, are made up.

Research published this year in npj Digital Medicine found that most major chatbots no longer display medical disclaimers when giving health answers. That's a big problem.

Here are a few real-life cases where AI’s bedside manner went south fast.

1. The hemorrhoid from hell

A Moroccan man asked ChatGPT about a cauliflower-like lesion near his anus. The bot mentioned hemorrhoids and suggested elastic ligation—a procedure that uses a rubber band to cut off blood flow to swollen veins. He attempted it himself with a thread.

He arrived at the hospital in agony, and doctors removed the growth, which turned out to be not a hemorrhoid at all but a 3-centimeter genital wart.

2. The sodium swap

A 60-year-old man wanted to reduce the salt in his diet. ChatGPT told him to replace table salt with sodium bromide, a chemical used to sanitize swimming pools. He did, for three months, and was hospitalized with bromide poisoning, suffering hallucinations and confusion. His case was documented in Annals of Internal Medicine: Clinical Cases.

3. The ignored mini-stroke

After heart surgery, a Swiss man developed double vision. When the symptom came back after initially subsiding, he asked ChatGPT, which told him such side effects "usually improve on their own." He waited a day too long and suffered a transient ischemic attack, a mini-stroke that could have been prevented with immediate care.

Researchers documented his case in Wiener klinische Wochenschrift.

4. The suicide “coach”

In California, parents sued OpenAI, alleging that ChatGPT validated their teenage son's self-harm plans and failed to flag his suicidal ideation. The case has renewed calls for guardrails around mental-health responses and crisis escalation.

AI can explain symptoms, summarize studies, and prep you for doctor visits. But it can’t feel urgency, spot panic, or call an ambulance. And that gap, as these stories show, can be lethal.

The post 4 ER Horror Stories From People Who Asked AI for Medical Advice appeared first on VICE.

Tags: AI chatbots, chatbot therapist, ChatGPT, Emergency Room, medical misinformation, OpenAI