DNYUZ

Doctors’ AI Systems Are Hallucinating Nonexistent Medical Issues During Appointments With Patients

May 16, 2026
in News

If you’ve been to a medical appointment in the past two or three years, chances are high that your doctor was using an AI scribe: software that listens in on the conversation, transcribing it and structuring it into the format of medical notes.

In theory it’s a cool idea, but pain points abound. Earlier this week, Ontario’s auditor general — an accountability officer acting under the Legislative Assembly of Ontario — released a special report warning that AI medical scribes were “not evaluated adequately,” and may present “fabricated information” to medical professionals.

First reported by Global News, the audit took a look at 20 AI scribe platforms and found that “all AI scribe systems from the 20 [government] approved vendors showed one or more inaccuracies at the procurement testing phase,” such as “hallucinations (fabrication), incorrect information, or missing or incomplete information.”

“Inaccuracies in medical notes generated by AI Scribe systems could potentially result in inadequate or harmful treatment plans that may potentially impact patient health outcomes,” the report declared.

Muddying the waters, Ontario’s Minister of Public and Business Service Delivery and Procurement, Stephen Crawford, noted that the hallucinations were observed during testing by provincial regulators, and had not been recorded during actual medical visits.

“Let’s be very clear about that, that’s not actually in operational use with doctors, that’s in the optional stage where we’re reviewing the various scribes,” Crawford told Global News.

Still, the auditor general, Shelley Spence, noted that the various scribes are nonetheless in use by around 5,000 doctors across Ontario. Talking to reporters, Spence said she went so far as to ask her physician to “please look at the transcript when you’re done with my own visit.”

That news comes as another AI scribe system, OpenEvidence, faces growing scrutiny in the US over hallucinations and incomplete answers.

As several doctors told NBC News, for example, OpenEvidence can occasionally draw overly strong conclusions from medical studies with relatively small sample sizes.

While many physicians express appreciation for these new tools, it remains to be seen how they will fare under real-world conditions, and how the medical world will judge them once the AI hype wears off.

More on AI hallucinations: New Wikipedia Clone Made Entirely of AI Hallucinations

The post Doctors’ AI Systems Are Hallucinating Nonexistent Medical Issues During Appointments With Patients appeared first on Futurism.


DNYUZ © 2026
