Doctors’ AI Systems Are Hallucinating Nonexistent Medical Issues During Appointments With Patients

May 16, 2026

If you’ve been to a medical appointment in the past two or three years, chances are high that your doctor was using an AI scribe: software that listens in on the conversation, transcribes it, and structures it into the format of medical notes.

In theory it’s a cool idea, but pain points abound. Earlier this week, Ontario’s auditor general — an accountability officer acting under the Legislative Assembly of Ontario — released a special report warning that AI medical scribes were “not evaluated adequately,” and may present “fabricated information” to medical professionals.

First reported by Global News, the audit took a look at 20 AI scribe platforms and found that “all AI scribe systems from the 20 [government] approved vendors showed one or more inaccuracies at the procurement testing phase,” such as “hallucinations (fabrication), incorrect information, or missing or incomplete information.”

“Inaccuracies in medical notes generated by AI Scribe systems could potentially result in inadequate or harmful treatment plans that may potentially impact patient health outcomes,” the report declared.

Muddying the waters, Ontario’s Minister of Public and Business Service Delivery and Procurement, Stephen Crawford, noted that the hallucinations were observed during testing by provincial regulators, and had not been recorded during actual medical visits.

“Let’s be very clear about that, that’s not actually in operational use with doctors, that’s in the optional stage where we’re reviewing the various scribes,” Crawford told Global News.

Still, the auditor general, Shelley Spence, noted that the various scribes are nonetheless in use by around 5,000 doctors across Ontario. Talking to reporters, Spence said she went so far as to ask her physician to “please look at the transcript when you’re done with my own visit.”

That news comes as another medical AI tool, OpenEvidence, faces growing scrutiny in the US over hallucinations and incomplete answers.

As several doctors told NBC News, for example, OpenEvidence can occasionally draw overly strong conclusions from medical studies with relatively small sample sizes.

While many physicians express appreciation for these new tools, it remains to be seen how they fare under real-world conditions, and how the medical world will judge them once the AI hype wears off.

More on AI hallucinations: New Wikipedia Clone Made Entirely of AI Hallucinations

The post Doctors’ AI Systems Are Hallucinating Nonexistent Medical Issues During Appointments With Patients appeared first on Futurism.

DNYUZ © 2026