DNYUZ

AI Isn’t Coming for Doctors. It’s Already in the Room

November 10, 2025

As hospitals turn to AI, patients may no longer know who—or what—is making their medical decisions. As ER physicians, we see how AI guidance is changing what it means to walk into an emergency room. 

This isn’t a policy story. It’s a cultural one: about what it means to have faith in your doctor when the “doctor” might be an algorithm. According to a recent primer in the Journal of the American College of Emergency Physicians, AI applications in emergency departments are already being used for triage, risk prediction, and staffing models—the plans that help hospitals make sure they have the right number and mix of doctors, nurses, and other staff working at the right times to care for patients. Patients may not know whether the person treating them is a doctor or an AI-assisted hybrid. That can feel seamless or unsettling, depending on the stakes. We are grappling with a quiet transformation of the ER, where cost pressures, staffing shortages, and AI copilots are rewriting what it means to see a doctor, and what it means to trust one.

The shift from physicians to AI isn’t just a staffing solution; it’s a seismic change in how medical decisions are made. Each comes with trade-offs. AI can process mountains of data in seconds, but it cannot look a patient in the eye and recognize fear, appreciate the quiet moments of human suffering, or pick up on the unspoken clues that come from holding the hand of someone in pain. Part of our 10,000-plus hours of medical training to become ER doctors is developing the gut instinct that something is wrong, even when a patient’s vital signs and lab work look fine. It’s catching the subtle clues—a hint of confusion, a faint slur in a patient’s speech, the quiet panic in their eyes—that a patient might not mention and an algorithm cannot perceive. The human element, the essence of trust and compassion, is exactly where AI stumbles.

Tech companies are racing to integrate AI into the clinical space, creating digital triage systems, diagnostic copilots, and decision-support tools designed to augment or even replace physician oversight. And hospitals are moving quickly to adopt them, drawn by the promise of lower costs and sharper diagnostic accuracy. In one recent Nature study, AI performed on par with non-expert physicians, evidence of how quickly algorithms are catching up to human clinicians in the exam room. OpenAI, Google, and Microsoft are explicitly testing AI-based health care applications. Another company, OpenEvidence, is building an AI-powered tool to give clinicians quick, evidence-based answers to medical questions, and is already valued at $3.5 billion.

To be sure, there are places where AI can shine. It can surface patterns invisible to even the most experienced clinician, linking a lab result from months ago with a medication list and a cluster of symptoms to flag a severe infection risk before anyone else sees it. It can pull up obscure drug interactions, support decision-making, and speed up documentation, leaving physicians with more time for patients and less burnout. Used correctly, AI is less a replacement for intuition and more a force multiplier for it.

Read More: AI Can Fix the Most Soul-Sucking Part of Medicine

Maybe more than anything, what’s new is that both patients and physicians are now using AI, but not in the same way. 

A few nights ago, a young woman came into the ER with chest pain. Her tests were all normal, but she still seemed on edge. When I asked if she was worried about something, she admitted she’d gone down a ChatGPT rabbit hole after noticing a few skipped heartbeats. The chatbot told her she might have arrhythmogenic right ventricular dysplasia: a rare, deadly heart condition. (She didn’t.) The panic that followed likely caused the symptoms that brought her in.

Another patient, a young man, arrived certain he had appendicitis because ChatGPT told him so. This time, he was right. His symptoms were textbook, and the medical student seeing him independently reached the same diagnosis. The AI helped the patient find his diagnosis sooner and seek treatment. Yet he still required the skillful hands of a surgeon to remove his appendix.

Read More: 9 Doctor-Approved Ways to Use ChatGPT for Health Advice

That’s the paradox of this moment: the same technology that fuels confusion and fear can also sharpen insight and speed care. It’s not just changing how we diagnose, it’s also changing how patients arrive and who is caring for them. Cost, staffing, and technology have blurred the line between human and machine care, ushering in a new kind of medicine: patients treated by clinicians whose most powerful colleague may be an algorithm.

The problem isn’t just that AI could get a diagnosis wrong; it’s that long-term AI use could also erode a clinician’s insight. In one Lancet study, doctors were less likely to detect possibly cancerous spots on colonoscopy after becoming accustomed to using an AI tool. The authors hypothesized that the more the doctors relied on an algorithm, the less human judgment they exercised.

Read More: Using AI Made Doctors Less Skilled at Spotting Cancer

The shift to integrating AI or non-physician clinicians into the ER is not inherently bad, but it is often invisible to patients. And that’s the problem.

Patients deserve to know when their care is guided by AI, who is ultimately responsible for the decisions being made, and what safeguards exist when the “doctor in the room” might be an algorithm. 

Transparency won’t stop the march of technology, but it may help preserve something medicine can’t afford to lose: trust.

The post AI Isn’t Coming for Doctors. It’s Already in the Room appeared first on TIME.

DNYUZ © 2025
