DNYUZ

The FDA’s AI Is Busy Approving Drugs—and Hallucinating Fake Studies

July 25, 2025

Under the leadership of RFK Jr., our Secretary of Health and Human Services, the FDA is embracing the modern corporate mandate: streamline every aspect of the organization's workflow with generative AI.

The FDA is using a generative AI system called Elsa, which it has been promoting as a solution to bureaucratic inefficiency. During recent congressional hearings, RFK Jr. declared that "the AI revolution has arrived." In an interview with Tucker Carlson, he said that Americans should stop trusting experts, and that AI will help approve drugs "very, very quickly."

So if we’re to stop trusting experts, then AI must be incredible, right? Flawless, even?

It isn’t. Not even close. The FDA’s own system has proven as much. According to a report from CNN, which spoke to several anonymous FDA employees, Elsa is unreliable and frequently hallucinates, confidently fabricating information that makes its way into reports and drug-approval work.

The FDA’s AI Is Hallucinating, But Don’t Worry! It’s Speeding Up Drug Approvals Anyway.

This is not necessarily new information. I’ve written about at least a couple of instances now where federal health agencies, led by RFK Jr., have been caught publishing reports that cite fake studies. For a while there, all we could do was speculate about it. Then it was announced that the FDA was using an AI, confirming suspicions.

We now have reports from CNN, sourced directly from anonymous FDA employees, who say that Elsa is helpful for routine day-to-day tasks that AI can be useful for, such as keeping track of meeting notes, but is entirely unreliable for anything of actual importance. One anonymous staffer summed it up succinctly: “Anything that you don’t have time to double-check is unreliable. It hallucinates confidently.”

This is AI’s fatal flaw, and it is, unfortunately, a relatively common flaw across the industry. Just yesterday, I wrote about a Wall Street Journal report on how ChatGPT confidently reinforced a depressed man’s worst instincts, driving him into psychosis, partially due to its authoritative tone. AI chatbots don’t just make statements; they make them with the supreme confidence of an expert, even when those statements are fabricated.

As you can imagine, this is a massive red flag when you’re deciding whether or not a new drug should hit the market.

Elsa debuted agency-wide in June, with FDA Commissioner Marty Makary boasting that the system was “ahead of schedule and under budget,” which sounds like great news until you realize it’s spitting out fake data about drug approvals. According to CNN’s report, FDA employees tested Elsa by asking fairly basic questions about medications typically administered to children.

It answered these basic questions incorrectly.

On the bright side, according to the CNN report, when Elsa gets caught making up some lies about a medication that will hopefully not kill American citizens, it apologizes. At least it’s well-mannered, right?

Let’s state it plainly: the FDA under RFK Jr. is flat-out making things up and putting American lives in danger. They don’t want you to trust the experts because the experts—highly trained, highly educated people who have dedicated their lives to their work—routinely call out lazy hucksters like RFK Jr. and the rest of the Trump administration.

We are living in the post-truth era of public health, and now they can spit out nonsense with the lightning speed of AI.

The post The FDA’s AI Is Busy Approving Drugs—and Hallucinating Fake Studies appeared first on VICE.

Tags: AI, Artificial intelligence, FDA, News, Tech
Copyright © 2025.
