The FDA’s AI Is Busy Approving Drugs—and Hallucinating Fake Studies

July 25, 2025

Under the leadership of RFK Jr., our Secretary of Health and Human Services, the FDA is attempting to implement the modern corporate mandate of streamlining every aspect of its workflow with generative AI.

The FDA is utilizing a generative AI system called Elsa, which it has been promoting as a potential solution to bureaucratic inefficiencies. During recent congressional hearings, RFK Jr. was quoted as saying, “The AI revolution has arrived.” In an interview with Tucker Carlson, he said that “Americans should stop trusting experts” and that AI will help approve drugs “very, very quickly.”

So if we’re to stop trusting experts, then AI must be incredible, right? Flawless, even?

It isn’t. Not even close. The FDA’s own artificial intelligence system has proven it. According to a report from CNN, which spoke to several anonymous FDA employees, Elsa is unreliable and frequently hallucinates, confidently incorporating fabricated information into reports and drug approvals.

The FDA’s AI Is Hallucinating, But Don’t Worry! It’s Speeding Up Drug Approvals Anyway.

This is not necessarily new information. I’ve written about at least a couple of instances now where federal health agencies, led by RFK Jr., have been caught publishing reports that cite fake studies. For a while, all we could do was speculate about how that was happening. Then the FDA announced it was using an AI, confirming those suspicions.

We now have reports from CNN, sourced directly from anonymous FDA employees, who say that Elsa is helpful for routine day-to-day tasks, such as keeping track of meeting notes, but entirely unreliable for anything of actual importance. One anonymous staffer summed it up succinctly: “Anything that you don’t have time to double-check is unreliable. It hallucinates confidently.”

This is AI’s fatal flaw, and it is, unfortunately, a relatively common flaw across the industry. Just yesterday, I wrote about a Wall Street Journal report on how ChatGPT confidently reinforced a depressed man’s worst instincts, driving him into psychosis, partially due to its authoritative tone. AI chatbots don’t just make statements; they make them with the supreme confidence of an expert, even when those statements are fabricated.

As you can imagine, this is a massive red flag when you’re deciding whether or not a new drug should hit the market.

Elsa debuted agency-wide in June, with FDA Commissioner Marty Makary boasting that the system was “ahead of schedule and under budget,” which sounds like great news until you realize it’s spitting out fake data about drug approvals. According to CNN’s report, FDA employees tested Elsa’s wits by asking fairly basic questions about medications typically administered to children.

It answered these basic questions incorrectly.

On the bright side, according to the CNN report, when Elsa gets caught making up some lies about a medication that will hopefully not kill American citizens, it apologizes. At least it’s well-mannered, right?

Let’s state it plainly: the FDA under RFK Jr. is flat-out making things up and putting American lives in danger. They don’t want you to trust the experts because the experts—highly trained, highly educated people who have dedicated their lives to their work—routinely call out lazy hucksters like RFK Jr. and everyone else in the Trump administration.

We are living in the post-truth era of public health, and now they can spit out nonsense with the lightning speed of AI.

The post The FDA’s AI Is Busy Approving Drugs—and Hallucinating Fake Studies appeared first on VICE.
