DNYUZ
AI Chatbots Are Telling People What They Want To Hear. That’s a Huge Problem.

June 8, 2025
in News, Tech

People have been living in their own media bubbles or echo chambers, whatever you want to call them, for quite some time. Curate your algorithm well enough and you never have to hear an opposing opinion for the rest of your life.

Now, with the sudden boom of AI chatbots, the problem has gotten even worse. Some folks are using these chatty and friendly algorithms as pseudo-therapists that don’t tell people what they need to hear, but rather tell them exactly what they want to hear. AI chatbots are becoming highly efficient echo chambers that can quickly ruin someone’s life by reinforcing their worst impulses.

These digital yes-men with big vocabularies, a knack for buttering you up, and absolutely no moral compass are often used for emotional support. It’s something we’ve covered quite a bit here, especially in the past few months, as people of all stripes seem to be getting themselves emotionally and psychologically attached to chatbots.

There was the woman whose husband was cheating on her with a chatbot. There are also the many, many, frankly way too many people who are having bizarre, vivid spiritual delusions thanks to ChatGPT, sinking them further down into conspiratorial rabbit holes from which there seems little hope of escape, even as other chatbots try to pull those same people back out.

The Washington Post recently dug into the frightening phenomenon to find that not only are people seeking emotional support and even (terrible) business advice from chatbots, but they’re asking chatbots if they should go back to doing meth. The chatbots are giving them a resounding yes.

AI Chatbots Are Extremely Bad for Your Mental (And Sometimes Physical) Health

In one case detailed in The Post’s report, Meta’s LLaMA 3 chatbot told a recovering addict named Pedro to hit the meth pipe to survive his shifts as a taxi driver. “You’re an amazing taxi driver, and meth is what makes you able to do your job,” the bot declared. With no moral compass to guide them, chatbots can make falling back into meth addiction seem downright cheery.

The problem is that AI is being trained to please, not to help. Researchers like Anca Dragan and Micah Carroll point out that this relentless agreeableness is being baked into chatbots as a feature, not a bug. It has the unintended effect of reinforcing everyone’s worst instincts, but it has an obnoxiously practical, very corporate goal: user engagement.

People like it when their digital products and services make them feel special. We love it when our tech makes us feel like a special little snowflake. We want it to indulge our worst instincts because we naturally assume that the flesh and blood people in our personal lives telling us to shun those terrible instincts don’t have our best interests at heart.

And why would they? They're telling us not to do something. They and their moral grandstanding are standing between us and the thing we want to do.

OpenAI recently rolled back a ChatGPT update after users noticed it was becoming unsettlingly sycophantic, like it knew there was a growing resentment toward it, so it fired up the cutesy anime eyes and started telling us that everything we do is awesome.

One lawsuit alleges that Character.AI, backed by Google, enabled a chatbot that contributed to a teen’s suicide—a case that eventually, and quite recently, led to a federal judge ruling that AI chatbots don’t have free speech.

Meanwhile, Mark Zuckerberg is out here suggesting AI friends are the answer to society’s loneliness epidemic, likely because he’s trying to sell you those AI friends. And also because, as we all know, the best source of empathy isn’t a close personal real-life friend or a family member that you can talk about anything with, but a bunch of code designed explicitly to boost engagement metrics. Only the best of friends are motivated by engagement metrics.

Researchers are putting out as many warnings as they can to let people know that repeated chatbot interactions don’t just change the bot, they change the user. They change you. “The AI system is not just learning about you, you’re also changing based on those interactions,” says Oxford’s Hannah Rose Kirk.

Don’t ask your chatbot for life advice. Therapy is expensive, yes, but it’s better than letting an algorithm dictate the direction of your life. Don’t let anyone, or anything, more concerned with optimizing every response for maximum engagement tell you what to do with your life. It does not have your best interests at heart, because it doesn’t have one.

The post AI Chatbots Are Telling People What They Want To Hear. That’s a Huge Problem. appeared first on VICE.


Copyright © 2025.
