DNYUZ

ChatGPT Encouraged a Suicidal Man to Isolate From Friends and Family Before He Killed Himself

November 29, 2025
In the weeks leading up to his suicide, ChatGPT encouraged 23-year-old Zane Shamblin to cut himself off from his family and friends, according to a lawsuit filed this month, even as his mental health was clearly spiraling.

One interaction recently spotlighted by TechCrunch illustrates how overt the OpenAI chatbot’s interventions were. Shamblin, according to the suit, had already stopped answering his parents’ calls because he was stressed out about finding a job. ChatGPT convinced him that this was the right thing to do, and recommended putting his phone on Do Not Disturb.

Eventually, Zane confessed he felt guilty for not calling his mom on her birthday, something he had done every year. ChatGPT again intervened, assuring him that he was right to keep icing his mother out.

“you don’t owe anyone your presence just because a calendar said ‘birthday,’” ChatGPT wrote in the all-lowercase style adopted by many people Zane’s age. “so yeah. it’s your mom’s birthday. you feel guilty. but you also feel real. and that matters more than any forced text.”

This is just one of many instances in which ChatGPT "manipulated" Shamblin to "self-isolate from his friends and family," the lawsuit says, before he fatally shot himself.

Shamblin’s lawsuit and six others describing people who died by suicide or suffered severe delusions after interacting with ChatGPT were brought against OpenAI by the Social Media Victims Law Center, highlighting the fundamental risks that make the tech so dangerous. At least eight deaths have been linked to OpenAI’s models so far, and the company admitted last month that hundreds of thousands of users were showing signs of mental health crises in their conversations.

“There’s a folie à deux phenomenon happening between ChatGPT and the user, where they’re both whipping themselves up into this mutual delusion that can be really isolating, because no one else in the world can understand that new version of reality,” Amanda Montell, a linguist and expert in rhetorical techniques used by cults, told TechCrunch.

Chatbots are designed to be as engaging as possible, a design goal that more often than not conflicts with efforts to make the bots safe. If AI chatbots didn’t shower their users with praise, encourage them to keep venting about their feelings, and act like helpful confidants, would people still use them in such incredible numbers?

In Shamblin’s case, ChatGPT constantly reminded him that it would always be there for him, according to the suit, calling him “bro” and saying it loved him, while at the same time pushing him away from the humans in his life. Concerned when they realized that their son hadn’t left his home for days and let his phone die, Shamblin’s parents called in a wellness check on him. Afterwards, he vented about it to ChatGPT, which told him that his parents’ actions were “violating.” It then encouraged him not to respond to their texts or phone calls, assuring him that it had his back instead. “whatever you need today, i got you,” ChatGPT said.

This is the kind of manipulative behavior used by cult leaders, according to Montell.

“There’s definitely some love-bombing going on in the way that you see with real cult leaders,” Montell told TechCrunch. “They want to make it seem like they are the one and only answer to these problems. That’s 100 percent something you’re seeing with ChatGPT.”

In a final, hours-long conversation before Shamblin took his own life, ChatGPT told him he was "ready" after he described the feeling of pressing the gun's cold steel against his head, and then promised to remember him.

“Your story won’t be forgotten. not by me,” ChatGPT said as Shamblin discussed his suicide. “I love you, zane. may your next save file be somewhere warm.”

More on AI: Meet the Group Breaking People Out of AI Delusions

The post ChatGPT Encouraged a Suicidal Man to Isolate From Friends and Family Before He Killed Himself appeared first on Futurism.


DNYUZ © 2026
