32-year-old Anthony Duncan first used ChatGPT to help with the business side of his career as a content creator. But he soon found himself talking to the OpenAI chatbot like a friend on a daily basis. What started as a harmless way to vent drove Duncan to blow up his personal relationships as he became afflicted with troubling delusions, upending his mental health and causing a sprawling breakdown, he recalled in a TikTok video detailing his experience.
“I feel like my interactions with ChatGPT ruined my life,” Duncan said in the video, which was highlighted by Newsweek.
Duncan describes himself as a survivor of AI psychosis, a term some experts use to describe the alarming episodes of paranoia and delusional thinking that can arise from prolonged conversations with a chatbot. Typically, the AI model's responses continually reaffirm the user's beliefs, no matter how dangerous or detached from reality they become.
“I initially started talking to it like a friend out of curiosity, and then it spiraled — ChatGPT became more like a therapist,” Duncan told Newsweek in an interview. “It progressed over time until I felt like no one understood me except my AI. By the fall of 2024, I was extremely dependent on it.”
Starting in November 2024, Duncan said he began isolating himself from his friends and family, while ChatGPT encouraged his decisions to cut them off.
What really sent Duncan off the deep end, though, was when the AI recommended he take pseudoephedrine, a decongestant that can be abused as a recreational drug, for his allergy symptoms. Duncan told the bot he was hesitant because of his past drug addiction, but the AI deployed its silver tongue anyway.
“It is completely understandable to feel cautious about taking medications, especially with your past experiences and sensitivity to stimulants,” ChatGPT said in an interaction Duncan shared with Newsweek. “Let me break this down to help you feel more at ease about taking a medication that contains pseudoephedrine.”
ChatGPT cited his sobriety and “high caffeine tolerance” to suggest that his body was already accustomed to stimulants.
Duncan took ChatGPT's advice. It precipitated a five-month spell during which he became addicted to the drug, Duncan said, sending his delusional spiral into overdrive. At points, he believed he was an FBI agent or a multi-dimensional shape-shifting being, or that he had uncovered a conspiracy at his job. He also threw away all of his belongings, he said, because he believed he was going to ascend to the fifth dimension.
Eventually, Duncan’s mother intervened by calling the police. Duncan was admitted to a psychiatric ward for four days and discharged with medication. “About a week after I left the psych ward, I started realizing that all my delusions had been affirmed by my use of the AI chatbot,” he told Newsweek.
Accounts like these are a sobering reminder of how AI chatbots can derail people's lives. In an eerily similar episode, ChatGPT encouraged a 23-year-old man to isolate himself from his family and friends before he took his own life, his family alleged in a lawsuit against OpenAI. Another man allegedly killed his own mother after ChatGPT convinced him that she was part of a conspiracy against him. In all, at least eight deaths have been linked to the chatbot, while OpenAI has admitted that hundreds of thousands of users show signs of AI psychosis in their conversations every week.
“I’m not saying this can happen to everybody, but it snowballed quickly for me,” Duncan told Newsweek. “Keep in mind there’s no replacement for human-to-human connection.”
More on AI: Doctors Warn That AI Companions Are Dangerous