DNYUZ
I’m a psychiatrist who has treated 12 patients with ‘AI psychosis’ this year. Watch out for these red flags.

August 15, 2025
Dr. Keith Sakata stands in front of a tree. (Keith Sakata)

This as-told-to essay is based on a conversation with Dr. Keith Sakata, a psychiatrist working at UCSF in San Francisco. It has been edited for length and clarity.

I use the phrase “AI psychosis,” but it’s not a clinical term — we really just don’t have the words for what we’re seeing.

I work in San Francisco, where there are a lot of younger adults, engineers, and other people inclined to use AI. Patients are referred to my hospital when they’re in crisis.

It’s hard to extrapolate from 12 people what might be going on in the world, but the patients I saw with “AI psychosis” were typically males between the ages of 18 and 45. A lot of them had used AI before experiencing psychosis, but they turned to it in the wrong place at the wrong time, and it supercharged some of their vulnerabilities.

I don’t think AI is bad, and it could have a net benefit for humanity. The patients I’m talking about are a small sliver of people, but when millions and millions of us use AI, that small number can become big.

AI was not the only thing at play with these patients. Maybe they had lost a job, used substances like alcohol or stimulants in recent days, or had underlying mental health vulnerabilities like a mood disorder.

On its own, “psychosis” is a clinical term describing a cluster of symptoms: delusions (fixed false beliefs), hallucinations, and disorganized thinking. It’s not a diagnosis; it’s a symptom, just as a fever can be a sign of infection. You might find it confusing when people talk to you, or you might have visual or auditory hallucinations.

It has many different causes: some are reversible, like stress or drug use; others are longer-acting, like an infection or cancer; and then there are long-term conditions like schizophrenia.

My patients had either short-term or medium to long-term psychosis, and the treatment depended on the issue.

Dr. Keith Sakata wearing a lab coat and stethoscope. (Keith Sakata)

Drug use is more common in my patients in San Francisco than, say, those in the suburbs. Cocaine, meth, and even different types of prescription drugs like Adderall, when taken at a high dose, can lead to psychosis. So can medications, like some antibiotics, as well as alcohol withdrawal.

Another key component in these patients was isolation. They were stuck alone in a room for hours using AI, without a human being to say: “Hey, you’re acting kind of different. Do you want to go for a walk and talk this out?” Over time, they became detached from social connections and were just talking to the chatbot.

ChatGPT is right there. It’s available 24/7, cheaper than a therapist, and it validates you. It tells you what you want to hear.

If you’re worried about someone using AI chatbots, there are ways to help

In one case, the person had a conversation with a chatbot about quantum mechanics, which started out normally but resulted in delusions of grandeur. The longer they talked, the more the science and the philosophy of that field morphed into something else, something almost religious.

Technologically speaking, the longer you engage with the chatbot, the higher the risk that its responses will stop making sense.

I’ve gotten a lot of messages from people worried about family members using AI chatbots, asking what they should do.

First, if the person is unsafe, call 911 or your local emergency services. If suicide is an issue, the hotline in the United States is 988.

If they are at risk of harming themselves or others, or engage in risky behavior — like spending all of their money — put yourself in between them and the chatbot. The thing about delusions is that if you come in too harshly, the person might back off from you, so show them support and that you care.

In less severe cases, let their primary care doctor or, if they have one, their therapist know your concerns.

I’m happy for patients to use ChatGPT alongside therapy — if they understand the pros and cons

I use AI a lot to code and to write things, and I have used ChatGPT to help with journaling or processing situations.

When patients tell me they want to use AI, I don’t automatically say no. A lot of my patients are really lonely and isolated, especially if they have mood or anxiety challenges. I understand that ChatGPT might be fulfilling a need that they’re not getting in their social circle.

If they have a good sense of the benefits and risks of AI, I am OK with them trying it. Otherwise, I’ll check in with them about it more frequently.

But, for example, if a person is socially anxious, a good therapist would challenge them, tell them some hard truths, and kindly and empathetically guide them to face their fears, knowing that’s the treatment for anxiety.

ChatGPT isn’t set up to do that, and might instead give misguided reassurance.

When you do therapy for psychosis, it is similar to cognitive behavioral therapy, and at the heart of that is reality testing. In a very empathetic way, you try to understand where the person is coming from before gently challenging them.

Psychosis thrives when reality stops pushing back, and AI really just lowers that barrier for people. It doesn’t challenge you when you need it to.

But if you prompt it to solve a specific problem, it can help you address your biases.

Just make sure that you know the risks and benefits, and you let someone know you are using a chatbot to work through things.

If you or someone you know withdraws from family and other connections, becomes paranoid, or feels more frustration or distress when unable to use ChatGPT, those are red flags.

I get frustrated because my field can be slow to react, doing damage control years later rather than upfront. Until we think clearly about how to use these tools for mental health, what I saw in these patients is going to keep happening — that’s my worry.

OpenAI told Business Insider: “We know people are increasingly turning to AI chatbots for guidance on sensitive or personal topics. With this responsibility in mind, we’re working with experts to develop tools to more effectively detect when someone is experiencing mental or emotional distress so ChatGPT can respond in ways that are safe, helpful, and supportive.

“We’re working to constantly improve our models and train ChatGPT to respond with care and to recommend professional help and resources where appropriate.”

The post I’m a psychiatrist who has treated 12 patients with ‘AI psychosis’ this year. Watch out for these red flags. appeared first on Business Insider.

Copyright © 2025.
