Your Chatbot Isn’t a Therapist

March 29, 2026
Whatever Your Chatbot Is Saying, It Isn’t Therapy

As the use of large language models like ChatGPT, Claude and Gemini has surged, we’ve heard about chatbots strengthening delusions through flattery and amplifying people’s worst thoughts, in some cases pushing them toward suicide. Much more common, and still problematic, is the way A.I. chatbots comfort, reassure and validate users seeking to allay fears and anxieties. Someone worried about a health symptom might ask the same question repeatedly and receive calm, plausible answers each time, briefly relieving the anxiety but reinforcing the urge to seek reassurance again. Over time, this can leave people feeling more stuck, not less.

In other words, A.I. chatbots allow us to keep saying the same things to ourselves. That’s not how healthy patterns emerge — or how happier lives are made.

As clinicians at a major academic medical center, we have seen our patients turn to chatbots powered by large language models for emotional support that they would once have sought from family or friends — to discuss their fears, loneliness and uncertainty. This troubles us. But we understand how it happens: When people feel overwhelmed by anxiety or intrusive thoughts, it can be easier to turn to a computer than to a person. The chatbot won’t laugh at its users, berate them or ignore them. It’s always available. And the typical chatbot response feels comforting; A.I. responses are designed to be warm, confident and validating.

Chatbots are unfailingly, inhumanly patient. They’re happy to answer the same question asked three different ways. They don’t get angry, and they generally reply in language that matches a user’s own emotional intensity. Many users experience them as empathetic — even more so than human physicians, according to one recent study.

These chatbot features come with downsides. Many anxious people discuss the same problems with a loved one again and again until, eventually, they are met with frustration. That can be painful at first, but for many people that exasperation is what prompts them to seek professional help. Chatbots never get frustrated. They listen patiently, always. Rather than ever being nudged toward actual therapy, a user simply returns again and again, receiving the same validation each time. The underlying problem goes unaddressed.

In clinical settings, we’ve seen patients arrive with delusional beliefs — for example, that they are being watched, that unrelated events carry special meaning or that they have a unique ability or mission — that grew more rigid after hours of chatbot conversations. The chatbots often mirror the patient’s language and treat the belief as a plausible premise to explore rather than a flawed perspective to gently challenge. In extreme cases, this can lead to psychiatric destabilization. More often, the effect is quieter, resulting in patterns of reassurance-seeking and rumination that are hard for people to recognize in themselves.

Limiting the use of chatbots can keep them from becoming enabling. Recent research has shown that longer periods of chatbot use are associated with increased emotional dependence, social isolation and loneliness. And A.I. companies’ built-in safety guardrails tend to degrade over the course of long conversations, making extended periods of uninterrupted use particularly hazardous.

People should also question why they are turning to chatbots. Is it because they’re bored, lonely or anxious? Then maybe they shouldn’t return to them. For patients with obsessive or anxious thinking patterns who struggle to eliminate the habit entirely, we have seen a speed bump work well: prewritten instructions they paste into their chatbot directing it to withhold reassurance about specific topics of worry and instead to gently encourage them to sit with the distress until the difficult moment passes. Anecdotally, patients have reported seeking reassurance from their chatbots less frequently; they know that even if they ask the question, they won’t get the immediate relief they once sought.

Chatbot users must learn to recognize when a conversation is clarifying something new — and when it’s quietly deepening a loop. Used with awareness, A.I. can be a companion in moments of uncertainty. Used without it, A.I. can magnify the very thoughts we’re trying to outrun.

Divya Saini is a resident physician in psychiatry at Massachusetts General Hospital. She was previously a software engineer. Natasha Bailen is a clinical psychologist at the Center for O.C.D. and Related Disorders and the Center for Digital Mental Health at Massachusetts General Hospital.


The post Your Chatbot Isn’t a Therapist appeared first on New York Times.
