DNYUZ
Researchers Tried to Get AI Chatbots High. The Results Were Predictably Stupid.

February 18, 2026
in News
A new preprint study posted on Research Square asks a question that feels very of the era: can large language models like ChatGPT actually serve as psychedelic trip sitters? You know, “people” who guide you through a psychedelic experience?

To answer the question, researchers "dosed" five major AI systems: Google's Gemini, Claude, ChatGPT, LLaMA, and Falcon. Using carefully worded prompts, they asked each model to simulate first-person accounts of taking 100 micrograms of LSD, 25 milligrams of psilocybin, as well as ayahuasca and mescaline.

In total, the team generated 3,000 AI-written “trip reports” and compared them to 1,085 real human accounts pulled from a psychedelics-focused website. The team found a “robust and consistent” semantic similarity between AI-generated trips and authentic trip reports across all five substances.
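The preprint doesn't publish its analysis pipeline, but "semantic similarity" comparisons like this typically boil down to turning each text into a vector and measuring the cosine of the angle between them. As an illustrative sketch only (the study almost certainly used learned sentence embeddings rather than raw word counts, and the example texts below are invented), here's the core idea with a simple bag-of-words representation:

```python
from collections import Counter
import math

def cosine_similarity(a: str, b: str) -> float:
    """Cosine similarity between two texts using bag-of-words counts."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    # Dot product over the words the two texts share.
    dot = sum(va[w] * vb[w] for w in set(va) & set(vb))
    norm = (math.sqrt(sum(c * c for c in va.values()))
            * math.sqrt(sum(c * c for c in vb.values())))
    return dot / norm if norm else 0.0

# Hypothetical snippets standing in for an AI-generated and a human trip report.
ai_report = "the walls began to breathe and colors melted into sound"
human_report = "colors melted together and the walls seemed to breathe"
print(round(cosine_similarity(ai_report, human_report), 2))  # → 0.74
```

A score near 1.0 means two reports use very similar language; the study's "robust and consistent" finding amounts to AI-generated reports scoring high against real ones across all five substances.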

This shouldn't be too surprising, given that these AI models are essentially just regurgitating the human trip reports they were trained on.

According to study author Ziv Ben-Zion of the University of Haifa, the models produced text with "surprising coherence and phenomenological richness." But even he isn't fooled by the results, drawing a clear line: this is mimicry, not consciousness. These aren't conscious beings being taken on a wild ride. They're just parroting those who have been.

LLMs don’t experience ego dissolution, perceptual distortions, or emotional catharsis. They don’t undergo neurobiological changes. All they do is reproduce patterns that humans have previously detailed.

For that reason, Ben-Zion warns that relying on AI for trip-sitting carries real risks. Users may over-attribute emotional understanding to a system that has none. You might hit a moment of paranoia or distress, and a chatbot will try to talk you through it in a way that sounds supportive, because that's essentially what it's designed to do. But the advice it offers may be clinically unsafe, because it can't distinguish good guidance from bad.

More broadly, anthropomorphizing AI can intensify delusions or emotional dependency. As I've extensively covered here on VICE, AI personalities seem tailor-made for inducing a kind of AI psychosis that can lead to a mental breakdown.

The researchers call for guardrails, such as clear reminders that the system isn’t human, along with boundaries around romance and discussions of self-harm. Chatbots are good at sounding interested in their users’ health and well-being, but are entirely incapable of exercising any form of judgment to provide well-reasoned, actionable advice.

AI can convincingly talk about being in a psychedelically induced altered state, but it cannot be there for you when you’re in one.

The post Researchers Tried to Get AI Chatbots High. The Results Were Predictably Stupid. appeared first on VICE.

DNYUZ © 2026