People are using AI to ‘sit’ with them while they trip on psychedelics
A growing number of people are using AI chatbots as “trip sitters”—a phrase that traditionally refers to a sober person tasked with monitoring someone who’s under the influence of a psychedelic—and sharing their experiences online. In this edition of What’s Next in Tech, find out what it’s like for someone to use chatbots like ChatGPT as trip sitters—and what experts have to say about it.
AI agents are evolving fast, shifting from helpful co-pilots to systems that can plan, reason, and act on their own. So how far have we come, and where are we headed? Join MIT Technology Review editors on July 22 for a special LinkedIn Live as they discuss navigating the rise of AI agents. Register for free today.
Some people believe chatbots like ChatGPT can provide an affordable alternative to in-person psychedelic-assisted therapy. Many experts say it’s a bad idea.
Peter sat alone in his bedroom as the first waves of euphoria coursed through his body like an electrical current. He was in darkness, save for the soft blue light of the screen glowing from his lap. Then he started to feel pangs of panic. He picked up his phone and typed a message to ChatGPT. “I took too much,” he wrote.
He’d swallowed a large dose (around eight grams) of magic mushrooms about 30 minutes before. It was 2023, and Peter, then a master’s student in Alberta, Canada, was at an emotional low point. His cat had died recently, and he’d lost his job. Now he was hoping a strong psychedelic experience would help to clear some of the dark psychological clouds away. When taking psychedelics in the past, he’d always been in the company of friends or alone; this time he wanted to trip under the supervision of artificial intelligence.
Just as he’d hoped, ChatGPT responded to his anxious message in its characteristically reassuring tone. “I’m sorry to hear you’re feeling overwhelmed,” it wrote. “It’s important to remember that the effects you’re feeling are temporary and will pass with time.” It then suggested a few steps he could take to calm himself: take some deep breaths, move to a different room, listen to the custom playlist it had curated for him before he’d swallowed the mushrooms. (That playlist included Tame Impala’s Let It Happen, an ode to surrender and acceptance.)
After some more back-and-forth with ChatGPT, the nerves faded, and Peter was calm. “I feel good,” Peter typed to the chatbot. “I feel really at peace.”
Peter—who asked to have his last name omitted from this story for privacy reasons—is far from alone. A growing number of people are using AI chatbots as “trip sitters.” It’s a potent blend of two cultural trends: using AI for therapy and using psychedelics to alleviate mental-health problems. But this is a potentially dangerous psychological cocktail, according to experts. While it’s far cheaper than in-person psychedelic therapy, it can go badly awry. Read the story.
⚡ What's Next in Tech Readers: Claim your special, FREE 30-day trial subscription today.
Get ahead with these related stories:
- The first trial of generative AI therapy shows it might help with depression — The evidence-backed model delivered impressive results, but it doesn’t validate the wave of AI therapy bots flooding the market.
- How scientists are trying to use AI to unlock the human mind — Understanding the mind is hard. Understanding AI isn’t much easier.
- It’s pretty easy to get DeepSeek to talk dirty — Most mainstream AI chatbots can be convinced to engage in sexually explicit exchanges, even if they initially refuse.
Be prepared for the impact AI will have on our world with our weekly newsletter, The Algorithm. Sign up today for in-depth stories and exclusive AI insights.
Image: Sarah Rogers/MITTR | Getty
What could possibly go wrong??
Expert Software Engineer, Architect, and Leader in Microsoft, Web, and Cloud Technology.
1d
One pill makes you smaller…
AI Strategy & Governance Leader | Program Manager, AI Integration | Board Advisor | Aspiring Chief AI Officer (CAIO)
2d
This is unsettling. Generative AI tools are not trained or certified to support people in altered states of consciousness. Without cognition or ethical comprehension, their output, while calm-sounding, can't adapt to psychological risk. A 2024 IEEE study identified “misuse of AI” and “lack of transparency” as the top public concerns, particularly when AI begins to replace critical human roles. Even if the interaction feels supportive, it's no substitute for trained care. As AI becomes increasingly integrated into our lives, the need for guardrails in emotionally or clinically sensitive contexts is urgent. Ethical frameworks must keep pace with adoption, or we risk people trusting systems that cannot reciprocate human responsibility. #AI #ResponsibleAI #AIEthics
Computer Tech Support | Software and Web Dev | Project Management | Tech Consulting
2d
There's just a skills shortage among therapists, duh.
--
2d
We are officially doomed as a species. What a colossal waste of precious resources: water, electricity, and land!