Why Chatbots Can’t Heal Trauma: A Cautionary Note on AI and Mental Health

By Roger Hughes | EMDR & Trauma-Informed Coaching | UK-wide

Today’s Guardian article opened with a quiet sentence that stopped me: “I feel it’s a friend.” It was a teenager talking about a chatbot. Not a therapist. Not a teacher. Not a parent. A bot. For a growing number of young people (one in four, according to the article), AI chatbots like Replika or Snapchat AI are now the first place they turn for mental health support. Why? Because they’re available. Instant. Always on. And they never judge. On the surface, that might sound like a step forward. But when you work with trauma, it rings alarm bells.

Trauma doesn’t live in words. It lives in the body. In the way the breath shortens when something feels off. In the sudden blankness when memories overwhelm. In the urge to disappear when emotions start to rise. No chatbot sees that. No AI pauses gently when a survivor starts to dissociate. No bot senses when someone’s nervous system is teetering on shutdown. That kind of attunement only happens between human beings.

For someone who’s been dismissed or shamed before — especially in moments of vulnerability — a chatbot might feel like a relief. There’s no eye contact, no fear of being misunderstood, no risk of emotional reaction. But those same things that make it feel safe also make it empty. There’s no warmth. No co-regulation. No grounded nervous system saying, “I’m here. You’re safe. I’ve got you.”

Real therapy doesn’t just “talk about” trauma. It helps a person feel safe enough to face what they’ve never dared to feel before. It invites the body to settle. It allows a fragmented memory to land without flooding. It holds someone through the wave — not just until it passes, but until they can ride it themselves. A chatbot might respond kindly, but it can’t hold. It can’t notice when someone’s voice starts shaking or when silence means something’s gone numb. It doesn’t know when to stop. And sometimes, stopping is everything.

This is what worries me most: someone who believes they’re getting help when, in reality, they’re sitting alone, speaking into code. Especially for trauma survivors, that illusion of connection can delay real healing. They might pour everything out, feel temporarily lighter… but the deeper wound? Still untouched. Still unprocessed. Still held alone.

And in some cases, it’s not just delayed healing — it’s harm. When someone with deep trauma opens up too fast in an uncontained space, it can retraumatise them. Panic. Flashbacks. Emotional collapse. Without a trained, regulated professional to track the pace, trauma work becomes dangerous. And no chatbot — however well-built — can offer the containment that kind of work demands.

This isn’t an attack on technology. AI can be useful for education, reminders, or managing general stress. But trauma recovery is sacred. It’s delicate, often slow, and always relational. It needs a safe enough connection, a steady enough presence, and a pace that honours the body, not just the mind.

If you’re using a chatbot right now, please know this: you’re not weak, strange, or wrong for reaching out to whatever feels available. But don’t stay there. Don’t confuse “responding” with “holding.” And don’t settle for simulation when real presence is possible. You deserve more than code. You deserve to be seen — really seen — and gently met right where you are.

Source: The Guardian – “I feel it’s a friend”: Quarter of teenagers turn to AI chatbots for mental health support (9 December 2025)

