Why AI Will Never Replace Your Therapist (And I Say That as Someone Who Uses It)
Let's get something out of the way: I use AI. I use it to brainstorm, to draft, to organize my thoughts when my brain has seen six clients and can't form a sentence. It is a genuinely useful tool.
And it will never, ever replace what happens in a therapy room.
I know that's a bold claim in a moment when everyone seems convinced that AI is coming for every job that involves a screen and a brain. But I'm not worried. And by the end of this post, I hope you won't be either, whether you're a client wondering if a chatbot could do what your therapist does, or a clinician side-eyeing every new app that promises "mental health support."
Here's why therapy isn't going anywhere.
Healing happens in relationship, not in information.
If you could think your way out of trauma, you already would have.
Most people who walk into therapy aren't missing information. They're not confused about whether their situation is hard. They know it's hard. What they're missing is an experience — a felt sense of being seen, held, and understood by another human being who isn't going anywhere.
That's not something a language model can provide. AI can reflect your words back to you. It can offer coping strategies, psychoeducation, even surprisingly thoughtful responses. What it cannot do is be present with you. It doesn't feel the shift in the room when something important surfaces. It doesn't notice that you just changed the subject right when things got real.
Therapists do. That noticing? That's the work.
The body keeps score. AI can't read it.
So much of what happens in therapy is nonverbal. The way someone's shoulders drop when they finally say something true. The nervous laugh that comes out when a topic gets too close. The stillness that settles in after a breakthrough.
Somatic and body-based approaches, like the ones I use in my own practice, are built entirely on the premise that healing lives in the nervous system, not just the narrative. You can't regulate a dysregulated nervous system through a chat interface. You need a real human being, attuned and present, helping you find your way back to safety.
AI doesn't have a nervous system. It doesn't co-regulate. And co-regulation, which is one person's calm nervous system helping to settle another's, is one of the most powerful things therapy offers.
High-conflict situations require human judgment.
This one is personal to my work.
When someone is navigating a divorce from a narcissistic partner, untangling years of gaslighting, or trying to co-parent with someone who turns every conversation into a battlefield, they don't need an algorithm. They need someone who can hold complexity, track patterns over time, and make real-time clinical decisions based on what's happening in front of them.
AI can generate a list of "tips for co-parenting conflict." What it cannot do is recognize that the client sitting across from you has just minimized something serious, or that the story they're telling doesn't quite add up, or that what they need right now isn't a strategy but permission to fall apart for a minute.
Clinical judgment isn't a feature you can code. It's built from training, experience, supervision, and thousands of hours of sitting with human beings in pain.
Therapy is about being witnessed, not just heard.
There's a difference between a tool that processes your words and a person who receives them.
When a client tells me something they've never told anyone, sometimes something they've carried alone for years, what matters in that moment isn't the response I give. It's that another human being heard it and didn't flinch. Didn't minimize. Didn't redirect to a coping skill.
Just stayed.
AI cannot stay with you. It has no stake in your healing. It doesn't think about you between sessions, notice a pattern three months in, or feel genuinely moved when you have a breakthrough. It produces outputs. Therapists show up.
So what IS AI good for in mental health?
I'm not here to pretend it has no role. AI tools can be genuinely helpful for psychoeducation, journaling prompts, tracking mood patterns, and lowering barriers to basic mental health information. For someone who can't access care, whether because of cost, geography, or stigma, a well-designed app used with caution is better than nothing.
But "better than nothing" is a low bar. And it is not therapy.
The bottom line:
Therapy works because of the relationship. The attunement. The lived human presence of someone trained to sit with pain without trying to fix it prematurely.
AI is a tool. A useful one. A fascinating one.
But it doesn't know what it feels like to lose yourself in a relationship. It's never had to rebuild. It has never had to choose between who you were and who you're becoming.
Your therapist might have. And that changes everything.