If you’ve been using ChatGPT or Claude to talk through your thoughts, you’re not alone. And honestly, it makes sense.
AI is accessible. It’s private. It’s available at 2am when your brain won’t stop. You can say things to a chatbot that you might not say out loud to anyone else, without judgment or interruption.
I’m a therapist, and I think it’s a good thing that people are reaching for something. Even if it’s a chatbot. It means they’re curious about their inner world. That curiosity matters.
But as someone who sits across from people every day and helps them untangle what’s really going on inside, I’ve been curious too: what does AI actually do when someone brings it their struggles? And how does that compare to what happens in therapy?
So I looked at the research. Here’s what I found.
What the Research Says About AI and Therapy
Two recent studies caught my attention, not because they trash AI, but because they clarify what AI is actually good at, and where it still comes up short.
The first, from the University of Washington (Chiu et al., 2024), analyzed how large language models respond in therapeutic conversations compared to human therapy behavior. They didn’t measure outcomes. They measured response style. In other words, how does AI tend to respond when you’re struggling?
Here’s what they found.
AI produced substantially more problem-solving responses than average human therapy behavior. When someone shares something difficult, AI tends to move quickly toward solutions, reframes, and advice.
By contrast, high-quality human therapy behavior shows more emotional reflection. Instead of solving, skilled therapists spend more time staying with the feeling, naming it, exploring it, helping you tolerate it, and helping you understand what it’s signaling.
They also found that AI asked fewer exploratory questions about experiences and emotions. That pattern—less curiosity, more answers—resembled what researchers see in lower-quality therapy behavior.
The second study (Cho et al., 2023) took a different approach. A panel of clinical psychologists and psychiatrists evaluated an LLM’s performance in simulated therapy sessions. They assessed empathy, communication, adaptability, engagement, and—importantly—the ability to form a therapeutic alliance.
Their conclusion was nuanced. AI showed real strengths in empathetic engagement and adaptability. It could validate experiences, maintain conversation, and respond appropriately in many scenarios.
At the same time, the researchers noted two limits that matter enormously in therapy. First, depth of personalization: the responses sometimes lacked the tailoring you get when a human therapist truly understands your specific situation. Second, depth of connection: the digital nature of the interaction, the researchers noted, may limit how close that connection can get to what's possible with a human therapist.
To be clear, these studies are not saying AI is useless. They’re saying AI does something different than therapy. And that difference matters.
Why This Matters
Here’s the clinical truth that shows up over and over in therapy:
The thing that’s bothering you is rarely the thing that’s actually bothering you.
Work stress that keeps you up at night is often not just about workload. There's usually something underneath: a fear of failure, a fear of being exposed, a pattern you learned early, a feeling you've been carrying for years without realizing it. Relationship conflict rarely stays about the surface issue. It usually comes back to feeling unseen, unvalued, unsafe, or alone in the relationship.
AI is great at helping you work with the surface. It mirrors what you say, organizes the story, reframes the problem, and offers solutions. That can be genuinely helpful, especially when you feel flooded.
But therapy goes underneath. It’s not just about organizing the thought. It’s about confirming the feeling—the one you might not even know is there yet.
Affirm the Thought vs. Confirm the Feeling
Here’s the distinction I keep coming back to.
AI affirms the thought. It takes what you said, validates it, reorganizes it, and gives it back to you in a cleaner form. That can be useful. It helps you feel heard. It helps you think more clearly.
Therapy confirms the feeling over time. It goes beyond what you said to what you felt—and often what you didn’t even realize you were feeling. It names the emotion underneath the narrative. It tracks patterns across weeks and months. It helps you build tolerance for the discomfort you’ve been avoiding, and it helps you respond differently when the loop shows up again.
Both matter. They’re just not the same thing.
And here’s the part that’s hard to accept when you’re high-functioning and analytical:
Change usually lives in the feeling, not the thought.
You can understand your patterns intellectually and still repeat them. You can know exactly why you’re anxious and still feel anxious. Insight alone doesn’t create change. What creates change is experiencing something different—feeling understood, feeling safe enough to sit with discomfort, feeling connected to someone who can hold the weight with you and tell you the truth kindly when you start spinning.
The Honest Take
I’m not here to tell you to stop using AI. If it’s helping you process, keep going. That curiosity about your inner world is a strength.
But I do want to be honest about what therapy offers that AI can’t—at least not yet.
Therapy is slower. It takes more commitment. It isn't always convenient, and it can't be done entirely on your own. It asks more of you: more vulnerability, more patience, more willingness to sit with things that don't feel good.
That’s why it works.
The therapeutic relationship—the bond between you and someone who truly knows your story—is one of the most powerful ingredients in effective mental health care. It’s not just the techniques. It’s not the advice. It’s the relationship.
AI can simulate empathy and generate supportive language. But it can’t know you—not the way a human can. And when life gets hard, you want someone who already knows what steady looks like for you.
What’s Next
If you’ve been using AI to talk things through, you’ve already taken a step. You’re reflecting. You’re trying. That matters.
The question is whether you’re ready to go deeper.
If you’re not sure, start with our Curiosity Check—five questions, two minutes. It won’t diagnose you or tell you what’s wrong. It’ll help you notice what’s already there.
And if you already know something needs to shift, consider Get Established—a single session to define your baseline, document your ‘normal,’ and have someone ready when life gets loud.
AI affirms the thought. Therapy confirms the feeling.
You’ve been curious long enough. Let’s go deeper.
References
Chiu, Y. Y., Sharma, A., Lin, I. W., & Althoff, T. (2024). A computational framework for behavioral assessment of LLM therapists. University of Washington.
Cho, Y., Kim, S., Kim, M., et al. (2023). Evaluating the efficacy of interactive language therapy based on LLM for high-functioning autistic adolescent psychological counseling. arXiv:2311.09243.