Can AI replace therapy? The short answer is no. While technology is advancing at a breakneck pace, moving from simple text bots to hyper-realistic video avatars, a digital face will never be human. In psychodynamic therapy, the relationship between two people is not a backdrop for the treatment. It is the treatment. Or at least such a large part of the treatment that removing it changes the thing entirely. Therapy stops being therapy when it is not two people.

People are turning to AI for emotional support in record numbers. Currently, about 24% of U.S. adults have used a large language model like ChatGPT for some form of mental health guidance. Apps like Woebot and Replika market themselves as "therapeutic companions" to a population in need.

And yet, there is a telling statistic: 80% of users abandon these apps within two weeks.

I've spent the past six months researching the intersection of AI and psychodynamic psychotherapy. Recently, I presented my findings to our clinical team at Coastal Therapy Group. My conclusion is that AI will never replace a human psychologist, not because the technology isn't sophisticated, but because therapy requires a biological nervous system and a subjective "other" that AI simply does not possess.

The Mental Health Gap vs. The Therapeutic Relationship

The AI-in-psychotherapy market is projected to reach $9 billion by 2032. That is a tremendous amount of money flowing toward people who see an economic opportunity in the gaps of mental health services.

This growth isn't necessarily driven by superior clinical outcomes. It's driven by a crisis of access. Globally, most people with mental health conditions are not receiving treatment. In the U.S., nearly half go without care, and in lower-income countries, the untreated rate reaches 76 to 85%.

But filling a gap and providing therapy are two different things. And I think we're in danger of confusing the two. To understand why, we have to look at what therapy actually is, not just what it looks like from the outside or what it looks like on a balance sheet.

The Relational Model: Why Human Therapy is More Than Information

Therapy is not just "information delivery." If it were, every self-help book would be as effective as a year of sessions.

Josef Breuer discovered this in the 1880s through his treatment of 'Anna O.,' the patient who coined the term 'talking cure.' Freud, learning from Breuer's case, realized that the patient's mind reaches toward the therapist not just for knowledge but as a relational object. This concept, known as transference, became the foundation of modern psychotherapy.

Transference, the process of projecting internal relational patterns onto the therapist, requires two living people. It is a dance of two minds, and you cannot dance with an algorithm.

Bion's Theory of Reverie: Why AI Lacks an "Alpha Function"

Wilfred Bion, a seminal psychoanalytic thinker, described a state he called reverie: a receptive, open attention in which a therapist allows themselves to be genuinely affected by a patient's emotional experience.

But let's be real about what that looks like in practice. It's not just "listening." Bion argued that a patient brings in "beta elements," the raw, unprocessed, and often overwhelming feelings that don't have words yet. They're just sensations or dread. The therapist's job is to take those in, process them through their own mind and body (the "alpha function"), and give them back to the patient in a form that can finally be thought about.

This is where the AI argument falls apart for me. An alpha function isn't a line of code. It's a biological and emotional metabolic process. To transform someone else's dread into something manageable, you have to be able to feel that dread yourself. You have to be a person who knows what it's like to be overwhelmed.

AI doesn't have a pulse. It can't have a "gut feeling" because it doesn't have a gut. When you share something painful with a chatbot, it can mirror your language, but it can't digest the experience for you. It skips the very thing, the somatic, human labor, that actually leads to healing.

The Illusion of the Digital Face: Why Realism Isn't Humanity

We are rapidly approaching a time when your "AI therapist" won't just be a text bubble. It will be a high-definition video avatar that can mimic micro-expressions, or perhaps a lifelike robot sitting across from you. The industry is obsessed with perfecting the digital face.

But we need to be clear: a face is only a "face" in the therapeutic sense if there is a subject behind it looking back at you.

In my work, I look at a patient's face not just to decode their "data," but to feel the impact of their experience on my internal world. When an AI "mimics" a facial expression, it is running a statistical computation over patterns in its training data. It isn't actually being moved by you. A digital face can simulate empathy, but it cannot possess it. As AI becomes more visually indistinguishable from humans, the "Wire Mother" problem (which I turn to next) actually becomes more dangerous, because the wire is hidden under a very convincing layer of silicone and pixels.

AI is the "Wire Mother": Why Information Isn't Enough

In Harry Harlow's famous (and heartbreaking) experiments, infant monkeys were given a choice between two surrogate mothers: one made of cold, hard wire that provided milk, and one made of soft cloth that provided nothing but comfort.

The monkeys chose the cloth. They chose contact over nourishment.

AI is the wire mother. It delivers the "milk" of coping strategies, CBT exercises, and reflective language, but it has no cloth. It has no warmth, no pulse, and no genuine presence. Humans, like those infant monkeys, instinctively know the difference between being fed and being held.

The story of Harlow's monkeys did not end well. The monkeys raised by wire surrogates developed clear, lasting psychological distress. It seems to me that as a species, we are running the "wire monkey experiment" all over again. We are attempting to see if a material surrogate can replace the biological necessity of another human being.

Humans need other people to develop, grow, and heal. It is part of our evolutionary nature. And yet, it seems we are going to have to learn this lesson again, only this time at a global scale.

Why Your Therapist "Surviving" Your Worst Moments Matters

D.W. Winnicott, a giant of the psychoanalytic tradition, wrote that for a person to grow, they need to feel they can psychologically "destroy" their therapist and have that therapist survive, not collapse, and not retaliate.

But AI "survives" everything, and that survival is clinically meaningless. There is nothing at stake. You cannot test the boundaries of a relationship with an entity that was never alive to begin with. AI can make you feel "seen" in a superficial way, but it lacks the other side of that dyad: the possibility that the other person might leave you, be overwhelmed by you, or struggle to tolerate you.

For many patients, this hollows out the whole project of AI therapy.

Consider a person who enters therapy for lack of confidence, panic attacks, or an inability to maintain a romantic relationship. Often, the presenting issue isn't the real problem. The real problem is a felt belief that their feelings are simply "too much," that if someone were to truly see their inner world, they wouldn't stick around.

Because everyone knows an AI cannot choose to leave, it can never touch that core wound. It cannot provide the "embodied belief" that you are survivable. Instead, the AI stays on the surface, offering clichés or "hallucinating" empathy. It wastes the patient's time by avoiding the very interpersonal risk that leads to healing.

It is no wonder so many people quit talking to AI chatbots within two weeks. They aren't looking for something programmed to stay. They are looking for a human being who chooses to stay.

Mutual Recognition: Why You Can't Have a Relationship with a Mirror

Jessica Benjamin, a contemporary psychoanalytic theorist, builds on Winnicott's insight about survival, but she adds the other half of the picture.

For Winnicott, the story is told mostly from the patient's side: the patient "destroys" the therapist, the therapist survives, and through that survival, the therapist becomes real. But Benjamin asks a deeper question: what is happening on the therapist's side? Her answer is that survival alone isn't enough. For the work to be transformative, the therapist has to be recognized as a subject, a separate person with their own mind, their own history, and their own experience of what just happened in the room. Benjamin calls this Mutual Recognition: two people, each a separate center of experience, each capable of being affected by the other.

That mutuality is what gives a therapist's survival its meaning. If the therapist is just a sturdy, inanimate object absorbing your "destruction," the survival feels empty. The real healing comes from the discovery that there is a real person over there, someone who actually felt the impact of your anger or your grief and chose to stay. That is what changes a patient's relationship to the rest of the world.

This is where the problem with AI therapy comes into sharpest focus. In Benjamin's terms, AI exists in a permanently complementary structure. One party is always the subject (you), and the other is always the instrument (the AI). The roles never reverse. The AI never pushes back from a place of its own lived experience. It never surprises you with a perspective that could only come from a separate, unpredictable mind.

In a human therapy room, a "Thirdness" emerges, a shared space that belongs to neither person alone, but is created by both. AI cannot create "Thirdness" because it has no subjectivity to contribute.

Without that, the patient never gets to discover that another person can be real, separate, and still present after conflict. They get a mirror that reflects their words back in a warmer, "therapeutic" tone. And as I often tell our team, a mirror, no matter how sophisticated the algorithm behind it, is not a relationship.

Where AI Can Help (and Where It Stops)

I am not anti-technology. AI can be a helpful tool for:

  • Psychoeducation and mental health literacy.
  • Guided breathing and mood tracking.
  • Providing basic support for those with zero access to care.

But a chatbot is not a therapist. It cannot be a container for your soul. It cannot be affected by you.

Conclusion: The Biological Necessity of the Other

What makes therapy work is biological and relational. It is not a feature set that can be optimized. Transference, containment, survival, recognition: these are not processes that happen in language alone. They happen between two nervous systems in the same room. A therapist who feels their chest tighten when a patient describes something they haven't put words to yet. A patient who rages at their therapist for going on vacation and discovers, the following week, that the therapist is still there, still willing, still a person. None of that can be coded, and none of it can be faked. The patient knows the difference, even when they can't articulate it. That is why 80% of people stop talking to AI within two weeks. They went looking for a person and found a program.

If we return to Harlow, the lesson was never that the monkeys were stubborn for wanting the cloth mother. The lesson was that they needed the cloth to become functional, social beings. Contact comfort was not a preference. It was a biological requirement. By substituting algorithms for therapists at scale, we are running Harlow's wire mother experiment again, only this time on ourselves, and at the size of a global population.

Therapy is a protest against that substitution. It insists that you cannot heal in isolation, that information is not intimacy, and that no amount of processing will ever replace being met by another person. We don't just need to be processed. We need to be met.

Finding Real Connection in North County San Diego

What people are searching for when they look for therapy is another person, someone who will stay, someone who can be moved by them, and someone who will survive the process of change.

This is the core of psychodynamic therapy. At Coastal Therapy Group, we prioritize the deep, intimate, and deliberate work of sitting with another person. We don't offer scripts or algorithms. We offer a relationship.

If you are looking for a therapist in Encinitas, Carlsbad, or Vista who will truly be in the room with you, we invite you to reach out.

About the author

Reid Kessler, Psy.D. is a licensed psychologist (PSY29855) and the owner of Coastal Therapy Group, a relational psychodynamic group practice with offices in Encinitas, Carlsbad, and Vista. He earned his doctorate from Rosemead School of Psychology and has practiced for about a decade from a relational psychoanalytic orientation. Reid served on the San Diego Psychological Association board in 2024 and is the Clinical and Training Director at CTG, where he oversees a team of psychologists and postdoctoral fellows.

Reach out to work with our team.


Frequently Asked Questions

Can AI chatbots replace a therapist?
No. While AI can provide information, it cannot provide the relational experience or somatic resonance that makes therapy effective. Therapy relies on two people affecting each other, and AI has no subjective experience to bring to the exchange.

Is AI therapy effective?
AI tools are effective for "low-level" tasks like mood tracking or learning CBT-based coping skills. However, for deep emotional work and personality change, research suggests the "therapeutic alliance" (the human bond) is the strongest predictor of success, and that alliance is something AI cannot form.

Why do people quit AI therapy apps?
Statistics show 80% of users abandon these apps within two weeks. This is likely because the interaction lacks the depth, reciprocity, and genuine human contact required for a person to feel truly "known."

What if I am considering AI therapy because it is significantly cheaper?
This is a real and painful dilemma for many, but there is a human alternative that is often overlooked: group therapy. Group therapy is human-centric, clinically effective, and significantly more affordable than individual treatment. While talking to an AI might feel "easier" because there is no social risk, that lack of risk is exactly why it lacks impact. Group therapy provides the very thing AI cannot: a room full of real subjects. We believe 2026 is the year group therapy makes its cultural comeback as the antidote to digital isolation.

© 2025 Coastal Therapy Group, APC. All Rights Reserved.

We are a group practice of psychologists who work as therapists in Carlsbad, Bressi Ranch, Encinitas, & Vista. Many of our clients come from the surrounding towns of Oceanside, San Marcos, Escondido, Rancho Santa Fe, Solana Beach, and Del Mar.