One evening, a client walked into my session holding a printed conversation from an AI chatbot. “I asked it to psychoanalyse me,” he said, half-proud, half-curious. “And honestly, some parts made sense… but some things felt way off.”
He found comfort in the immediacy and ease of pouring his thoughts into a chatbot that wouldn't judge him.
Artificial Intelligence in mental health care is no longer science fiction. Tools that once seemed futuristic now offer constant availability, instant replies, and affordability—especially attractive to those hesitant or unable to access traditional therapy. Clients often feel a sense of relief when AI “listens” without interruption, and there’s no need to re-explain histories in every session. For people managing mild distress or seeking daily emotional check-ins, AI can serve as a helpful companion or mental hygiene tool.
But the story changes when we look deeper.
That same client who was fascinated by AI’s insights also said, “It felt like it knew me, but didn’t get me.” And therein lies the core issue. AI can produce insightful-sounding text, but it lacks judgment—the nuanced discernment that trained therapists use to navigate a client’s emotional landscape, especially when that client’s thinking is clouded by distress, trauma, or impaired insight.
AI cannot read tone, facial expressions, or subtle shifts in body language. It cannot sense when silence means something, or when a tearful pause deserves gentle space rather than a prompt. And while clients may unconsciously project feelings onto AI—classic transference—the machine has no means to notice or contain that projection, potentially leading to unprocessed emotional entanglement.
Most importantly, therapy is about two minds thinking together—collaboratively exploring the client’s inner world, challenging unhelpful patterns, and nurturing a sense of connection and safety. Research consistently shows that the therapeutic relationship itself is one of the strongest predictors of positive outcomes in psychotherapy, regardless of the theoretical orientation used. This alliance fosters emotional regulation, secure attachment, and the ability to form healthier relationships outside the therapy room.

In fact, therapy often begins with the client taking a first brave step out of social isolation—despite the discomfort of leaving home or engaging with others—and that act itself marks a critical shift toward healing and reconnection. After all, therapy offers an opportunity for reconnection, which is not just a psychological desire but a biological need. When clients replace this step with AI interactions, however, they may inadvertently reinforce avoidance of real-world contact—missing the very social exposure and relational growth that face-to-face therapy helps foster. In this way, AI can quietly hamper the very process therapy is meant to initiate.
As therapists, we do more than reflect language—we hold pain, offer grounding, and walk beside our clients in their darkest moments. We help develop insight, repair ruptures, and create space for emotional truth. AI cannot replicate that kind of presence.
Even AI developers themselves tend to emphasize that the goal is not to replace human clinicians but to augment them. AI can help streamline intake processes, reduce administrative burden, and provide psychoeducational support. These are important contributions—ones that can free therapists to focus more on the core relational and clinical aspects of care.
In conclusion, AI may serve as a helpful resource—but it is not a therapeutic relationship. While it offers convenience and accessibility, it lacks the attunement, empathy, and dynamic responsiveness essential to psychological healing. Therapy is not a transaction of advice—it is a process of relational healing, built on human connection.
From the Desk of
Sakshi Dhawan
Counseling Psychologist