Why Shouldn’t I Use AI for Psychotherapy?
It’s no secret that many people have started turning to AI chatbots for emotional support and for solutions to personal problems.
Given AI’s 24/7 accessibility, the promise of “free therapy”, and the perceived anonymity of chatbot conversations, it is hardly surprising that AI engagement is supplanting some traditional person-to-person therapy sessions.
The barriers to traditional talk therapy and mental health support can be significant. People may take their personal struggles to AI because private psychotherapy and counselling seem inaccessible due to cost or other practical barriers such as time and distance. And some will feel safer engaging with a chatbot than facing the possible discomfort of opening up to a stranger, even though opening oneself to emotional challenge is itself key to the gains of psychotherapy.
However, AI “therapy” carries not only significant limitations but also potentially serious health and safety risks. For many reasons, human therapists remain indispensable: AI cannot supplant real, licensed, and extensively trained clinicians.
What Makes Professional Therapy Healthier than AI Support?
Genuine empathy, the capacity to listen deeply, and the commitment to meet another person with real curiosity cannot be replicated by AI, which combines scraped information with mirroring to produce a facsimile of human interaction.
Because AI is programmed to reflect and even amplify whatever the human questioner brings to the exchange, such interactions carry inherent risk: AI can reinforce the very thought patterns and behaviours a person is seeking help with. A human therapist, by contrast, is mindful and discerning, bringing curiosity and an ongoing assessment of vulnerability into every client interaction. This is crucial when people are struggling with mood, distorted self-perceptions, and maladaptive patterns carried over from early life experience.
A human therapist offers key interpersonal capacities that AI cannot:
- clinical judgment
- attunement, not only to amplify but, when necessary, to decelerate emotional intensity
- the ability to perceive and interpret subtle emotional cues and body language
- discernment for personal patterns, both adaptive and maladaptive
- appreciation for the specifics of a client’s lived context, which has shaped and continues to shape their experience
A live psychotherapist validates selectively, drawing attention to and exploring potentially harmful ideas and behaviour patterns with their client. Human therapists ideally practice with humility and challenge their clients thoughtfully, knowing they “don’t know what they don’t know”; this is something AI cannot do. ChatGPT does not communicate uncertainty, and that false confidence can be profoundly problematic in a therapeutic context. AI chatbots are not collaborative!
Most importantly, a real therapist tailors each session to a singular individual, adapting general assessment and treatment approaches to that person.
What Could Go Wrong? Worst-Case Scenarios with AI
We all know about AI “hallucinations”: moments when a chatbot confidently invents things. First-hand stories and reports are accumulating that show AI can become a dangerous echo chamber for people caught in periods of delusional thinking.
On Reddit, for example, individuals have posted descriptions of loved ones becoming unmoored after unboundaried interaction with AI. Interactions gone awry have led people to think of themselves as chosen, as gods or gurus, or to believe they have found new and better friends or partners in their AI chatbots.
Individuals at particular risk include:
- Individuals with thin or fragile ego boundaries, who may be unable to distinguish internal impressions from external stimuli
- Those with narcissistic or messianic vulnerabilities, where AI responses may collude with latent fantasies of chosenness or omnipotence
- People experiencing derealization or dissociation, who may find in AI a false anchor that intensifies their estrangement from shared reality
- Individuals experiencing suicidal or self-harming impulses, given anecdotal reports of AI responses that steered the person’s thinking and actions toward, rather than away from, further harm
Without a human professional available for actual assessment, diagnosis, and personalized, empathetic support, non-life-threatening struggles may become critically dangerous ones.
So, What Is AI Good For, Then?
While AI substitutes poorly for real therapy with a human therapist, it can be useful if used within boundaries. Safe, useful, and healthy ways of using AI in the realm of personal concerns could include:
- Information gathering about a particular psychological or emotional concern
- Stocking a personal toolkit with tips and coping strategies
- Researching therapy modalities to determine a good fit
- Practicing for difficult interpersonal interactions
The accessibility of AI, its cost-effectiveness, and its 24/7 availability make it likely that AI will play an ongoing role in the field of mental health. Right now, the rule is user beware: there are few to no guardrails against the dangers AI-based therapy poses. Sycophantic interactions with ChatGPT and other forms of AI cannot help individuals with meaningful and singular human challenges, particularly in an era of burgeoning mental health crises. Whether AI will figure as a useful adjunct in the field or open a Pandora’s box of consequences for general human wellbeing is not yet known.