USING AI CHATBOTS AS PERSONAL THERAPISTS IS ON THE RISE
AI chatbots are increasingly being used as personal therapists. But as the trend grows, so do the concerns, especially from the psychology community.
Psychology is a regulated profession. Therapists undergo years of training not just to understand mental health, but also to handle sensitive, confidential information with care. One of the first steps in working with a client is to clearly explain how their personal information will be stored, used, and protected. This isn’t just protocol; it’s an ethical responsibility.
Right now, AI-based tools fall short of those ethical standards. These are serious gaps, especially when AI is used in high-risk situations.
We need stronger protections for users, especially our youth, who are often the first to turn to AI for support and are the most comfortable engaging with it.
But here’s the challenge:
We still don’t fully understand the psychological impact of AI as a therapeutic tool. That’s why we need more research and investment to assess the long-term effects, particularly on younger generations.
Star Study: Therapy vs. AI Chatbot
Research Design
- Randomized controlled trial in Ukrainian war zones with 104 women diagnosed with anxiety disorders.
- Control group: Traditional therapy, 60-minute sessions with licensed psychologists, three times per week.
- Experimental group: Daily access to the AI chatbot Friend, using natural language processing and machine learning derived from CBT and motivational interviewing.
- Anxiety levels assessed using validated measures: the Hamilton Anxiety Rating Scale and the Beck Anxiety Inventory, both at baseline and after eight weeks.
AI in Action
- The Friend chatbot offered 24/7, on-demand support using therapeutic techniques that don’t rely on emotional rapport.
- Human therapists offered traditional in-person or video sessions, rooted in relational work and clinical judgment.
Results
- Both groups saw significant reductions in anxiety.
- Therapist group: 45% reduction in Hamilton scores; 50% reduction in Beck scores.
- Chatbot group: 30% reduction in Hamilton; 35% reduction in Beck.
The improvement was notably greater in the therapist group.
Why It Matters
This trial highlights AI’s potential, especially when in-person therapy isn’t available, but also underscores that human therapists remain more effective in reducing anxiety, particularly in high-risk conditions.
AI Is Not a Substitute for Professional Clinical Assessment
- Emotionally distressed or vulnerable individuals may not receive the care they truly need from AI tools.
- AI is like a self-help book: it offers ideas, not deep understanding.
- It lacks warmth, emotional depth, cultural sensitivity, and ethical judgment.
Therapy Isn’t Black and White, and AI Doesn’t Understand Grey
- Mental health often lives in the grey areas: trauma, identity, grief, complex relationships.
- AI can’t grasp these nuances the way trained therapists can.
- It may offer advice that feels helpful in the moment but fails to meet deeper psychological needs, or, worse, reinforces harmful beliefs.
AI Can’t Assess Risk or Intervene in Crisis
- AI tools give comforting responses, but they cannot detect real-time emotional risk.
- They don’t understand personal history or emotional context.
- In moments of deep distress, users may rely on AI instead of seeking qualified care.
Confirmation Bias and Emotional Harm
- AI may unintentionally reinforce negative thoughts, mirroring users’ darkest beliefs.
- Instead of challenging unhelpful thinking, it might echo it back.
- What’s needed in crisis is professional guidance, escalation pathways, and support tailored to the individual—not automated replies.
What Makes Therapy So Transformative?
- A therapist picks up on emotional nuances and hidden patterns.
- They are deeply intuitive and empathic, helping clients uncover root causes—often tied to past experiences or unprocessed emotions.
- They create a judgment-free space, where clients feel truly seen and safe to show up as their full selves.
Healing Through Human Connection
- A great therapist offers personalized techniques tailored to each client’s needs.
- They offer a form of unconditional support—a rare, consistent presence that fosters trust.
- Through this safety and presence, clients can begin to heal:
  - From past and present trauma
  - Toward their true, authentic self
  - Into a life aligned with their deepest potential
The Importance of Human Experience in Therapy
- At the heart of therapy is the emotional connection between practitioner and client.
- Therapists use their skills to deeply understand what the client is going through.
- Clients benefit from their support and open-ended questions.
- Chatbots can hold long conversations and ask follow-up questions, but they can’t understand context.
- For those needing real clinical treatment, this lack of depth is a disadvantage.
- Only a therapist can consider a client’s background, emotions, and lived experience to offer personalized treatment.
- AI relies on algorithms and data; it lacks the human creativity to turn insights into real-life plans.
- Mental health isn’t black or white. It has grey areas too, and AI doesn’t understand grey. It’s logical, not emotional.
- Clients often need help with complex ethical dilemmas, and only professionals can offer guidance grounded in training and ethical principles.
- When it comes to sensitive matters, AI lacks moral and ethical judgment.
Dynamic Adaptation:
As clients progress through treatment, their needs change; that’s why therapists adapt their approach over time. They can adjust strategies, interventions, and techniques to suit evolving situations.
The Ethical Implications of AI in Therapy
- Do AI therapist tools adhere to healthcare quality and privacy standards? There still isn’t enough data to say for certain. AI has a long way to go before it can provide the kind of ethical treatment clients need.
- Because mental health isn’t just a map of symptoms; it’s about connection, trust, and the courage to heal.
What are your thoughts on the boundaries AI should never cross in mental health support?