
AI Therapy: How ChatGPT Became My Friend's Breakup Counselor

  • Writer: Dr. Jennifer Chang Wathall
  • Aug 5
  • 4 min read
Image by Flux Pro 1.1 Ultra

Breaking up is hard. We all know this. But watching my closest friend navigate the emotional wreckage of their recent split has revealed something unexpected: technology stepping into a deeply human role.


My friend turned to ChatGPT in a moment of late-night desperation. What started as casual questions evolved into deeper conversations about their relationship dynamics, their ex's behavioral patterns, and their own emotional responses.

What surprised me most wasn't that they were using AI; it was how effective it seemed to be. The advice was measured, thoughtful, and remarkably balanced. It helped them identify patterns they couldn't see while emotionally entangled. The AI didn't take sides but helped contextualize behaviors in ways that made sense.


I was deeply skeptical at first. I had read the research showing that therapy and emotional support have become among the top uses of AI, yet I never thought it could actually work. How could algorithms and probability distributions possibly understand the nuances of human heartbreak? But watching my friend process their grief through these conversations challenged my assumptions. The AI offered a judgment-free space where they could express raw emotions without fear of burdening others or receiving biased advice.


There's something uniquely valuable about having an entity that never tires of your story, never gets emotionally exhausted by your pain, and maintains consistent emotional equilibrium while you're riding the breakup rollercoaster.


The Research Behind AI Therapy


Research increasingly supports what my friend experienced firsthand. A comprehensive 2023 systematic review published in npj Digital Medicine analyzed 15 studies involving 1,744 participants and found that AI-based conversational agents significantly reduce psychological distress and depression symptoms, with effect sizes comparable to many traditional interventions (Li et al., 2023). Perhaps most striking is a 2025 survey by Sentio University revealing that 49% of people with ongoing mental health conditions now use large language models like ChatGPT for support, with 63% reporting improved mental health outcomes (Rousmaniere et al., 2025).

The data shows people primarily seek AI help for anxiety (79.8%), depression (72.4%), and stress (70%), with 90% citing accessibility and 70% citing affordability as key motivations. What's particularly noteworthy is the sustained engagement: 64% of users continue using AI for mental health support for four months or more, showing higher retention rates than typical digital mental health applications. While 9% of users report encountering inappropriate responses, highlighting the need for caution, the overall picture suggests AI therapy has moved from experimental curiosity to a significant mental health resource that millions are already using.

Additional research from the Journal of Medical Internet Research found that users of AI therapy apps like Youper experienced significant reductions in anxiety (Cohen's d = 0.57) and depression symptoms (Cohen's d = 0.46) within just two weeks of use, with high user satisfaction ratings averaging 4.36 out of 5 stars (Mehta et al., 2021).
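For readers unfamiliar with effect sizes, Cohen's d is the standard way such improvements are reported: the difference between two mean scores (here, symptoms before versus after using the app) divided by their pooled standard deviation. A minimal sketch of the usual two-sample formula:

```latex
% Cohen's d: standardized mean difference between two sets of scores
% (e.g., symptom scores before vs. after treatment).
\[
d = \frac{\bar{x}_1 - \bar{x}_2}{s_p},
\qquad
s_p = \sqrt{\frac{(n_1 - 1)\,s_1^2 + (n_2 - 1)\,s_2^2}{n_1 + n_2 - 2}}
\]
```

By the common rule of thumb (0.2 small, 0.5 medium, 0.8 large), the anxiety result of d = 0.57 is a medium-sized effect: scores improved by a little over half a standard deviation.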



Why AI Therapy Works


I am not saying that we should replace human connection with AI; my friend still needs our late-night calls and in-person hugs. But it has become a complementary tool in their healing journey, available at 3 AM when the rest of their support system is asleep.

Perhaps what AI lacks in lived experience, it makes up for in pattern recognition and emotional neutrality. The research suggests several key factors contribute to AI therapy's effectiveness:


• Consistent Availability: Unlike human therapists with limited hours, AI provides 24/7 support when emotional crises often occur

• Reduced Barriers: No scheduling, no insurance requirements, no geographical limitations

• Non-judgmental Responses: AI doesn't experience fatigue, frustration, or personal bias

• Pattern Recognition: Advanced algorithms can identify behavioral and emotional patterns that might be difficult to see from within the situation

• Cost Accessibility: Free or low-cost options make mental health support available to those who couldn't otherwise afford it


Using AI Therapy Safely and Effectively


While the research is promising, it's important to approach AI therapy thoughtfully. Here are some guidelines based on current evidence:

Best Practices:

• Be specific about your concerns and symptoms when interacting with AI (for example, "I haven't slept well since my breakup three weeks ago and keep replaying our last conversation" is more useful than "I feel bad")

• Use AI as a complement to, not a replacement for, professional mental health care

• Ask for coping strategies, perspective-taking exercises, and practical advice

• Verify important insights or advice with qualified professionals

Safety Considerations:

• AI cannot handle mental health crises or suicidal ideation appropriately

• Some responses may be inappropriate or potentially harmful (reported by 9% of users)

• Privacy concerns exist when sharing personal mental health information

• AI lacks the nuanced understanding and ethical training of licensed therapists


Watching this unfold has made me reconsider what "therapy" can look like in our increasingly digital world. The evidence suggests we're witnessing the emergence of a new form of mental health support that, while not perfect, is filling critical gaps in accessibility and availability.



References


Li, H., Zhang, R., Lee, Y. C., Kraut, R. E., & Mohr, D. C. (2023). Systematic review and meta-analysis of AI-based conversational agents for promoting mental health and well-being. npj Digital Medicine, 6(1), 236. https://www.nature.com/articles/s41746-023-00979-5


Mehta, A., Niles, A. N., Vargas, J. H., Marafon, T., Couto, D. D., & Gross, J. J. (2021). Acceptability and effectiveness of artificial intelligence therapy for anxiety and depression (Youper): Longitudinal observational study. Journal of Medical Internet Research, 23(6), e26771. https://www.jmir.org/2021/6/e26771/


Rousmaniere, T., Zhang, Y., Li, X., & Shah, S. (2025). Large language models as mental health resources: Patterns of use in the United States. Practice Innovations. https://sentio.org/ai-blog/ai-survey

 
 
 
