The AI Therapist Trap: Why Indians Use ChatGPT for Mental Health Instead of Therapy

By Synapse Mental Wellbeing | March 2026 | 8 min read
[Image: an AI therapy chatbot being used]


You've had a bad day. Perhaps it was an argument with someone you care about, or an anxious cycle of thoughts in the small hours of the morning, or that heaviness you can't quite put your finger on. You open your phone, start typing a message to a chatbot, and it listens with perfect attention, patience, and no judgment.

 

It offers you advice that is well thought out, that acknowledges your feelings, and that even sounds legitimate.

 

But the real question that we should be asking ourselves is this: is it actually helping you, or is it simply comfortable?

 

This is not an essay arguing that AI is bad or that technology has no place in psychology. The truth is far more complex than that. The question we are asking is more precise: when does using an AI chatbot become a substitute for the support you actually need, and what does the research say about this phenomenon?



First, Let's Acknowledge Why People Are Turning to AI

[Image: a person holding a phone with a mental health chatbot open]

The appeal is completely understandable. India is in the middle of a serious mental health crisis, and access to care remains deeply limited.

According to the Government of India's Ministry of Health and Family Welfare, 70% to 92% of people with mental disorders in India receive no formal treatment. The reasons are familiar: stigma, cost, long wait times, and a devastating shortage of professionals.

India has only 0.75 psychiatrists per 100,000 people, far below the WHO's recommended minimum of 3 per 100,000. In Bengaluru, a city of over 12 million people, the wait time for a first therapy appointment can stretch to weeks.

"India carries one of the world's highest mental health treatment gaps. Despite national programme expansions, large proportions of people with common and severe mental disorders remain undiagnosed or untreated." - Indian Journal of Psychiatry, PMC 2025

Against this backdrop, an AI that is available at midnight, asks how you're feeling, and never makes you feel judged? It's not surprising that millions are turning to ChatGPT for mental health. A 2024 report found that mental health chatbot research quadrupled from 14 studies in 2020 to 56 studies in 2024, reflecting just how fast this space is exploding.



What the Research Actually Says: The Honest Picture

Let's be fair to the research here, because the picture is genuinely mixed - and that complexity matters.

The promising side: In March 2025, Dartmouth College published the first-ever randomised controlled trial of a generative AI therapy chatbot, called Therabot. The results were striking. People diagnosed with depression experienced a 51% average reduction in symptoms. Those with generalised anxiety showed a 31% reduction. Participants said they could trust and communicate with the system at a level comparable to working with a mental health professional.

But here is what the researchers themselves were careful to emphasise. Dr. Nicholas Jacobson, who led the trial, told Dartmouth News:

"There are a lot of folks rushing into this space since the release of ChatGPT, and it's easy to put out a proof of concept that looks great at first glance, but the safety and efficacy is not well established. This is one of those cases where diligent oversight is needed."

The trial was conducted under close clinical supervision. The research team was on standby to intervene if any participant expressed suicidal ideation or if the chatbot responded outside of best therapeutic practices. In the real world, when someone opens a chatbot app alone at 2 AM, that safety net does not exist.

A 2025 systematic review of 160 studies on AI mental health chatbots found that while LLM-based tools surged to 45% of new research in 2024, only 16% of those studies underwent clinical efficacy testing. The majority, 77%, were still in early validation stages. The review concluded there is a critical gap in robust validation of therapeutic benefit.

And a meta-analysis of chatbot interventions for young people found something telling: chatbots significantly reduced distress scores in the moment, but had no statistically significant effect on psychological well-being. Feeling slightly less bad right now is not the same as actually getting better.



The Safety Gap Nobody Talks About

Perhaps the most urgent concern is around safety, particularly for people in acute distress.

Research published in JMIR Mental Health found that a significant proportion of AI chatbots endorsed harmful proposals when tested with messages from distressed teenagers. These were not fringe apps; they were widely available consumer tools.

This matters because people in crisis are precisely the ones most likely to reach out to an AI. They may be reaching out because they feel they cannot tell a real person. And if the AI responds poorly, by minimising, misdirecting, or worse, the consequences can be severe.

"A chatbot that performs well in scripted tests may still fail in real-world empathy or crisis management. Short-term usability does not guarantee long-term adherence or relapse prevention." - 2025 Systematic Review, PMC

Skilled human therapists are trained for exactly this: to read what is not being said, to pick up on shifts in tone, to hold space for the things clients cannot yet articulate. That capacity, built over years of clinical training and lived human experience, cannot be replicated by pattern-matching on text.



What Using ChatGPT for Mental Health Cannot Provide: The Therapeutic Relationship

[Image: an in-person therapy session]

There is a concept in therapy called the therapeutic alliance, the relationship of trust and collaboration between a client and their therapist. Decades of research show it is one of the strongest predictors of whether therapy actually works, independent of the specific techniques used.

A 2024 study found that users expressed genuine discomfort with AI's ability to handle their emotions and personal data, and that this uncertainty could undermine openness and overall treatment efficacy. Interestingly, research also found that when users knew they were talking to an AI, the emotional benefit was measurably reduced, even when the responses were identical to a human's.

A human therapist does not just respond to your words. They notice if you've been quieter than usual. They remember what you said six sessions ago and gently connect it to something you said today. They sit with you in silence when silence is what's needed. They bring their own humanity to the room.

None of this is a technology problem. It is a human problem, and some problems can only be met by another human.



So When is AI Useful in Mental Health?

It would be dishonest to say AI has no role to play. For a country like India with an 84.5% mental health treatment gap, digital tools can be meaningful bridges. Research suggests AI tools are most useful in specific, bounded contexts:

1. As a supplement to real therapy - not a replacement. The Dartmouth Therabot trial was most credible precisely because it was researcher-supervised. AI used alongside professional care, to help between sessions, may genuinely support recovery.

2. For mild, early-stage distress. If you are having a hard week and want to reflect or structure your thoughts, AI can help. If you are experiencing clinical anxiety, depression, trauma, or a crisis, it cannot.

3. For psychoeducation. Learning about what anxiety is, how CBT works, or what to expect from your first therapy session - AI can do this reliably and accessibly.

4. As a first step, not a last resort. If talking to an AI helps you put words to what you're feeling and builds the courage to book a real session, that is genuinely valuable.



The Deeper Question Worth Asking Yourself

The next time you find yourself opening an AI chatbot instead of reaching out to a person - a friend, a therapist, a family member - it's worth pausing and asking: Am I using this because it's helpful, or because it's easier?

Comfort and healing are not the same thing. An AI will never push back on you the way a skilled therapist does when they notice a pattern you haven't seen yourself. It will never sit with you in the weight of a difficult silence. It will never, in a moment of crisis, make a professional judgment call that keeps you safe.

Venting to ChatGPT can feel good. It can release the pressure in the moment. But mental health is not about reducing pressure momentarily - it is about building the capacity to understand yourself, process your experiences, and grow. That kind of work requires relationship. And relationship, by definition, requires another person.



A Note from Synapse

We understand that access to therapy in India is genuinely difficult. Cost, stigma, availability - these are real barriers, and we take them seriously. That's why we offer flexible 'Pay What You Can' programmes and pro bono therapy for those who need it.

If you have been relying on an AI chatbot for emotional support and have been wondering whether you need something more, the answer is probably yes, and that is okay. Reaching out is the brave part.

We are here when you are ready: SynapseMentalWellbeing.com or WhatsApp +91 91488-05435.


References

  1. PMC — Systematic Review of AI Mental Health Chatbots (2025): https://pmc.ncbi.nlm.nih.gov/articles/PMC12434366/

  2. Dartmouth / NEJM AI — First RCT of Generative AI Therapy Chatbot, Therabot (2025): https://home.dartmouth.edu/news/2025/03/first-therapy-chatbot-trial-yields-mental-health-benefits

  3. JMIR Mental Health — Chatbots & Adolescent Safety (2025): https://mental.jmir.org/2025/1/e78414

  4. Government of India, Ministry of Health — Mental Healthcare Access Data: https://www.mohfw.gov.in

  5. PMC — Bridging India's Mental Health Treatment Gap (2025): https://pmc.ncbi.nlm.nih.gov/articles/PMC12468826/

  6. Business Standard — 197 Million Indians Need Mental Health Support (2025): https://www.business-standard.com/health/197-million-indians-need-mental-health-support

  7. American Psychoanalytic Association — Are Therapy Chatbots Effective? (2025): https://apsa.org/are-therapy-chatbots-effective-for-depression-and-anxiety/

  8. Image by Mohamed Hassan from Pixabay: https://pixabay.com/users/mohamed_hassan-5229782/
