In a world where anxiety is just a notification away, mental health apps promise instant comfort — a chatbot that listens, a mood tracker that advises, an AI coach that “cares.” It does sound convenient, but can a machine ever truly understand the human mind? As artificial intelligence enters therapy rooms, the line between emotional support and automation becomes increasingly blurred.
According to Dr Paramjeet Singh, Consultant Psychiatrist at PSRI Hospital, Delhi, AI therapy might be a lifeline for some, but also a reminder that no code can replace compassion.
Can AI therapy apps really match up to a human therapist?
Not quite. According to Dr Malini Saba, psychologist, human and social rights activist and founder of the Saba Family Foundation, AI therapy apps are like “a quick friend in your pocket.” They are great for short-term stress — calming exam jitters, easing workplace anxiety before a big meeting, or taking the edge off a moment of feeling overwhelmed at home. Apps with mood trackers, guided meditation, and simple Cognitive Behavioural Therapy (CBT) modules can be helpful in those moments.
However, AI cannot pick up subtle cues. It does not notice the slump in your shoulders, the tremor in your voice, the pauses in your speech, your sighs, the fidgeting, or even cultural expressions — things a human therapist instantly registers.
“Most communication is not words,” says Dr Saba. “A person in one situation may say, ‘I’m tired all the time,’ but really mean she’s lonely or anxious. Another person in a different situation may express it differently. AI doesn’t understand those layers.”
Dr Singh adds, “AI misses eye contact, pauses, nods, patience — everything that builds trust in therapy.”
“Life is not just a few moments of stress. Deep-rooted sadness, trauma, or family conflict needs human empathy,” says Dr Saba.
Dr Singh agrees, “AI may be useful for mild stress and anxiety, but moderate to severe conditions like depression or PTSD (post-traumatic stress disorder) require human judgment, clinical training, and compassion that no algorithm can mimic.”
So, while chatbots can give structured responses, they lack the warmth and intuition that make therapy truly healing.
Can AI bridge the mental health care gap in India?
India has a severe shortage of mental health professionals. According to the Ministry of Health and Family Welfare (MoHFW), India has 0.75 psychiatrists per 100,000 people, whereas the World Health Organization (WHO) recommends at least three per 100,000. For city dwellers, this means long waitlists; for rural India, it may mean no access at all.
AI apps could help, to an extent.
A student in a small town can use a free meditation app. A stressed corporate employee may turn to a chatbot for quick relief between meetings. Post-Covid, online platforms have already changed the game by making consultations accessible, Dr Singh points out.
However, both experts warn that AI therapy is not a cure-all — it is a patch, not a permanent fix.
What about privacy and ethics — are our emotions safe with AI?
When you type your deepest fears into an app, where does that data go? Both experts caution against privacy breaches, biased algorithms, and the lack of accountability. “If something goes wrong, who takes responsibility? Humans can, machines cannot,” says Dr Saba.
Dr Singh adds that India currently lacks a strong legal framework for AI therapy tools. “The Mental Healthcare Act 2017 does not cover AI therapists. In Europe, AI bots are considered high-risk. In India, regulation is patchy at best,” he explains.
Is there a risk of becoming too attached to AI chatbots?
Absolutely. Imagine checking your AI app every time you feel low instead of calling a friend. Dr Saba compares it to “talking to a mirror and thinking it’s a friend.”
Dr Singh says he has even seen cases where people develop an unhealthy dependency on chatbots, believing the machine “understands” them — which can worsen isolation.
The experts stress that AI can comfort you for a moment, but it cannot replace a real human connection. AI should complement, not replace, human therapy.
Dr Singh explains that AI could play a role in early triage — tracking moods, spotting risks, or handling basic CBT exercises — allowing therapists to focus on deeper, more complex issues.
Dr Saba emphasises the need for honesty in marketing. “People must know this is support, not therapy. Clear disclaimers should be mandatory,” she says.
This content is for informational purposes only and is not a substitute for professional medical advice.
