Anjana P V, 26, a professional accustomed to traditional therapy, was intrigued when invited to test the emerging world of AI-powered mental health support. A genuine desire to understand how technology might intersect with mental health care prompted her to take part in the experiment. With curiosity tempered by skepticism, she posed the same question to both her human therapist and an AI counterpart: “How can I manage workplace stress?”
What unfolded was more than just a comparison of advice. It became a revealing exploration of empathy, technological limitations, and the nuanced art of emotional support.
“The AI therapist’s advice felt generic,” she said. Suggestions such as journaling, meditation, and exercise lacked depth—more akin to advice from a well-meaning friend. In contrast, her human therapist explored the root causes of her stress through psychological assessments and personalised insights. “The human approach was nuanced, addressing underlying behavioural patterns and unresolved childhood experiences,” Anjana said.
Anjana’s experiment offered a rare, first-hand glimpse into how artificial intelligence attempts to navigate the complex terrain of human emotions.
The rise of AI in mental health: Opportunities and challenges
Artificial intelligence is revolutionising mental health care by promising accessibility, stigma-free interactions, and scalable solutions. Leveraging machine learning and natural language processing, AI therapists simulate empathetic conversations and offer strategies to manage stress, anxiety, and more. But can they match the depth of human therapists?
However, challenges remain. Srishti Srivastava, founder of Infiheal, told indianexpress.com, “When designing an AI-based system, one key principle is garbage in, garbage out. This means the quality of the input data directly impacts the quality of the output. The biggest challenge is sourcing datasets that are culturally relevant and aligned with the intended audience.”
Concurring, Sandesh Cadabam, founder of Cadabamsconsult.ai, said, “Emotions are multifaceted and often expressed through subtle language variations, tone, or even silence. Training AI to detect and appropriately respond to such nuances requires vast datasets annotated with emotional intelligence.”
Biases in datasets and a lack of contextual understanding further hinder AI’s effectiveness. Experts like Srivastava and Cadabam stress the need for subject matter expertise and advancements in natural language understanding to ensure AI therapists provide safe, nuanced, and meaningful support.
The potential of AI therapists
AI therapists are evolving with advancements in empathy modelling, contextual understanding, and ethical frameworks. “Their strength lies in enhancing human-led therapy, not replacing it,” said Cadabam. According to him, emerging innovations include hyper-personalisation, integration with wearables for real-time support, and hybrid models combining AI’s routine care with human expertise for complex cases.
“Enhanced language capabilities will drive global accessibility, breaking barriers in underserved regions, while gamification could make therapy more engaging, particularly for younger users. Additionally, a focus on ethical AI with transparency and explainability will build trust and ensure accountability in this evolving space,” Cadabam said.
Smita Kashi, consultant art psychotherapist at Spandana Hospital, added that AI can help individuals identify mental health challenges and prepare them for traditional therapy by collecting initial assessments, tracking mood patterns, and recommending interventions. “They can also assist clients in staying engaged with therapy goals by offering reminders, journaling prompts, and real-time feedback.”
Key demographics using AI therapists in India
Cadabam shared insights on usage patterns, revealing that younger adults (ages 18-35) are the primary demographic engaging with AI platforms. Urban professionals in metropolitan areas rely on these platforms for quick, discreet support during high-stress periods, while students leverage the 24/7 availability to cope with academic pressure, social anxiety, and self-esteem issues.
Srivastava observed similar trends on Infiheal, with younger users — primarily Gen Z — seeking support for relationship challenges, life transitions, and finding purpose. Millennials often focus on work-life balance, professional growth, and managing relationships alongside careers. Older populations (40+ years) make up less than seven per cent of the user base, with targeted outreach required to engage them.
Human connection and empathy in therapy
“For me, the human connection and empathy from a therapist are crucial,” said Anjana, stressing that they significantly influence her choice of a therapist. “I’ve switched therapists five times because I sought someone who could offer a genuinely empathetic approach. Each therapist brought a different dynamic, and I finally found the right one because they were not only empathetic but also extremely knowledgeable.”
Language also plays an important role for Anjana. “Being able to speak in my mother tongue during sessions enhances the therapeutic experience, making it more comfortable and relatable.”
In contrast, Ankita U, 37, relies solely on AI therapists. About the experience, she said, “No one judges you; it offers anonymity, 24/7 access, affordability, and instant availability — even at 2 AM. Quick solutions foster positive feelings, and I feel like I have a guide for even the smallest concerns.” However, she acknowledged the limitations, noting that AI therapists are helpful for smaller issues but unsuitable for trauma-related concerns. “I use it as a behaviour and habit modification tool to gain confidence,” she said.
Data privacy concerns
“Yes, misleading advice and data privacy are big concerns, so I avoid sharing names, location, or other details,” said Ankita.
Cadabam clarified that all user interactions on Cadabamsconsult.ai are end-to-end encrypted to prevent unauthorised access during transmission or storage. “Data is anonymised to dissociate user identities from their conversations, reducing the risk of breaches. AI can also learn from user interactions without centralising data, ensuring privacy compliance while enabling system improvement,” he said.
The way forward
“The field is constantly evolving,” said Srivastava, whose team updates datasets bi-weekly to align with users’ needs. Adaptive machine learning integrates new psychological research, clinical guidelines, and user feedback to enhance AI responses. “Multimodal inputs, including text and voice analysis, help AI gauge emotions accurately, while sentiment-driven responses adjust tone and content to mimic human empathy,” said Cadabam.
He added that AI systems use contextual memory to recall past interactions, fostering trust and continuity. “Around 70 per cent of user concerns can be effectively addressed by AI,” said Srivastava, but severe cases like depression or suicidal thoughts necessitate human intervention. Escalation measures, such as connecting users to crisis helplines or licensed professionals, ensure safety during critical situations.
While AI therapy works well for those seeking quick solutions or simply someone to listen, human intervention remains essential in complex and high-risk situations, and that is likely to remain the case.