Why Therapy is Now the #1 Use Case for Generative AI in 2025

ElizaChat Team

June 10, 2025

A student chats with AI on her phone

Here’s something that might surprise you: while everyone expected generative AI to excel at coding and data analysis, people are using it most for therapy and emotional support.

Harvard Business Review recently published their 2025 research on how people use generative AI, building on their popular 2024 study. The findings reveal a significant shift in how people interact with artificial intelligence.

What the Numbers Show

The research reveals that therapy and companionship are now the #1 use case for generative AI. Coming in at #2 is “organizing my life,” followed by “finding purpose” at #3.

This represents a notable shift from 2024, when technical tasks dominated AI usage. The study analyzed user behavior across major AI platforms and found that people are increasingly turning to AI for deeply personal needs: emotional support, life guidance, and self-improvement.

The research indicates that people are using AI to find purpose and improve themselves in concrete ways. They're seeking help with establishing daily habits, gaining personal insight, and identifying manageable steps toward their goals. The study documented people having extended conversations with AI about personal challenges and using it to build confidence in various areas of their lives.

As the researchers noted, most experts predicted that AI would first prove itself in technical domains. The data tells a different story about human needs and preferences.

Understanding the Shift

This trend reflects several converging factors in our current mental health landscape. According to the National Alliance on Mental Illness, approximately 1 in 5 adults in the U.S. experiences mental illness each year, yet many face significant barriers to accessing traditional care.

The timing factor is crucial. When someone is struggling at 2 AM on a weekend, traditional support systems are largely inaccessible. Therapist offices are closed, crisis hotlines often have long wait times, and personal support networks may not be available. AI-powered tools operate around the clock.

The accessibility barriers that AI addresses are well-documented. The American Psychological Association reports that cost, insurance limitations, and appointment availability are primary obstacles to mental health care. Additionally, stigma remains a significant concern—many people hesitate to seek help due to fear of judgment or professional consequences.

Research in digital therapeutics shows that people often feel more comfortable initially opening up to AI interfaces. This reduced social pressure can serve as a stepping stone to more comprehensive care.

The Technology Behind the Trend

Modern AI platforms use sophisticated algorithms to personalize support, analyzing conversation patterns, language use, and even response timing to provide relevant guidance. These systems often incorporate evidence-based therapeutic frameworks, particularly cognitive-behavioral therapy (CBT) techniques.

The Harvard Business Review research indicates that users want AI that knows them more, not less. They're seeking personalized support that remembers their preferences, understands their specific challenges, and adapts recommendations accordingly.

However, the study also revealed growing sophistication among users. People are becoming more discerning about AI capabilities and limitations. Privacy concerns emerged as a consistent theme, with users wanting the benefits of AI support while maintaining control over their personal information.

Implications for Mental Health Care

The mental health care system in the United States faces well-documented capacity issues. According to the Health Resources and Services Administration, there are currently over 6,500 designated Mental Health Professional Shortage Areas in the country, affecting millions of Americans.

AI offers potential solutions to three critical challenges:

Availability: AI can provide immediate support without the constraints of office hours, appointment scheduling, or geographic limitations. This is particularly valuable for rural areas where mental health professionals are scarce.

Cost: Without insurance coverage, traditional therapy can cost $100–$300 per session, depending on the provider. AI-powered tools can potentially serve large populations at a fraction of this cost, making basic mental health support more accessible.

Stigma: For individuals hesitant to seek traditional therapy, AI can provide a private, judgment-free first step toward addressing mental health concerns.

Importantly, this doesn’t suggest AI should replace human therapists. The most effective model is a tiered approach, similar to how urgent care complements emergency rooms and primary care. AI could handle initial support and ongoing maintenance, while human professionals focus on complex cases, crisis intervention, and deeper therapeutic relationships.

Evidence from Early Implementations

Research on AI-powered mental health interventions is accumulating. Studies in peer-reviewed journals have documented positive outcomes when AI tools incorporate evidence-based therapeutic approaches.

Clinical trials have shown measurable improvements in anxiety and depression symptoms among users of AI therapy platforms. For example, research published in academic journals has found statistically significant reductions in standardized mental health assessment scores after consistent use of AI-guided interventions.

Organizations implementing AI mental health tools for employee wellness programs report improved engagement compared to traditional employee assistance programs, along with measurable improvements in workplace well-being metrics.

However, researchers emphasize essential limitations. AI tools show the most substantial evidence for mild to moderate symptoms and are most effective when designed with input from licensed mental health professionals.

Quality Standards and Best Practices

The research suggests users are becoming more discerning about AI tools. They want evidence-based approaches, not just conversational interfaces.

Effective AI mental health support requires several key elements:

Clinical foundation: AI should incorporate established therapeutic approaches, such as cognitive-behavioral therapy, dialectical behavior therapy, or mindfulness-based interventions, rather than providing generic responses.

Transparency: Users want to understand how AI systems work and what happens to their data. Privacy and data security concerns remain paramount for people sharing personal mental health information.

Professional oversight: The most credible tools involve licensed mental health professionals in their development, training, and ongoing refinement.

Clear limitations: Responsible AI mental health tools clearly communicate what they can and cannot do, including when users should seek professional help from a human.

The Path Forward

The Harvard Business Review findings suggest significant unmet demand for accessible mental health support. The willingness of people to turn to AI for therapy indicates gaps in traditional care delivery that technology might help address.

This creates opportunities to reimagine mental health care delivery. Instead of systems that primarily respond to crises, we might build infrastructure that supports daily emotional well-being. Instead of limiting quality help to those with specific resources or geographic access, we might make foundational support broadly available.

The most promising future likely combines human expertise with AI accessibility. AI can provide immediate, always-available support for common mental health challenges. At the same time, human professionals focus on complex cases, crises, and the nuanced work that requires human judgment and empathy.

At ElizaChat, we’re developing AI-powered mental health support with these principles in mind. Our approach combines clinical expertise from licensed mental health professionals with AI technology designed specifically for meaningful therapeutic conversations. We’re not trying to replace therapists—we’re working to extend their expertise to reach more people when and where they need support.

The research confirms that people want AI that enhances their human experience rather than replacing human connection. They’re looking for support that’s accessible, private, and genuinely helpful—built by teams who understand both technology and mental health care.

As this field evolves, the most successful approaches will likely be those that maintain focus on clinical effectiveness, user privacy, and straightforward integration with traditional mental health care systems.


If you're curious about our approach to AI-powered mental health support, you can learn more about ElizaChat.
