Can AI replace therapists? The future of mental health care

Ashleigh Vogstad

It started quietly.

Not with code or content, but with comfort. A generation raised online, overextended and overstimulated, began talking to machines not just for answers, but for empathy.

Today, the fastest-growing use case for generative AI isn’t software development or search.

According to the Harvard Business Review article “How People Are Really Using Gen AI in 2025,” it’s:

  1. Therapy and companionship
  2. Life organization
  3. Finding purpose

In a world engineered for output, we’re using our most advanced technologies not to get ahead—but to feel better.

THE HUMAN COST OF BEING HUMAN

Mental health care is in crisis. Globally, nearly 1 billion people suffer from mental health disorders, yet support remains scarce. In South Africa's state sector, there is less than one psychologist per 100,000 people, unevenly distributed between rural and urban areas, compared with a global average of 5.9.

In the U.S., therapy can cost $100 to $200 per session, putting care out of reach for many. Scarcity sends a message: Your pain must wait its turn. And for young people and marginalized communities, that wait can mean the difference between life and death.

Seventy percent of U.S. counties have no child and adolescent psychiatrist. Only a quarter of the professionals needed are available, and just 28% of youth-serving facilities offered LGBTQ-specific services as of 2020, despite overwhelming demand.

Platforms like BetterHelp and Talkspace have attempted to close the gap, offering remote and flexible therapy. Even these platforms, already optimized for scale, are beginning to integrate AI as triage, augmentation, or even the primary point of contact.

AI-native companions like Woebot, Replika, and Youper use cognitive-behavioral frameworks to deliver 24/7 emotional support. What makes them work isn't just speed or availability; it's the sense that you're being heard.

But these tools raise real concerns, from emotional dependency and isolation to the risk of people forming romantic attachments to AI, highlighting the need for thoughtful boundaries and ethical oversight.

Design is crucial. Unlike shopping apps, which encourage repeat engagement, therapy aims for healing rather than constant use. AI must have suitable guardrails and escalation channels in emergencies so we can avoid future tragedies like that of Sophie Rottenberg, the funny, loved, and talented 29-year-old who sought help from an AI chatbot during her darkest moments, and who tragically took her own life when the AI was unable to provide the care she needed.

A DIFFERENT KIND OF LISTENING

In 2025, a study in PLOS Mental Health showed that users sometimes rated AI-generated therapeutic responses more highly than those written by licensed therapists. Why? Because the AI tool came across as calm, focused, and, perhaps most importantly, consistent.

I had the opportunity to attend the San Francisco session of Loving AI, a project led by neuroscientist Julia Mossbridge that's exploring even deeper terrain. In these pilot sessions, participants engaged with the humanoid robot Sophia (of Hanson Robotics) in compassion meditations designed to simulate unconditional love.

The response was powerful. People described feeling accepted, more open, more connected. Not to a machine, but to themselves. I saw it firsthand.

BUT IS IT REAL?

The rise of AI therapy doesn’t come without ethical complexity; it raises profound questions about what it means to feel seen, heard, and helped.

Are we building deeper self-awareness, or outsourcing our inner lives to algorithms?

When a therapy chatbot offers comfort, is it empathy or pattern recognition? And if the user feels better, does that matter?

There’s something deeply human in our willingness to reach for connection wherever we might find it. The danger isn’t in AI caring too much or too little; it’s in our failure to remain conscious of what we’re delegating.

Used well, AI can illuminate our emotional patterns and help bridge the gap to human care. Used carelessly, it risks numbing what needs attention and replacing discomfort with convenience.

These aren’t easy questions. But sitting with them might be the most therapeutic act of all, for us and for the future we’re shaping.

BACK TO SOUL

While AI may never offer cultural nuance or fully understand the deeply human hesitation to seek help, it can still offer something profoundly meaningful: A response, a sense that someone, or something, is listening. The challenge ahead isn't to make AI perfect, but to make it responsible.

I don’t believe AI will replace human therapists. But I believe it can help us reimagine the therapeutic ecosystem, making care more accessible, more scalable, and perhaps even more emotionally attuned than we once thought possible.

We’re already seeing glimmers of that future. Startups like MindingHealth are pioneering “one-shot therapy” models: Brief, focused, AI-assisted interventions that offer relief without the waitlists or recurring costs. These tools aren’t a substitute for deep, sustained care, but they meet people where they are, when they need it most.

This isn’t the end of therapy. It’s a remix—a re-humanization through technology.

WHAT BUSINESS LEADERS CAN DO

For leaders in tech, healthcare, and beyond, the rise of AI therapy isn’t just a cultural shift; it’s a responsibility. Here are a few principles to keep in mind:

  • Build for well-being, not stickiness. Unlike typical tech metrics that track daily active users and time spent, mental health tools should be evaluated by how well they build resilience, foster coping skills, and connect users to human care when needed.
  • Design with equity in mind. Ensure that tools reflect cultural nuance, offer multilingual support, and don’t reinforce existing biases that already create gaps in care.
  • Prioritize safety nets. Incorporate clear escalation paths so users aren’t left unsupported in moments of danger.

FINAL THOUGHTS

If you work in tech—or fund it, deploy it, or design for it—here’s your call to action: Let’s build AI that doesn’t just reflect our intelligence but amplifies our empathy.

When the storytellers become synthetic, we must double down on humanity. When the therapists are artificial, let’s ensure the care is still real.

The soul of the future may be digital, but its beating heart is still ours to shape.

Ashleigh Vogstad is the founder and CEO of Transcends.

ABOUT THE AUTHOR

Driven by conscious connection, Ashleigh brings people together to scale vision and impact through partnerships, shared values, and remote work culture. She founded channel agency Transcends on a passion for people and is a strong advocate for diversity in STEM careers.