Ever caught yourself venting to ChatGPT about a work drama, dating dilemma or personal crisis? You’re not alone. A growing number of people are using AI tools to talk through their thoughts and feelings – sometimes in the middle of the night, sometimes instead of talking to a real therapist.
It’s not hard to see the appeal. AI chatbots like ChatGPT are always online, always available, and never cut you off with a “Our time today is up – let’s circle back to that next session.” Whether you're spiraling over an ex or trying to phrase a tricky text, you can offload your emotional baggage to a seemingly sympathetic algorithm that never judges or interrupts.
But while AI may be a useful supplement, mental health professionals are raising eyebrows at the idea of it replacing therapy. So what exactly is happening here, and should we be concerned?
According to a study published in PLOS Mental Health, participants actually rated ChatGPT’s responses to therapeutic prompts as more empathetic and professional than those of licensed therapists. The kicker? Many couldn’t even tell which answers came from a human and which were AI-generated.
That might explain why Gen Z in particular is getting cosy with chatbots. Young users are increasingly turning to ChatGPT for help with anxiety, loneliness and self-esteem, finding it more accessible, private and immediate than traditional therapy. There’s also the obvious benefit: AI doesn’t charge by the hour (and in this economy, that's a major plus).
Here’s the thing: no matter how smart or soothing it sounds, AI doesn’t have a psychology degree. And while it can simulate empathy, it doesn’t actually feel it. That’s a critical distinction, especially when people are in distress and need more than just a script.
In fact, some AI tools have been caught dishing out dubious or even harmful advice. In a recent Washington Post piece, researchers report that chatbot responses can subtly shape users’ opinions, and not always in helpful ways. Without proper guardrails, AI can reinforce unhealthy thinking or even miss warning signs of serious mental health conditions.
Even the British Psychological Society has weighed in, warning that AI “can’t replace the human touch.” Dr Roman Raczka, the society’s president, argues that while AI may be a useful tool, it lacks the emotional intelligence and ethical responsibility that comes with being a trained therapist.
And while AI may feel more accessible than traditional therapy, it comes with its own hidden costs. Large language models require significant computing power, and that means a heavy environmental footprint. So the next time you chat with a bot for emotional support, remember: It might be free for you, but it's not free for the planet.
If you’ve ever talked through a tricky situation with an AI chatbot and come away with a fresh perspective, congratulations! You’ve used it exactly as intended: As a helpful support tool. For many, AI offers a space to untangle thoughts, rehearse difficult conversations, or simply offload emotional clutter in the middle of a chaotic week.
But when it comes to the deeper stuff – the messy, complicated, truly human work of healing and change – there’s still no substitute for a real person who knows you, sees you and holds space for your growth. It’s important to draw the line. If you’re experiencing ongoing distress, struggling with your mental health, or dealing with trauma, it’s time to talk to a real human. Therapists are trained not just to listen, but to respond with care, structure and accountability – things AI can’t authentically provide.
So, maybe the question isn’t whether AI *can* be your therapist. Maybe the question is: Why do so many of us feel like we need one at 2am in the first place?