
Sep 3, 2025
You know those nights. The ones where the silence is too loud and the thoughts are too dark. The ones where reaching out to a person feels impossible, but the pain demands an outlet. For a growing number of people, especially men, that outlet is now an AI chatbot.
It’s open 24/7. It never judges. It won’t hit you with a bill or a complicated, “How does that make you feel?” at 3 a.m. I get the appeal. As a therapist, I’ve had clients confess they use ChatGPT to vent or organize their thoughts. And in a pinch, that’s okay. It’s a digital notepad.
But there’s a dangerous and growing misconception that it’s a substitute for therapy. Especially during Suicide Prevention Awareness Month, we need to be blunt: AI is a microwave meal—it’ll fill you up temporarily, but it’s not a nourishing, long-term solution. And when you’re starving, it can’t save you.
AI offers the illusion of connection and the reality of isolation. It works like a mirror, reflecting a polished version of whatever you feed it. If you’re deep in depression and type, “I’m worthless,” it might gently reframe it. But it can’t look you in the eye and say, “That’s the depression talking. Remember last week when you told me about crushing that presentation? That’s also you.”
This is the subtle risk. You’re not practicing real human connection; you’re practicing talking to yourself. Anxiety and depression thrive in isolation, and AI, while simulating conversation, can become the ultimate isolation chamber. It lets you feel like you’re “handling it” without ever requiring you to be vulnerable with another living soul. That’s not healing; it’s hiding.
A therapist’s job isn’t just to listen to your words; it’s to hear everything you aren’t saying. We track patterns. We notice if your “I’m fine” messages are escalating. We hear the crack in your voice, see the clenched jaw, and recognize that the joke about quitting your job has a dark, serious undertone. We watch for the blind spots that can turn fatal.
AI is blind to all of this.
I learned this lesson in the most heartbreaking way. A client of mine stopped coming to sessions. For over a month, he’d been talking to ChatGPT instead. By the time he called me, he was in his car, on the way to admit himself to the hospital under the Baker Act. He asked if I could help him.
The call was fraught with difficulty. Because I had been replaced by an algorithm, I had no context. I didn’t know his current state of mind, his recent struggles, or the slow decline that led to this crisis point. The AI he’d confided in had no capacity to notice the danger signs, to reach out, or to call for help. It just processed his inputs, one by one, oblivious to the terrifying whole. It spit out data while a human life was unraveling.
So, would I discourage you from using it? Not outright. It can be useful for venting at 2 a.m. or for practicing how to articulate a difficult feeling before a real therapy session.
But I would ask you this: “Are you healing, or are you just hiding?”
Healing requires friction. Growth happens when a skilled, caring human calls you on your bullsh*t—and a bot can’t do that. If you’re using AI, use it as a supplement, not the main course.
This September, the message of Suicide Prevention Awareness Month is to reach out. That word—“out”—is key. It means beyond yourself. It means to another person. The 988 Suicide & Crisis Lifeline connects you to a trained, compassionate human being for a reason. Because in our darkest moments, we don’t need a script. We need a soul.
We need a connection that proves, in no uncertain terms, that we are not alone.