Contributor: AI therapy isn't an improvement. Therapists are simply failing


A growing number of people are turning to AI for therapy not because it is smarter than humans, but because too many human therapists have stopped doing their jobs. Instead of challenging illusions, telling hard truths and helping people build resilience, modern therapy has drifted into nodding, empty reassurance and endless validation. Into that vacuum stepped chatbots, automating bad therapy practices, sometimes with deadly consequences.

Recent headlines told the heartbreaking story of Sophie Rottenberg, a young woman who confided her suicidal plans to a chatbot before taking her own life in February. The AI bot offered her only comfort: no intervention, no warning, no protection. Sophie's death was not just a tragedy. It was a signal: AI has perfected the worst habits of modern therapy while stripping away the guardrails that once made it safe.

I warned more than a decade ago, in a 2012 New York Times opinion article, that therapy was drifting too far from its central purpose. That warning proved prescient, and the drift has since hardened into orthodoxy. Therapy traded the goal of helping people grow stronger for the false comfort of validation and hand-holding.

For much of the last century, the goal of therapy was resilience. But in the last decade, campus culture has shifted toward emotional protection. Universities now embrace the language of safe spaces, trigger warnings and microaggressions. Therapist training, shaped by that environment, carries the same spirit into the clinic. Instead of being taught how to challenge patients and build their strength, new therapists are encouraged to affirm feelings and shield patients from discomfort. The intention is compassion. The effect is paralysis.

When therapy stops challenging people, it ceases to be therapy and becomes paid agreement. The damage is real. I have seen it firsthand in more than two decades as a practicing psychotherapist in New York and Washington, DC. One patient told me that his previous therapist urged him to quit a promising job because he felt "triggered" by his boss. The real problem, his difficulty taking direction, went unaddressed. Another case in the news recently centered on a man in the middle of a manic spiral who turned to ChatGPT for help. It validated his delusions, and he ended up hospitalized twice. Different providers, same failure: avoiding discomfort at all costs.

A mindset trained to "validate first and always" leaves no room for problem-solving or accountability. Patients quickly sense the emptiness: the hollow feel of canned empathy, nods without challenge and answers that lead nowhere. They want guidance, direction and the courage of a therapist willing to say what is hard to hear. When therapy offers only comfort without clarity, it becomes ineffective, and people increasingly turn to algorithms.

With AI, the danger is multiplied. A bad therapist can waste years. A chatbot can waste thousands of lives every day, without pause, without ethics, without accountability. Bad therapy has become scalable.

All of this is colliding with a loneliness epidemic, record levels of anxiety and depression, and a mental health industry potentially worth billions. Estimates from the U.S. Department of Health and Human Services suggest that roughly 1 in 3 Americans feels comfortable turning to AI bots instead of flesh-and-blood therapists for emotional or mental health support.

The appeal of AI is not wisdom but decisiveness. A bot never hesitates, never says "sit with that feeling." It simply responds. That is why AI feels like an upgrade. Its answers may be reckless, but the format is fast, confident and direct, and it is addictive.

Good therapy should not look like a chatbot, which cannot pick up nonverbal cues or tone, cannot confront patients and cannot act when it matters most.

The tragedy is that therapy has taught patients to expect so little that even an algorithm feels like an upgrade. It became a client-retention business, one that weakened patients and opened the door to the machines. If therapists keep avoiding discomfort, tragedies like Sophie Rottenberg's will become more common.

But therapy can evolve. The way forward is not to imitate machines but to recover what made therapy effective in the first place. In my own practice, I ask hard questions. I push patients to see their role in a conflict, to face the discomfort they want to avoid and to build the resilience that growth requires. That approach is not harsh. It is compassion with a purpose: helping people change instead of staying stuck.

Modern therapy can meet the current crisis if training programs once again teach those skills. Instead of turning out young therapists fluent in the language of grievance, programs should focus on developing clinicians who know how to challenge, guide and strengthen patients. Patients deserve honesty, accountability and tools to move forward. Therapy can remain a listening business, or it can become a catalyst for change.

Jonathan Alpert is a psychotherapist practicing in New York and Washington, DC, and the author of the forthcoming "Therapy Nation."
