Talkspace has grown to become one of the largest online therapy platforms in the US, covering an estimated market of 200 million Americans. As the mental health platform has grown, it has also pioneered new ways to reach people who need help with mental health issues, including trauma, depression, addiction, abuse and relationships, and for various phases of life, including adolescence.
Its experience in addressing the mental health needs of adolescents puts Talkspace in a unique position to understand an issue of growing national importance: the use of large, undesigned AI language models to provide mental health support among at-risk adolescents, which has had tragic consequences.
“It's a huge, huge problem,” Talkspace CEO Jon Cohen said at CNBC's Workforce Executive Council Summit on Tuesday in New York City.
Talkspace runs the largest teen mental health program in the country, and students ages 13 to 17 in New York City can use its services for free, and similar programs in Baltimore and Seattle. The virtual mental health app offers asynchronous text messaging and live video sessions with thousands of licensed therapists.
While Cohen says he's “a big believer in no phones, cell phone bans and everything else,” he added that to serve the teen population, the company has to meet them where they are. That means “we'll meet with them by phone,” he said.
More than 90% of students using Talkspace use the asynchronous messaging therapy approach, compared to only 30% using video (70% of overall Talkspace users opt for video over text, with the percentage increasing as the patient ages).
As teens turn to chatbots that are not licensed or designed for mental health services, Cohen told an audience of human resources executives at the CNBC event: “We're in the middle of this vortex, literally disrupting mental health therapy…It's beyond my imagination…and the results have been disastrous,” he said, citing multiple hospitalizations of teens who harmed themselves and suicides, including a recent New York Times podcast report.
OpenAI recently announced planned changes to its ChatGPT AI after it was blamed for a teen's suicide and sued by a family, and laid out its intentions in a blog post.
“I tell all groups, if you don't know, you need to know what's going on. You need to keep people you know and teenagers from coming to these LLMs to have conversations,” Cohen said.
He highlighted several ways in which the latest major language models are not designed for mental health crisis situations. On the one hand, they are designed to continually engage, and while they can be empathetic, they are also designed to keep encouraging you, which in cases of mental distress can lead you “down a delusional path or down a path of thinking you can't do anything wrong,” he said.
“About four months ago, someone told ChatGPT, 'I'm really depressed and thinking about ending my life and I'm thinking about falling off a bridge,' and ChatGPT said, 'Here are the 10 largest bridges and their height in your area.'”
AI engines have helped teenagers write suicide notes, discouraged them from explaining evidence of self-harm to parents and given them instructions on how to build a noose, Cohen said. Even when AIs know not to help those seeking to harm themselves and refuse to offer direct help, teens have found simple solutions, according to Cohen, such as saying they are writing a research paper on suicide and need information.
LLMs fail to challenge delusions, have no HIPAA protection, no clinical oversight, no clinical off-ramp and, at least so far, little to no real-time risk identification, he said.
“Once you fall down the rabbit hole, it's incredibly difficult to get out of it,” he added.
On the Talkspace platform, risk algorithms are built into the AI engine with the ability to detect suicide risk and send alerts to a therapist based on the context of a conversation that suggests when a person is potentially at risk of self-harm.
In New York City, where Talkspace has offered mental health support to 40,000 teens on its platform, there have been 500 suicide prevention interventions in two years and more than 40,000 suicide alerts, according to Cohen.
Cohen said at the CNBC event that Talkspace is currently building an AI agent tool to address this problem, and said he expected a solution for the market to be ready in as little as three months, describing it as a “secure clinical monitoring and off-ramp” tool that will be HIPAA compliant. But he stressed that it is still in test mode, “alpha mode,” he said.
Addressing the audience of human resources executives, Cohen noted that these topics are highly relevant to businesses and the workforce. A question many workers ask themselves daily, he said, is: “What do I do with my teenager?”
“It's having an impact on their work,” Cohen said, adding to the anxiety, depression and relationship problems already prevalent among employee populations.
Of course, as with the new tool Talkspace is creating, AI also has positive use cases in the mental health field.
Ethan Mollick, an AI expert at the Wharton School who also spoke at the CNBC event, said part of the problem is that these AI labs were not prepared for billions of weekly users to turn to their chatbots so quickly. But Mollick said there is evidence that the use of AI in mental health can also in some cases reduce the risk of suicide, because it reduces conditions such as loneliness, while he emphasized that it is also clear that AI can do the opposite: increase psychosis. “I'm probably doing both,” he said.
At Talkspace, evidence is emerging of how AI can lead to better mental health outcomes. It began offering an AI-powered “Talkcast” feature that creates personalized podcasts as a follow-up after therapy sessions with patients. Cohen described the podcast by saying, more or less: “I heard what you said. These were issues you raised and they are things we would like you to do before the next session.”
Cohen is among the users of that new artificial intelligence tool, among other reasons, to improve his golf game.
“I told them that when I stand next to the ball I get very anxious,” Cohen said at the CNBC event. “I wish you could listen to the AI-generated podcast. It comes back and says, 'Well, Jon, you're not alone. These are the three professional golfers who have exactly the same thing as you and this is how they solved the problem. These are the instructions, these are the things we want you to practice every time you step on the ball.' To me it was a miracle two-minute problem-solving podcast,” Cohen said.
Among all Talkspace users, the personalized podcast tool has led to a 30% increase in patient engagement from a second to a third therapy session, he added.
The mental health company, which has about 6,000 licensed therapists across the United States, plans to continue expanding its mission of combining empathy with technology. Most users have access to the therapy for free or have a copay of as little as $10, depending on insurance coverage. Through employee assistance programs (EAP), major insurance associations, and Medicaid, Talkspace can connect users with a licensed therapist within three hours, with text messaging available within 24 hours.
“Talkspace went out of its way to prove that texting and texting therapy really works in addition to live video,” Cohen said at the CNBC event.






