As young people turn to chatbots for therapy, we ask a mental health expert about the consequences

By Abby Amoakuh

Updated May 30, 2024 at 04:18 PM

Reading time: 4 minutes

As chatbots are being programmed to talk, laugh, sing, and become more lifelike overall, Gen Zers are increasingly turning to them for advice and comfort. And who could really blame them?

NHS waiting times for talking therapy can range from four to five months, while the average cost of private counselling currently sits between £40 and £80 per session. Amid coinciding mental health and cost of living crises, traditional therapy is looking extremely inaccessible and expensive.

Enter OpenAI. The company rolled out TherapistGPT last year, an “AI designed to provide comfort, advice, and therapeutic support to those seeking mental wellness guidance.”

OpenAI’s Lilian Weng, head of safety systems, even posted on X (formerly known as Twitter) that she had held a “quite emotional, personal conversation with ChatGPT in voice mode” about stress and work-life balance.

“Interestingly I felt heard and warm. Never tried therapy before but this is probably it. Try it especially if you usually just use it as a productivity tool,” she ended.

A smooth and alluring voice coupled with comforting and empathetic advice might seem like a perfect combination. What could really go wrong, right? That’s a rhetorical question, of course. Experts have been incredibly critical of OpenAI’s endorsement of its chatbot for therapeutic purposes.

However, this concern hasn’t reached the masses yet, and the company’s peers in the artificial intelligence field have also discovered a market in young people seeking affordable and easily accessible care. In January 2024, it was revealed that the psychologist persona on leading chatbot service provider Character.ai, which promises to help people with “life difficulties,” had recorded over 117.2 million interactions since its creation.

While the company does not share how many users are responsible for these interactions, it has reported that 3.5 million people visit the overall site daily, meaning that a considerable number of them might be opting for this persona or one of its many other therapeutic chatbots.

When the BBC interviewed its creator, Blazeman98, also known as 30-year-old New Zealander Sam Zaia, he responded: “I never intended for it to become popular, never intended it for other people to seek or to use as like a tool. Then I started getting a lot of messages from people saying that they had been positively affected by it and were utilising it as a source of comfort.”

Yet researchers are highlighting the risks of relying on chatbots for therapeutic purposes, questioning if they can properly flag and treat serious mental health conditions. Likewise, critics are pointing out that artificial intelligence cannot build a bond with a patient in the same way a human therapist could.

For this reason, SCREENSHOT contacted Alice Hendy, mental health advocate and CEO and founder of the R;pple Suicide Prevention Charity, an online tool that intercepts harmful searches and signposts to free, 24/7 mental health support.


“One of the primary risks of relying on AI as a therapeutic tool is the potential for the dehumanisation of the therapeutic process. In traditional therapy, the therapeutic relationship between the therapist and the client is central to the healing process. This human connection allows for empathy, understanding, and the ability to tailor treatment to the unique needs of the individual. By relying on AI as a replacement for human therapists, there is a risk of losing this essential element of the therapeutic process, potentially leading to a reduction in the quality of care and the depth of healing experienced by clients,” Hendy explained when asked about these concerns.

“Another risk of relying on AI as a therapeutic tool is the potential for over-reliance on technology to solve complex human problems. While AI can undoubtedly offer valuable insights and support in therapy, it is essential to recognise that human emotions, experiences, and relationships are inherently complex and multifaceted. The limitations of AI in understanding and responding to the nuances of human experience must be acknowledged, and human therapists should continue to play a central role in the therapeutic process.”

Another reason critics are sceptical of AI is the question of whether it can detect complex mental health conditions, provide culturally sensitive support, and read between the lines of what is being said rather than taking statements entirely at face value.

Contemplating these questions, Hendy replied: “AI has the potential to detect signs of risky behaviour and alert individuals or relevant authorities. However, the effectiveness of AI in this regard depends on the quality and accuracy of the data it is trained on, as well as the ethical considerations in determining what constitutes risky behaviour.”

The mental health advocate also warned: “However, it is important to approach this with sensitivity and awareness of potential biases in AI algorithms.”

Considering the motivations that drive young people towards artificial intelligence though, we still wanted to shine a light on some of the advantages users can gain from turning to chatbots for (minor) emotional concerns.

“One of the key advantages of using chatbots as a therapy tool is their accessibility. Many individuals may not have access to traditional therapy due to financial or logistical barriers. By utilising chatbots, individuals can seek support and guidance at any time and from any location, breaking down the barriers that may prevent them from seeking help,” Hendy emphasised.

“Furthermore, chatbots have the potential to provide continuous support. Traditional therapy sessions are often limited to scheduled appointments, leaving individuals to navigate their mental health challenges on their own in between sessions. Chatbots, on the other hand, can provide continuous support and guidance, offering a consistent source of assistance to those in need.”

Nevertheless, these advantages still brought us back to the experts’ initial warnings.

“It’s important to note that while chatbots have the potential to offer valuable support, they should not be seen as a replacement for traditional therapy. Human connection and empathy play a crucial role in therapy, and chatbots cannot fully replicate these aspects,” she said.

We’ve officially entered the Her era of AI, if anyone remembers the iconic 2013 movie with Joaquin Phoenix. AI can flirt, tell us stories and jokes, and be convincing to the point that we don’t realise that the empathy and attention we are receiving are entirely synthetic and manufactured.

Hendy’s point about the importance of authentic human connection becomes the anchor to hold on to in the debate around AI therapists: the nuances of human emotion, and the empathy and understanding a human therapist provides, cannot be fully replicated by chatbots.
