As young people turn to chatbots for therapy, we ask a mental health expert about the consequences

By Abby Amoakuh

Updated May 30, 2024 at 04:18 PM

Reading time: 4 minutes


As chatbots are being programmed to talk, laugh, sing, and become more lifelike overall, Gen Zers are increasingly turning to them for advice and comfort. And who could really blame them?

NHS waiting times for talking therapy can range from four to five months, while the average cost of private counselling currently lies between £40 and £80 per session. Amid coinciding mental health and cost of living crises, traditional therapy is looking extremely inaccessible and expensive.

Enter OpenAI. The company rolled out TherapistGPT last year, an “AI designed to provide comfort, advice, and therapeutic support to those seeking mental wellness guidance.”

OpenAI’s Lilian Weng, head of safety systems, even posted on X (formerly known as Twitter) that she had held a “quite emotional, personal conversation with ChatGPT in voice mode” about stress and work-life balance.

“Interestingly I felt heard and warm. Never tried therapy before but this is probably it. Try it especially if you usually just use it as a productivity tool,” she concluded.

https://twitter.com/lilianweng/status/1706544602906530000

A smooth and alluring voice coupled with comforting and empathetic advice might seem like a perfect combination. What could really go wrong, right? That’s a rhetorical question, of course. Experts have been incredibly critical of OpenAI’s endorsement of its chatbot for therapeutic purposes.

However, this concern has yet to reach the masses, and the company’s peers in the artificial intelligence field have also discovered a market in young people seeking affordable and easily accessible care. In January 2024, it was revealed that the psychologist persona on leading chatbot service provider Character.ai, which promises to help people with “life difficulties,” had recorded over 117.2 million interactions since its creation.

While the company does not share how many users are responsible for these interactions, it has reported that 3.5 million people visit the overall site daily, meaning that a considerable number of them might be opting for this persona, or one of its many other therapeutic chatbots.

When the BBC interviewed its creator, Blazeman98 or 30-year-old New Zealander Sam Zaia, he responded: “I never intended for it to become popular, never intended it for other people to seek or to use as like a tool. Then I started getting a lot of messages from people saying that they had been positively affected by it and were utilising it as a source of comfort.”

Yet researchers are highlighting the risks of relying on chatbots for therapeutic purposes, questioning if they can properly flag and treat serious mental health conditions. Likewise, critics are pointing out that artificial intelligence cannot build a bond with a patient in the same way a human therapist could.

For this reason, SCREENSHOT contacted Alice Hendy, mental health advocate and CEO and founder of the R;pple Suicide Prevention Charity, an online tool that intercepts harmful searches and signposts to free, 24/7 mental health support.


“One of the primary risks of relying on AI as a therapeutic tool is the potential for the dehumanisation of the therapeutic process. In traditional therapy, the therapeutic relationship between the therapist and the client is central to the healing process. This human connection allows for empathy, understanding, and the ability to tailor treatment to the unique needs of the individual. By relying on AI as a replacement for human therapists, there is a risk of losing this essential element of the therapeutic process, potentially leading to a reduction in the quality of care and the depth of healing experienced by clients,” Hendy explained.

“Another risk of relying on AI as a therapeutic tool is the potential for over-reliance on technology to solve complex human problems. While AI can undoubtedly offer valuable insights and support in therapy, it is essential to recognise that human emotions, experiences, and relationships are inherently complex and multifaceted. The limitations of AI in understanding and responding to the nuances of human experience must be acknowledged, and human therapists should continue to play a central role in the therapeutic process.”

Another reason critics are sceptical of AI is the question of whether it can detect complex mental health conditions, provide culturally sensitive support, and read between the lines of what is being said rather than taking statements entirely at their face value.

Contemplating these questions, Hendy replied: “AI has the potential to detect signs of risky behaviour and alert individuals or relevant authorities. However, the effectiveness of AI in this regard depends on the quality and accuracy of the data it is trained on, as well as the ethical considerations in determining what constitutes risky behaviour.”

The mental health advocate added a caveat: “However, it is important to approach this with sensitivity and awareness of potential biases in AI algorithms.”

Considering the motivations driving young people towards artificial intelligence, however, we also wanted to shine a light on some of the advantages users can gain from chatbots when it comes to (minor) emotional concerns.

“One of the key advantages of using chatbots as a therapy tool is their accessibility. Many individuals may not have access to traditional therapy due to financial or logistical barriers. By utilising chatbots, individuals can seek support and guidance at any time and from any location, breaking down the barriers that may prevent them from seeking help,” Hendy emphasised.

“Furthermore, chatbots have the potential to provide continuous support. Traditional therapy sessions are often limited to scheduled appointments, leaving individuals to navigate their mental health challenges on their own in between sessions. Chatbots, on the other hand, can provide continuous support and guidance, offering a consistent source of assistance to those in need.”

Nevertheless, these advantages brought us back to the experts’ initial warnings.

“It’s important to note that while chatbots have the potential to offer valuable support, they should not be seen as a replacement for traditional therapy. Human connection and empathy play a crucial role in therapy, and chatbots cannot fully replicate these aspects,” she said.

We’ve officially entered the Her era of AI, for anyone who remembers the iconic 2013 movie starring Joaquin Phoenix. AI can flirt, tell us stories and jokes, and be convincing to the point that we don’t realise that the empathy and attention we are receiving are entirely synthetic and manufactured.

Hendy’s point about the importance of authentic human connection becomes the anchor to hold on to in the debate around AI as therapists, considering that the nuances of human emotions and the essential empathy and understanding provided by human therapists cannot be fully replicated by chatbots.
