As young people turn to chatbots for therapy, we ask a mental health expert about the consequences

By Abby Amoakuh

Updated May 30, 2024 at 04:18 PM

Reading time: 4 minutes


As chatbots are being programmed to talk, laugh, sing, and become more lifelike overall, Gen Zers are increasingly turning to them for advice and comfort. And who could really blame them?

NHS waiting times for talking therapy can range from four to five months, while the average cost of private counselling currently sits between £40 and £80 per session. Amid coinciding mental health and cost of living crises, traditional therapy looks extremely inaccessible and expensive.

Enter OpenAI. The company rolled out TherapistGPT last year, an “AI designed to provide comfort, advice, and therapeutic support to those seeking mental wellness guidance.”

OpenAI’s Lilian Weng, head of safety systems, even posted on X (formerly known as Twitter) that she had held a “quite emotional, personal conversation with ChatGPT in voice mode” about stress and work-life balance.

“Interestingly I felt heard and warm. Never tried therapy before but this is probably it. Try it especially if you usually just use it as a productivity tool,” she concluded.

https://twitter.com/lilianweng/status/1706544602906530000

A smooth and alluring voice coupled with comforting and empathetic advice might seem like a perfect combination. What could really go wrong, right? That’s a rhetorical question, of course. Experts have been incredibly critical of OpenAI’s endorsement of its chatbot for therapeutic purposes.

However, this concern hasn’t yet reached the masses, and the company’s peers in the artificial intelligence field have also discovered a market in young people seeking affordable and easily accessible care. In January 2024, it was revealed that the psychologist persona on leading chatbot service provider Character.ai, which promises to help people with “life difficulties,” had recorded over 117.2 million interactions since its creation.

While the company does not share how many users are responsible for these interactions, it has reported that 3.5 million people visit the overall site daily, meaning that a considerable number of them might be opting for this one or one of its many other therapeutic chatbots.

When the BBC interviewed its creator, Blazeman98, a 30-year-old New Zealander named Sam Zaia, he responded: “I never intended for it to become popular, never intended it for other people to seek or to use as like a tool. Then I started getting a lot of messages from people saying that they had been positively affected by it and were utilising it as a source of comfort.”

Yet researchers are highlighting the risks of relying on chatbots for therapeutic purposes, questioning whether they can properly flag and treat serious mental health conditions. Likewise, critics point out that artificial intelligence cannot build a bond with a patient in the same way a human therapist can.

For this reason, SCREENSHOT contacted Alice Hendy, mental health advocate and CEO and founder of the R;pple Suicide Prevention Charity, an online tool that intercepts harmful searches and signposts to free, 24/7 mental health support.

 

A post shared by R;pple (@ripplesuicideprevention)

Addressing these concerns, Hendy explained: “One of the primary risks of relying on AI as a therapeutic tool is the potential for the dehumanisation of the therapeutic process. In traditional therapy, the therapeutic relationship between the therapist and the client is central to the healing process. This human connection allows for empathy, understanding, and the ability to tailor treatment to the unique needs of the individual. By relying on AI as a replacement for human therapists, there is a risk of losing this essential element of the therapeutic process, potentially leading to a reduction in the quality of care and the depth of healing experienced by clients.”

“Another risk of relying on AI as a therapeutic tool is the potential for over-reliance on technology to solve complex human problems. While AI can undoubtedly offer valuable insights and support in therapy, it is essential to recognise that human emotions, experiences, and relationships are inherently complex and multifaceted. The limitations of AI in understanding and responding to the nuances of human experience must be acknowledged, and human therapists should continue to play a central role in the therapeutic process.”

Another reason critics are sceptical of AI is the question of whether it can detect complex mental health conditions, provide culturally sensitive support, and read between the lines of what is being said rather than taking statements entirely at face value.

Contemplating these questions, Hendy replied: “AI has the potential to detect signs of risky behaviour and alert individuals or relevant authorities. However, the effectiveness of AI in this regard depends on the quality and accuracy of the data it is trained on, as well as the ethical considerations in determining what constitutes risky behaviour.”

The mental health advocate also warned: “However, it is important to approach this with sensitivity and awareness of potential biases in AI algorithms.”

Considering the motivations driving young people towards artificial intelligence, though, we still wanted to shine a light on some of the advantages users can gain from turning to chatbots for (minor) emotional concerns.

“One of the key advantages of using chatbots as a therapy tool is their accessibility. Many individuals may not have access to traditional therapy due to financial or logistical barriers. By utilising chatbots, individuals can seek support and guidance at any time and from any location, breaking down the barriers that may prevent them from seeking help,” Hendy emphasised.

“Furthermore, chatbots have the potential to provide continuous support. Traditional therapy sessions are often limited to scheduled appointments, leaving individuals to navigate their mental health challenges on their own in between sessions. Chatbots, on the other hand, can provide continuous support and guidance, offering a consistent source of assistance to those in need.”

Nevertheless, these advantages still bring us back to the experts’ initial warnings.

“It’s important to note that while chatbots have the potential to offer valuable support, they should not be seen as a replacement for traditional therapy. Human connection and empathy play a crucial role in therapy, and chatbots cannot fully replicate these aspects,” she said.

We’ve officially entered the Her era of AI, if anyone remembers the iconic 2013 movie with Joaquin Phoenix. AI can flirt, tell us stories and jokes, and be convincing to the point that we don’t realise that the empathy and attention we are receiving are entirely synthetic and manufactured.

Hendy’s point about the importance of authentic human connection becomes the anchor to hold on to in the debate around AI as therapists, considering that the nuances of human emotions and the essential empathy and understanding provided by human therapists cannot be fully replicated by chatbots.
