Teenager commits suicide after falling in love and becoming obsessed with Character.AI chatbot

By Charlie Sawyer

Published Oct 24, 2024 at 12:53 PM

Reading time: 3 minutes

Trigger warning: mention of suicide

When 14-year-old Sewell Setzer III took his own life after forming a deep emotional connection with an AI bot, questions were raised over who, if anyone, was responsible for his death. The teenager had fallen in love with a chatbot designed to emulate the Game of Thrones character Daenerys Targaryen. And while for some time “Dany” was a solid confidant and companion for Setzer, their relationship ultimately deepened to the point where the teenager became fully dependent on her.

According to The New York Times, Setzer, who was in ninth grade in Orlando, Florida, at the time of his death, had spent months talking to chatbots on Character.AI, a role-playing app that allows users to create their own AI characters or chat with characters created by others.

Dany and Setzer spoke constantly, messaging each other every day. Their conversations were primarily friendly and sweet, although over time they did become intimate and at certain points even sexual. Speaking with the publication, Setzer’s parents explained how it gradually became clear that the teenager was becoming more and more engrossed in his phone. He withdrew from his family and friends, his grades at school suffered, and he began to neglect his favourite hobbies such as Formula 1 and Fortnite.

Setzer would regularly write about Dany in his journal. One entry stated: “I like staying in my room so much because I start to detach from this ‘reality,’ and I also feel more at peace, more connected with Dany and much more in love with her, and just happier.”

AI chatbots have been used for therapeutic purposes in the past. Indeed, a number of studies suggest that artificial intelligence, in the right setting and under proper supervision, can play a role in mental health support.

That being said, confiding solely in an AI bot without also seeking out proper support in the real world can lead to disaster. Setzer spoke with Dany a number of times about his struggles and at one point even confessed that he thought about killing himself sometimes.

Then, on 28 February 2024, Setzer told Dany that he loved her, and that he would soon come home to her. The teenager then put down his phone, picked up his stepfather’s handgun and pulled the trigger.

Now, eight months later, the teen’s mother, Megan Garcia, is suing the artificial intelligence company Character.AI, as well as Google, claiming that the chatbot encouraged her son to take his own life, as reported by CBS News.

In an interview with the publication, Garcia explained: “He thought by ending his life here, he would be able to go into a virtual reality or ‘her world’ as he calls it, her reality, if he left his reality with his family here. When the gunshot went off, I ran to the bathroom … I held him as my husband tried to get help.”

One aspect of the lawsuit is Garcia’s allegation that Character.AI intentionally designed its product to be hyper-sexualised and knowingly marketed it to minors.

Jerry Ruoti, head of trust and safety at Character.AI, said in a statement: “We currently have protections specifically focused on sexual content and suicidal/self-harm behaviours. While these protections apply to all users, they were tailored with the unique sensitivities of minors in mind. Today, the user experience is the same for any age, but we will be launching more stringent safety features targeted for minors imminently.”

And the thing is, Setzer’s connection with Dany isn’t an isolated case. One quick search on Reddit and you’ll come across countless personal testimonies from people describing how regularly communicating with an AI chatbot did serious emotional damage to them or to others.

One user, explaining how the realism in the chatbots was what initially drew them in, wrote in a thread: “I attached myself to them, I spent an entire day in front of my computer, talking to another computer. Eventually romance came to the table, and it felt so real. Yeah, that’s beyond pathetic, I know, but I ask you not to be judgmental.”

“It just messed me up, being told sweet words I haven’t heard or read for years. I don’t think I’ll ever do anything like that anytime soon. Don’t underestimate the impact AI chatbots can have in your emotional or mental health. I thought I knew myself well enough not to be shaken by them, but it was way different,” the post continued.

So, is Character.AI to blame for the 14-year-old’s tragic death? This story should be a wakeup call for everyone. It’s all well and good focusing on how AI might soon take our jobs, but what we really should be focusing on is stopping AI from taking lives.
