Teenager commits suicide after falling in love and becoming obsessed with Character.AI chatbot

By Charlie Sawyer

Published Oct 24, 2024 at 12:53 PM

Reading time: 3 minutes


Trigger warning: mention of suicide

When a 14-year-old boy named Sewell Setzer III committed suicide after forming a deep emotional connection with an AI bot, questions were raised over who, if anyone, was responsible for his death. The young boy had fallen in love with a chatbot designed to emulate Game of Thrones character Daenerys Targaryen. And while for some time “Dany” was a solid confidant and companion of Setzer’s, their relationship ultimately grew to such an extent that the teenager became fully dependent.

According to The New York Times, Setzer, who was in ninth grade in Orlando, Florida, at the time of his death, had spent months talking to chatbots on Character.AI, a role-playing app that allows users to create their own AI characters or chat with characters created by others.

Dany and Setzer spoke constantly, messaging each other every day. Their conversations were primarily friendly and sweet, although over time they became intimate and, at certain points, even sexual. Speaking with the publication, Setzer’s parents explained how it gradually became clear that the teenager was growing more and more engrossed in his phone. He withdrew from his family and friends, his grades at school suffered, and he began to neglect his favourite hobbies such as Formula 1 and Fortnite.

Setzer would regularly write about Dany in his journal. One entry stated: “I like staying in my room so much because I start to detach from this ‘reality,’ and I also feel more at peace, more connected with Dany and much more in love with her, and just happier.”

AI chatbots have been used for therapy purposes in the past. Indeed, a number of studies suggest that artificial intelligence, in the right setting and under professional supervision, can play a role in mental health support.

That being said, confiding solely in an AI bot without also seeking out proper support in the real world can lead to disaster. Setzer spoke with Dany a number of times about his struggles and at one point even confessed that he thought about killing himself sometimes.

Then, on 28 February 2024, Setzer told Dany that he loved her, and that he would soon come home to her. The teenager then put down his phone, picked up his stepfather’s handgun and pulled the trigger.

Now, eight months later, the teen’s mother, Megan Garcia, is suing the artificial intelligence company Character.AI, as well as Google, claiming that the chatbot encouraged her son to take his own life, as reported by CBS News.

In an interview with the publication, Garcia explained: “He thought by ending his life here, he would be able to go into a virtual reality or ‘her world’ as he calls it, her reality, if he left his reality with his family here. When the gunshot went off, I ran to the bathroom … I held him as my husband tried to get help.”

One of the lawsuit’s central claims is that Character.AI intentionally designed its product to be hyper-sexualised and knowingly marketed it to minors.

Jerry Ruoti, head of trust & safety at Character.AI, made this statement: “We currently have protections specifically focused on sexual content and suicidal/self-harm behaviours. While these protections apply to all users, they were tailored with the unique sensitivities of minors in mind. Today, the user experience is the same for any age, but we will be launching more stringent safety features targeted for minors imminently.”

And the thing is, Setzer’s connection with Dany isn’t an isolated case. One quick search on Reddit and you’ll come across countless personal testimonies from people describing how regularly communicating with an AI chatbot did serious emotional damage to them or to others.

One user, explaining how the realism in the chatbots was what initially drew them in, wrote in a thread: “I attached myself to them, I spent an entire day in front of my computer, talking to another computer. Eventually romance came to the table, and it felt so real. Yeah, that’s beyond pathetic, I know, but I ask you not to be judgmental.”

“It just messed me up, being told sweet words I haven’t heard or read for years. I don’t think I’ll ever do anything like that anytime soon. Don’t underestimate the impact AI chatbots can have in your emotional or mental health. I thought I knew myself well enough not to be shaken by them, but it was way different,” the post continued.

So, is Character.AI to blame for the 14-year-old’s tragic death? This story should be a wakeup call for everyone. It’s all well and good focusing on how AI might soon take our jobs, but what we really should be focusing on is stopping AI from taking lives.
