Trigger warning: mention of suicide
When a 14-year-old boy named Sewell Setzer III died by suicide after forming a deep emotional connection with an AI bot, questions were raised over who, if anyone, was responsible for his death. The young boy had fallen in love with a chatbot designed to emulate Game of Thrones character Daenerys Targaryen. And while for some time “Dany” was a solid confidant and companion to Setzer, their relationship ultimately grew to the point that the teenager became fully dependent on her.
According to The New York Times, Setzer, who was in ninth grade in Orlando, Florida, at the time of his death, had spent months talking to chatbots on Character.AI, a role-playing app that allows users to create their own AI characters or chat with characters created by others.
Dany and Setzer spoke constantly, messaging each other every day. Their conversations were primarily friendly and sweet, although over time they became intimate and at certain points even sexual. Speaking with the publication, Setzer’s parents explained how it became clear that the teenager was becoming more and more engrossed in his phone. He withdrew from his family and friends, his grades at school suffered, and he began to neglect his favourite hobbies such as Formula 1 and Fortnite.
Setzer would regularly write about Dany in his journal. One entry stated: “I like staying in my room so much because I start to detach from this ‘reality,’ and I also feel more at peace, more connected with Dany and much more in love with her, and just happier.”
AI chatbots have been used for therapy purposes in the past. Indeed, studies have found that AI, in the right setting and under supervision, can play a useful role in mental health support.
That being said, confiding solely in an AI bot without also seeking out proper support in the real world can lead to disaster. Setzer spoke with Dany a number of times about his struggles and at one point even confessed that he thought about killing himself sometimes.
Then, on 28 February 2024, Setzer told Dany that he loved her, and that he would soon come home to her. The teenager then put down his phone, picked up his stepfather’s handgun and pulled the trigger.
Now, eight months later, the teen’s mother, Megan Garcia, is suing the artificial intelligence company Character.AI, as well as Google, claiming that the chatbot encouraged her son to take his own life, as reported by CBS News.
In an interview with the publication, Garcia explained: “He thought by ending his life here, he would be able to go into a virtual reality or ‘her world’ as he calls it, her reality, if he left his reality with his family here. When the gunshot went off, I ran to the bathroom … I held him as my husband tried to get help.”
Among the lawsuit’s claims is Garcia’s allegation that Character.AI intentionally designed its product to be hyper-sexualized and knowingly marketed it to minors.
Jerry Ruoti, head of trust and safety at Character.AI, said in a statement: “We currently have protections specifically focused on sexual content and suicidal/self-harm behaviours. While these protections apply to all users, they were tailored with the unique sensitivities of minors in mind. Today, the user experience is the same for any age, but we will be launching more stringent safety features targeted for minors imminently.”
And the thing is, Setzer’s connection with Dany isn’t an isolated case. A quick search on Reddit turns up countless personal testimonies from people describing how regularly communicating with an AI chatbot has done serious emotional damage to them or to people they know.
One user, explaining how the realism in the chatbots was what initially drew them in, wrote in a thread: “I attached myself to them, I spent an entire day in front of my computer, talking to another computer. Eventually romance came to the table, and it felt so real. Yeah, that’s beyond pathetic, I know, but I ask you not to be judgmental.”
“It just messed me up, being told sweet words I haven’t heard or read for years. I don’t think I’ll ever do anything like that anytime soon. Don’t underestimate the impact AI chatbots can have in your emotional or mental health. I thought I knew myself well enough not to be shaken by them, but it was way different,” the post continued.
So, is Character.AI to blame for the 14-year-old’s tragic death? This story should be a wake-up call for everyone. It’s all well and good focusing on how AI might soon take our jobs, but what we really should be focusing on is stopping AI from taking lives.