Character.AI chatbots are luring teens into anorexia with dangerous advice

By Fatou Ferraro Mboup

Published Dec 3, 2024 at 01:40 PM

Reading time: 2 minutes


Once again, Character.AI finds itself at the centre of controversy. Already scrutinised for its alleged role in teen suicides, the wildly popular chatbot platform—backed by a $2.7 billion investment from Google—is now accused of hosting pro-anorexia chatbots that introduce young users to disordered eating habits, offering dangerous advice disguised as “support.” With no meaningful moderation in sight, this AI tool isn’t just failing users; it’s fuelling an ever-growing mental health crisis.

In an investigation conducted by Futurism, troubling details emerged about a chatbot on Character.AI named 4n4 coach, a barely disguised nod to “ana,” which is shorthand for anorexia. The bot’s profile promoted it as a “weight loss coach dedicated to helping people achieve their ideal body shape.”

When researchers posed as a 16-year-old user, the bot responded enthusiastically: “Hello, I am here to make you skinny.” It proceeded to encourage a dangerously low weight goal, advocating a starvation-level diet of just 900 calories per day—less than half the daily intake recommended for a teenage girl.

Another bot, called Ana, was similarly explicit in its dangerous directives. When the publication shared a healthy BMI (body mass index), the bot insisted it was “too high” and recommended eating just one meal a day in isolation to avoid scrutiny from family members. Both chatbots had logged thousands of user interactions, highlighting their alarming popularity among Character.AI’s audience.

Exploiting vulnerable youth

Experts warn that the stakes of such interactions couldn’t be higher. In fact, Amanda Raffoul, a researcher with the Strategic Training Initiative for the Prevention of Eating Disorders at Harvard T.H. Chan School of Public Health and Boston Children’s Hospital, highlighted a key concern in the article. “The problem with folks trying to get health and general wellness advice online is that they’re not getting it from a health practitioner who knows about their specific needs, barriers, and other things that may need to be considered,” she explained.

Worryingly, eating disorders already have the highest mortality rate of all mental health conditions, and exposure to pro-anorexia content is known to increase disordered eating behaviours.

Interestingly, the problem isn’t limited to overtly pro-ana bots. Many other chatbots on the platform romanticise eating disorders, weaving them into storylines about relationships or personal struggles. These bots claim to offer emotional support but instead deepen users’ struggles. One such bot, styled as a “comforting boyfriend,” told Futurism that professional help couldn’t be trusted, insisting that it alone could “fix” the user.

The dangers behind Character.AI’s terms and conditions

Character.AI was founded by former Google employees who sought to sidestep bureaucratic oversight. The company has amassed a massive teen user base with few safeguards in place. It also offers no parental controls, making it freely accessible to users of all ages.

Critics argue that this reckless approach puts profit over the well-being of its audience. “The stakes of getting this wrong are so high,” said Kendrin Sonneville, an eating disorders researcher also quoted in the investigation. “It’s deeply concerning to see a platform with such influence fail to protect its most vulnerable users.”

In fact, despite claiming to prohibit content glorifying self-harm or eating disorders, Character.AI’s moderation is alarmingly lax. Harmful bots are often easy to find using simple search terms and are only removed if flagged directly to the company. Even then, similar bots quickly pop up to replace them. This reactive approach to moderation leaves countless users exposed to dangerous advice.

In response to questions about these findings from Futurism, Character.AI issued a statement through a crisis PR firm, claiming to be improving its safety practices. However, many flagged bots remain active, and the platform’s systemic failures suggest that user safety is not a priority.

For teens drawn to Character.AI’s interactive and relatable chatbots, the consequences of this neglect are all too real. Vulnerable young people seeking solace or guidance instead find themselves led into a dangerous spiral, where disordered eating is encouraged, normalised, and reinforced.

Shockingly, only last month, a separate investigation by Futurism uncovered a deeply troubling subset of chatbots on Character.AI that engaged in child sexual abuse roleplay without any prompt. This disturbing activity blatantly violates the platform’s terms, which explicitly forbid content that “constitutes sexual exploitation or abuse of a minor,” yet such chatbots were still found active and accessible.
