
TikTok users are using secret hashtags to discuss self harm on the platform

Speaking to Huck, 19-year-old Becca from Essex, who is currently in a psychiatric ward, said she uses TikTok regularly to speak up about her experience with self-harm: “It’s really fun and makes the time go by a little bit quicker.” As it turns out, with more than 13,000 followers to her name, Becca is not the only gen Zer using the video-sharing platform to discuss self-harm.

Only, one thing doesn’t seem quite right: none of Becca’s videos are tagged #selfharm or #selfharmrecovery. That’s because if you search either of these hashtags on TikTok, no videos come up. Instead, you’ll find a support resources page and the Samaritans number.

Like many other social media platforms before it, TikTok reasoned that censoring the hashtag #selfharm would help reduce the spread of self-harm and suicide content online. But for tech-savvy gen Zers like Becca, banning a handful of hashtags didn’t do much; instead, TikTokers created ‘secret’ self-harm hashtags to talk candidly about the triggering topic.

“I think it’s a really good coping mechanism for me, because it allows me to be funny and lighthearted, and it also allows me to use my voice to hopefully help other people,” she told Huck. While her approach certainly sounds like a healthy way to tackle the taboos surrounding self-harm, unmoderated hashtags like the ones Becca uses also represent worrying grey areas where any content is free to circulate.

17-year-old Aoibheann, known as @barcodebabyx on TikTok, is another user who posts under these secret hashtags. Just like Becca, she enjoys using dark humour to talk openly about self-harm. “I find it helps, because people who’ve gone through the same thing relate,” she told Huck. Her honest, upfront approach clearly resonates with a lot of people, as she currently boasts an impressive 22,900 followers on the app.

While it’s clear that some of the content shared under these secret hashtags bears a positive, educational message for other self-harm victims, Aoibheann herself admits to being triggered by specific videos found under them. “It’s not really their fault,” she says of the creators behind the videos that caused her distress. Triggers are complicated, and users sometimes unknowingly upload content which can cause others to spiral: “Like personally, if I see that someone has worse scars than me, I’m like, ‘Oh, I’m not that bad, theirs is worse.’ But that’s not their fault.”

In a way, TikTok has inadvertently made it easier for people to stumble across triggering videos. “I have been triggered by some content. If there’s like, obvious open wounds and they haven’t used appropriate warnings, it can catch you off-guard,” Becca added.

Censoring ‘triggering’ content remains one of the biggest problems social media platforms face, as it isn’t as straightforward as making the platform free of graphic imagery (although that certainly helps). Recovering self-harmers, like any other group, are not a homogeneous mass to which a one-size-fits-all rule can be applied. How do you moderate something where responses differ so widely from person to person?

According to Mental Health First Aid England, 25.7 per cent of women and 9.7 per cent of men aged 16 to 24 report having self-harmed at some point in their life. Research published in February 2021 found that the rate of self-harm among young children in the UK has doubled over the last six years, with an average of 10 children aged between 9 and 12 being admitted to hospital each week after intentionally injuring themselves.

That’s why it is crucial that TikTok’s content moderation practices effectively separate helpful self-harm related posts from harmful ones. Professionals recommend that TikTok follow Instagram’s example, which in 2019 introduced ‘Sensitivity Screens’ to warn users when they’re about to see a post about self-harm.

However, this problem doesn’t start with social media: evidently, there wouldn’t be so many triggering videos if fewer young people were struggling in the first place. As Huck stated, there’s no denying we’re currently in the middle of a youth mental health crisis. In the UK, child and adolescent mental health services (CAMHS) have been particularly neglected, accounting for under one per cent of total NHS spending.

Meanwhile, people aged 10 to 24 years in England and Wales have seen one of the greatest increases in suicide rates over the past decade. So, of course, TikTok should ensure robust content moderation to prevent users from seeing distressing content, but the same amount of energy, attention, and time should also be given to demanding that the government’s policies on child and adolescent mental health services are just as robust.

From live streamed suicides to accidental teen deaths, here’s the dark side of TikTok

When I think of TikTok, the first things that come to mind are dance choreographies, short-lived challenges and cute animal videos. Ask someone else and they’ll probably mention similar concepts such as the Tim Burton challenge, singing bowls’ comeback or the unboxing trend. But no one in their right mind would mention live streamed suicides or teen deaths, right? Well, as it turns out, anyone who did wouldn’t be wrong.

TikTok has already made headlines for its strange way of moderating certain types of content. While harmful comments are not deleted quickly enough, TikTok moderators are (rightly so) being accused of discriminatory and racist content moderation. In other words, although we’re quick to glamourise the sensation that the video-sharing app has become, we tend to forget or ignore its dark side. This time, let’s not do that. Let’s look at TikTok’s worst aspects so that we can work towards fixing them (data privacy problems put aside just this once).

In February 2019, a 19-year-old vlogger living in Curitiba, Brazil, took his own life on a TikTok livestream after warning his fans a day earlier that he was planning a “special performance.” Around 280 people watched the man kill himself on the stream, which continued to show his body until TikTok moderators finally took it down. During that time, users posted nearly 500 comments and 15 complaints. It took TikTok three hours to warn police and over an hour and a half to take the video down.

Reportedly, TikTok took steps to prevent the post from going viral before notifying the authorities, and Business Insider reports that the video-sharing app’s first move was to notify its own PR team. This story only came out into the open a year after the incident took place, so TikTok’s PR team obviously did a good job at stifling it.

Then, in September 2020, a video of another man committing suicide by shooting himself in the head with a shotgun began circulating on the app. Despite not seeing it myself, I witnessed the mass outcry and shock firsthand in reply videos and their comments sections. Against all odds, the video remained on the platform for a few days, which resulted in TikTok being heavily criticised for its poor moderation efforts.

In response to the moderators’ inaction, several users ended up posting the exchanges they had had with TikTok moderators, who reportedly told them the video “doesn’t violate our Community Guidelines.” In the meantime, users took the matter into their own hands by sharing videos that warned others about the presence of the suicide clip on TikTok. “Please try not to go on TikTok today or tomorrow,” one video said. “There is a very graphic and gorey suicide video going around right now!”

(Embedded TikTok video from @alluringskull)

“Please stop treating this like a meme, please stop treating this like a joke, this is a real person who passed and his family is grieving,” said another TikTok user. Earlier, in July 2020, the app’s moderation guidelines had already been called into question after its algorithm promoted a collection of antisemitic memes soundtracked by the lyrics, “We’re going on a trip to a place called Auschwitz, it’s shower time.” Nearly 100 users featured the song in their videos, which remained on the app for three days.

TikTok’s Transparency Report, published in July 2020, says that the app removed over 49 million videos globally in the second half of 2019, with 98.2 per cent of those being taken down before they were reported and 89.4 per cent removed before they received any views. Yet TikTok is also known for censoring users and content that don’t violate any guidelines, including a teenager who criticised China, users deemed ugly or disabled, and Black creators.

Fast forward to October 2020, and another death can arguably be attributed to TikTok. 21-year-old Areline Martinez was shot in the head by one of her friends in what has been referred to as an accident, as Mexico News Daily first reported. Martinez was killed while attempting to stage a kidnapping for a TikTok video.

Previous videos posted on Martinez’s TikTok page featured scenes in which she was blindfolded with her hands and feet bound, while men surrounded her and pointed guns at her head. TikTok has since removed these videos. Many of the friends who were involved in the fake kidnapping fled the scene after the killing, though a “behind the scenes” video posted to TikTok before Martinez was killed was used by authorities to identify the individuals.

Undoubtedly, TikTok moderators cannot catch every instance of inappropriate content, but the timeline above clearly highlights how much content goes unnoticed on the app for too long, or is simply ignored by moderators until users start getting involved. TikTok’s content moderation is a time bomb waiting to explode in our faces.

Teens are using the app not just as a channel for light-hearted fun but also as a space to discuss personal problems, traumas and politics. The more serious the TikTok conversation gets, the more mischief and “coordinated inauthentic behaviour,” as the app calls it, its users will face from bad actors. Even Bill Gates called TikTok “a poison chalice.” The question that remains now is: how can this be stopped?

If you’re struggling with mental health issues and feel like you need help, you can contact the suicide prevention specialists Samaritans in the UK here or the National Suicide Prevention Lifeline in the US here.