
From live streamed suicides to accidental teen deaths, here’s the dark side of TikTok

When I think of TikTok, the first things that come to mind are dance choreographies, short-lived challenges and cute animal videos. Ask someone else and they’ll probably mention similar things: the Tim Burton challenge, the singing bowls comeback or the unboxing trend. But no one in their right mind would mention live streamed suicides or teen deaths, right? Well, as it turns out, anyone who did wouldn’t be wrong.

TikTok has already made headlines for its strange way of moderating certain types of content. While harmful comments are not deleted quickly enough, TikTok moderators are (rightly) being accused of discriminatory and racist content moderation. In other words, although we’re quick to glamourise the sensation the video-sharing app has become, we tend to forget or ignore its dark side. This time, let’s not do that. Let’s look at TikTok’s worst aspects so that we can work towards fixing them (setting data privacy problems aside just this once).

In February 2019, a 19-year-old vlogger living in Curitiba, Brazil, took his own life on a TikTok livestream after warning his fans a day earlier that he was planning a “special performance.” Around 280 people watched the man kill himself on the stream, which continued to show his body until TikTok moderators finally took it down. During that time, users posted nearly 500 comments and 15 complaints. It took TikTok three hours to warn police and over an hour and a half to take the video down.

Reportedly, TikTok took steps to prevent the post from going viral before notifying the authorities, and Business Insider reports that the video-sharing app’s first move was to alert its PR team. This story only came to light a year after the incident took place, so TikTok’s PR team evidently did a good job of stifling it.

Then, in September 2020, a video of another man committing suicide by shooting himself in the head with a shotgun began circulating on the app. Though I never saw it myself, I witnessed the mass outcry and shock firsthand in reply videos and their comment sections. Against all odds, the video remained on the platform for several days, and TikTok was heavily criticised for its poor moderation efforts.

In response to the moderators’ inaction, several users posted exchanges they had had with TikTok moderators, who reportedly told them the video “doesn’t violate our Community Guidelines.” In the meantime, users took the matter into their own hands by sharing videos warning others about the presence of the suicide clip on TikTok. “Please try not to go on TikTok today or tomorrow,” one video said. “There is a very graphic and gorey suicide video going around right now!”

[Embedded TikTok video by @alluringskull]

“Please stop treating this like a meme, please stop treating this like a joke, this is a real person who passed and his family is grieving,” said another TikTok user. The app’s moderation guidelines had already been questioned in July, after its algorithm promoted a collection of antisemitic memes soundtracked by the lyrics, “We’re going on a trip to a place called Auschwitz, it’s shower time.” Nearly 100 users featured the song in their videos, which remained on the app for three days.

TikTok’s Transparency Report, published in July 2020, says that the app removed over 49 million videos globally in the second half of 2019, with 98.2 per cent of those taken down before they were reported and 89.4 per cent removed before they received any views. Yet TikTok is also known for censoring users and content that doesn’t violate any guidelines, including a teenager who criticised China, users deemed ugly or disabled, and Black creators.

Fast forward to October 2020, and another death can be linked to TikTok. 21-year-old Areline Martinez was shot in the head by one of her friends in what has been described as an accident, as Mexico News Daily first reported. Martinez was killed while attempting to stage a kidnapping for a TikTok video.

Previous videos posted on Martinez’s TikTok page featured scenes in which she was blindfolded with her hands and feet bound, while men surrounded her and pointed guns at her head. TikTok has since removed these videos. Many of the friends who were involved in the fake kidnapping fled the scene after the killing, though a “behind the scenes” video posted to TikTok before Martinez was killed was used by authorities to identify the individuals.

Undoubtedly, TikTok moderators cannot catch every instance of inappropriate content, but the timeline above clearly highlights how much content goes unnoticed on the app for too long, or is simply ignored by moderators until users start getting involved. TikTok’s content moderation is a time bomb waiting to explode in our faces.

Because teens are using the app not just as a channel for light-hearted fun but also as a space to discuss personal problems, traumas and politics, the more serious the TikTok conversation gets, the more mischief and “coordinated inauthentic behaviour,” as the app calls it, its users will face from bad actors. Even Bill Gates called the prospect of owning TikTok “a poison chalice.” The question that remains is: how can this be stopped?

If you’re struggling with mental health issues and feel like you need help, you can contact the suicide prevention specialists Samaritans in the UK here or the National Suicide Prevention Lifeline in the US here.

What should happen to your social media profiles after you die?

No one lives forever; that’s a fact everyone can agree with, at least for now. While most members of older generations haven’t produced enough digital data to leave ‘digital remains’ after their death, most of gen Z and younger will leave an enormous trail of data across their social media after they’re gone. Creepy? Maybe, but more and more companies want you to start embracing the idea of a digital afterlife. The real problem here is who should have control over someone’s social media pages after they die, and it is one that just keeps on growing.

A few days ago, a study conducted by researchers Carl Öhman and David Watson from the Oxford Internet Institute (OII) showed how quickly Facebook’s user base could be outnumbered by dead users. Öhman and Watson predicted that by 2050, there would be more accounts belonging to deceased users than to living, active people on Facebook.

Most people, when planning their legacy, will think about their possessions and their finances. But what about all the different versions of ourselves we’ve scattered across the internet? What about our hard-drive backups? Digital lives can long outlast their owners, so figuring out what will happen to yours while you’re still alive is worthwhile. To understand what can be done, we first need to know what happens to the accounts of deceased people.

Even though this is a rather new concept, some of the big social media websites like Facebook already offer some form of ‘death planning’. You have two options: the first is to have your account and everything on it deleted once Facebook is notified of your death by someone. The second is to pick someone close to you as your ‘legacy contact’. This special someone will then be able to write a post pinned to the top of your page, accept friend requests and even update your profile picture. The one thing they won’t be able to access is your messages, so your little secrets will be safe.

This is what Facebook calls a memorialised account: a place where your close ones can have a browse and remember you. Memorialised profiles don’t pop up in people’s feeds, to avoid causing distress by, for example, reminding someone of the deceased’s birthday. Instagram only recently followed suit and now also offers to memorialise someone’s account after receiving a valid request.
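For the technically curious, the two options and the legacy contact’s narrow permissions can be pictured as a simple state machine. The sketch below is a minimal model of the rules exactly as the paragraphs above describe them; every class, field and action name is hypothetical and invented for illustration, since Facebook exposes no public API for memorialisation.

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional

# Hypothetical model of the two 'death planning' options described above.
# None of these names come from a real Facebook API; they only illustrate
# the permission boundaries the article describes.

class DeathPlan(Enum):
    DELETE_ON_DEATH = auto()  # wipe everything once Facebook is notified
    MEMORIALISE = auto()      # freeze the account as a memorial page

@dataclass
class Account:
    owner: str
    plan: DeathPlan
    legacy_contact: Optional[str] = None
    memorialised: bool = False

    def notify_of_death(self) -> None:
        """Apply the owner's chosen plan when a death notification arrives."""
        if self.plan is DeathPlan.MEMORIALISE:
            # Frozen in place: excluded from feeds and birthday reminders.
            self.memorialised = True
        # else: the account and its contents would be deleted entirely

    def may(self, user: str, action: str) -> bool:
        """The few things a legacy contact may do on a memorialised account."""
        if not (self.memorialised and user == self.legacy_contact):
            return False
        allowed = {"pin_tribute_post", "accept_friend_request",
                   "update_profile_picture"}
        return action in allowed  # "read_messages" is never on the list

acct = Account("alex", DeathPlan.MEMORIALISE, legacy_contact="sam")
acct.notify_of_death()
assert acct.may("sam", "update_profile_picture")
assert not acct.may("sam", "read_messages")  # private messages stay private
```

The point of the model is the asymmetry: memorialisation grants a short whitelist of actions, while everything else, messages above all, stays locked.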

After their research, the OII researchers called on Facebook to invite historians to work out how to curate our digital data post-mortem. What we leave behind when we pass away should be seen as heritage for the next generations and a possible way of helping them understand their history. Historians should not only analyse this data, but approach it as something different from traditional historical data.

In 2018, researcher Hossein Rahnama started working with an unnamed CEO on a special digital avatar, one that would serve as a virtual ‘consultant’ after the actual CEO passes away. Rahnama is now implementing this idea in an application called Augmented Eternity. By drawing on all your digital data, including how you communicate and interact with others online, algorithms can approximate your personality and predict how you would react to almost anything. This may sound like something out of a science fiction movie, but the technology is fast approaching the point where we can, in a sense, live forever on our social media platforms.
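To make the idea less abstract, here is a toy sketch of the simplest possible ‘digital afterlife’ avatar: one that answers new questions by retrieving the closest thing the person actually said. To be clear, this is not how Augmented Eternity works (its internals aren’t public); the corpus and every name below are invented purely for illustration.

```python
# Toy sketch of a retrieval-based 'avatar': answer new questions with the
# archived reply whose original prompt is most similar to the question.
# This is NOT Rahnama's actual system; everything here is illustrative.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical (prompt, reply) pairs mined from someone's digital trail.
corpus = [
    ("should we expand into new markets", "Only once the core product is stable."),
    ("how would you handle a bad quarter", "Cut noise, not people, and over-communicate."),
    ("what makes a good hire", "Curiosity first; skills can be taught."),
]

prompts = [prompt for prompt, _ in corpus]
vectoriser = TfidfVectorizer().fit(prompts)
prompt_vectors = vectoriser.transform(prompts)

def avatar_reply(question: str) -> str:
    """Return the archived reply whose prompt best matches the question."""
    scores = cosine_similarity(vectoriser.transform([question]), prompt_vectors)
    return corpus[scores.argmax()][1]

print(avatar_reply("what makes a good engineering hire?"))
# -> "Curiosity first; skills can be taught."
```

A real system would model far more than word overlap, but the principle is the same: past behaviour becomes the raw material for future responses.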

At the moment, people’s digital legacy is in the hands of companies like Facebook: private companies guided by what is best commercially, not historically. A single commercial company holding what is now the largest archive of human behaviour should be watched carefully, and some thought needs to be put into how this data is stored and used after people’s deaths. Who knows, we might learn a lot from all those likes and embarrassing pictures.

So Facebook, the ball is in your court.