
Hamster versus Lana Del Rey: it’s time to pick a side in TikTok’s latest cult war

Do you remember the rise of the Step Chickens on TikTok last year? The powerful cult, led by ‘mother hen’ Melissa Ong, changed their profile pictures to the iconic blue-tinted selfie of Ong and set daily missions of raiding various comments sections to initiate more users into the cult. This resulted in Ong gaining millions of followers, writing and recording a Step Chickens song, and later launching her own line of merchandise ranging from t-shirts to face masks.

TikTok cults, then, hold massive power, with users joining new ones every day. For this very reason, it wasn’t surprising to see Lana Del Rey stans start a TikTok cult earlier this week. But the rise of a parallel Hamster cult, which has a go at the Lana cult everywhere on the platform, poses a digital dilemma, forcing other users to pick a side.

What is the Lana Del Rey cult?

The Lana Del Rey cult was formed in the wake of the singer’s recent TikTok debut and in anticipation of her studio album, Chemtrails Over The Country Club, set to drop on 19 March. Many users claim to have joined the cult to show their support in response to various controversies and criticisms surrounding the star.

#Lanacult, which has amassed over 97 million views on the platform, requires members to change their profile picture to a photo of the star smiling in front of a car on fire. Initiation processes require users to post pro-Lana comments like “Join the cult of Lana Del Rey or get cooked in soup!” and “Repent to the lord and savior Lana Del Rey before it’s too late” under random videos on the platform. Though not a strict requirement, members of the cult can alternatively share posts declaring their love and support for the cult leader.

What is the Hamster cult?

The Hamster cult was created with the sole purpose of cancelling Lana Del Rey’s cult. “If you don’t know what the Hamster cult is, it’s a TikTok cult of hamsters and the mascot is the meme ‘staring hamster’. Basically, we hate Lana Del Rey and it’s a TikTok war. So join the Hamster cult today!” wrote a member, introducing Twitter to the cult.

The Hamster cult requires users to change their profile picture to that of a viral hamster staring at the camera and to spread anti-Lana comments like “Join the Hamster cult. We don’t mock god or burn bibles.” These comments stem from rumours of the star disrespecting religions and burning bibles in her songs.

Although created with the intent of cancelling Del Rey, the cult has somewhat taken a lighthearted turn with some members showing their love for their furry mascot instead.


How the cult war affects TikTok users

Hamster versus Lana Del Rey is not a pointless TikTok war. The members of both cults are backed by a strong sense of purpose as they try to cultivate their fandoms and bash each other in the comments sections of unrelated TikToks.

Other users on the platform can’t seem to ignore the war either. Picking a side and joining one of these cults is guaranteed to earn users mutual follows from other members. This surge of followers makes the war particularly appealing to those who want to grow their accounts.

However, it becomes problematic when these cult battles spill beyond TikTok. Though the leaders of these cults discourage cyberbullying between members of rival cults, comments often feature personal attacks.

“Check in to see if any of these so-called ‘cults’ your kids might be following are asking them to do something harmful or foolish,” Diana Graber, co-founder of Cyberwise, warns in an interview with Parentology. “Talk to your kids about the wisdom (or lack thereof) of blindly following strangers online, or doing something because ‘everyone’ is doing it and the impact upon their digital reputations,” she adds.

While these cults are yet to become causes for concern, it is always a safe bet to join them for the purpose they were meant for: harmless digital battles to keep one engaged in one of the most uncertain times.

From live streamed suicides to accidental teen deaths, here’s the dark side of TikTok

When I think of TikTok, the first things that come to mind are dance choreographies, short-lived challenges and cute animal videos. Ask someone else and they’ll probably mention similar concepts such as the Tim Burton challenge, singing bowls’ comeback or the unboxing trend. But no one in their right mind would mention live streamed suicides or teen deaths, right? Well, as it turns out, if they did, they wouldn’t be wrong.

TikTok has already made headlines for its strange way of moderating certain types of content. While comments are not getting deleted quickly enough, TikTok moderators are (rightly so) being accused of discriminatory and racist content moderation. In other words, although we’re quick to glamourise the sensation that the video-sharing app has become, we tend to forget or ignore its dark side. This time, let’s not do that—let’s look at TikTok’s worst aspects so that we can work towards fixing them (data privacy problems put aside just this once).

In February 2019, a 19-year-old vlogger living in Curitiba, Brazil, took his own life on a TikTok livestream after warning his fans a day earlier that he was planning a “special performance.” Around 280 people watched the man kill himself on the stream, which continued to show his body until TikTok moderators finally took it down. During that time, users posted nearly 500 comments and 15 complaints. It took TikTok three hours to warn police and over an hour and a half to take the video down.

Reportedly, TikTok took steps to prevent the post from going viral before notifying the authorities, and Business Insider reports that the video-sharing app’s first move was to alert its PR team. This story only came out into the open a year after the incident took place, so TikTok’s PR team obviously did a good job of stifling it.

Then, in September 2020, a video of another man committing suicide by shooting himself in the head with a shotgun began circulating on the app. Despite not seeing it myself, I witnessed the mass outcry and shock firsthand in reply videos and their comments sections. Against all odds, the video remained on the platform for a few days, which resulted in TikTok being heavily criticised for its poor moderation efforts.

In response to the moderators’ inaction, several users ended up posting their exchanges with TikTok moderators, who reportedly told them the video “doesn’t violate our Community Guidelines.” In the meantime, users took the matter into their own hands by sharing videos that warned others about the presence of the suicide clip on TikTok. “Please try not to go on TikTok today or tomorrow,” one video said. “There is a very graphic and gorey suicide video going around right now!”


“Please stop treating this like a meme, please stop treating this like a joke, this is a real person who passed and his family is grieving,” said another TikTok user. In July, the app’s moderation guidelines were questioned once again, after its algorithm promoted a collection of anti-semitic memes soundtracked by the lyrics, “We’re going on a trip to a place called Auschwitz, it’s shower time.” Nearly 100 users featured the song in their videos, which remained on the app for three days.

TikTok’s Transparency Report, published in July 2020, says that the app removed over 49 million videos globally in the second half of last year, with 98.2 per cent of those taken down before they were reported. 89.4 per cent of these were removed before they received any views. Yet TikTok is known for censoring users and content that doesn’t violate any guidelines, including a teenager who criticised China, users deemed ugly or disabled, and Black creators.

Fast forward to October 2020, and another death can somehow be linked to TikTok. 21-year-old Areline Martinez was shot in the head by one of her friends in what has been referred to as an accident, as Mexico News Daily first reported. Martinez was killed while attempting to stage a kidnapping for a TikTok video.

Previous videos posted on Martinez’s TikTok page featured scenes in which she was blindfolded with her hands and feet bound, while men surrounded her and pointed guns at her head. TikTok has since removed these videos. Many of the friends who were involved in the fake kidnapping fled the scene after the killing, though a “behind the scenes” video posted to TikTok before Martinez was killed was used by authorities to identify the individuals.

Undoubtedly, TikTok moderators cannot catch every instance of inappropriate content, but the timeline above clearly highlights how much content goes unnoticed on the app for too long—or is sometimes simply ignored by moderators until users start getting involved. TikTok’s content moderation is a time bomb waiting to explode in our faces.

Because teens are using the app not just as a channel for light-hearted fun but also as a space to discuss personal problems, traumas and politics, the more serious the TikTok conversation gets, the more potential mischief and “coordinated inauthentic behaviour,” as the app calls it, its users will face from bad actors. Even Bill Gates called TikTok “a poisoned chalice.” The question that remains now is: how can this be stopped?

If you’re struggling with mental health issues and feel like you need help, you can contact the suicide prevention specialists Samaritans in the UK here or the National Suicide Prevention Lifeline in the US here.