
How did a 16-year-old boy become radicalised through ISIS-themed Roblox servers?

If you search Roblox on Google, one of the queries featured at the top of the search engine’s ‘People also ask’ box reads: “Is Roblox appropriate for a 7-year-old?” Far be it from me to present myself as a gaming expert, but up until today, I would probably have answered this question affirmatively. How naive I was.

To put it simply, Roblox is an online game platform and game creation system that allows users to play creations made by its community. Within these games, players can also chat with each other, which I should probably have spotted as the platform’s first red flag, considering that over half of its users are under the age of 13. Unfiltered online socialising for underage individuals? Rarely a good thing. But wait, it gets worse.

A 16-year-old Singaporean boy has been detained by the country’s authorities under strict new terror laws after he was found to have been playing on “multiple Islamic State-themed servers on Roblox.” The teenager, who remains anonymous because he is still a minor, “was issued with a restriction order in January, limiting his movements and preventing him from issuing public statements,” the South China Morning Post (SCMP) reported on Tuesday 21 February 2023.

While the restriction order was issued this year, it wasn’t the first time that the young boy had caught the attention of Singaporean authorities. In November 2020, when he was only 14 years old, the country’s Internal Security Department (ISD) decided to keep a close watch on the boy after it was discovered that he had been spending a worrying amount of time role-playing as an ISIS combatant on Roblox. It seems that his online radicalisation has only escalated over the past two and a half years.

Releasing a statement at the time, the ISD explained that the boy had used the social gaming platform to replicate ISIS conflict zones such as Syria and Marawi city in the southern Philippines, and regarded himself as an ISIS member after taking the ‘bai`ah’ (allegiance) to an in-game “ISIS leader.”

He played out his fantasies on the game, where he would shoot and kill enemies and undertake roles as the “spokesperson” and “chief propagandist” for his virtual ISIS faction, the ISD further revealed in its statement.

Things kept on escalating from there, with Channel News Asia (CNA) reporting that “the teen was also attracted to Islamic eschatological prophecies after watching YouTube videos and had come across Islamic State songs from online music streaming platforms.”

Like countless other young and impressionable individuals online, the boy was found to have “an interest in far-right extremist content, including those which were anti-semitic and supportive of neo-Nazi groups whose ideologies promoted a ‘race war’.”

The boy was also alleged to have been in contact with Muhammad Irfan Danyal Bin Mohamad Nor, an 18-year-old who was arrested in December 2022 under Singapore’s sweeping (and highly controversial) Internal Security Act (ISA) laws, which allow the government to imprison terror suspects for up to two years without trial. Irfan had been planning “to set up an Islamic caliphate on Singapore’s Coney Island.”

Another teenage boy—a 15-year-old who is the youngest person to be held under the country’s new law—has been detained since November 2022 after he was arrested for planning to carry out multiple knife attacks across Singapore.

The ISD also stated that the 15-year-old had even thought about beheading non-Muslims in popular tourist areas and becoming a suicide bomber. “At the point of his arrest, the youth was deeply entrenched in his radical views, but had yet to undertake any steps towards actualising his attack ideations,” it added.

According to the SCMP, a total of 11 people under the age of 21 have been punished under the ISA since 2015. Seven were detained and four given restriction orders.

YouTube Shorts’ problematic algorithm is rife with transphobia and misogyny

Lusting after the hype and popularity of media competitor TikTok, YouTube decided to create its very own short-form video-sharing platform: YouTube Shorts. Launched in 2020, the service offered users carousels of 60-second clips, using a so-called “smart algorithm” to predict and deliver an individual’s favourite content—much like the way your For You Page (FYP) might detect a particular interest in (or obsession with, in my case) the ice haircut moulded out of gel and therefore keep you engaged with similar videos related to the odd new hair trend.

Since its kickoff, YouTube Shorts has struggled to compete with its slightly sassier cousin TikTok. In September 2022, the platform tried to combat this by unveiling a new way for creators to earn revenue from the short-video format. The Google-owned streaming service announced on 21 September that it would introduce advertising on Shorts and give creators, hello PNGTubers, 45 per cent of the revenue, as reported by Business Today.

Despite this supposed progress, it should also be highlighted that, as noted by Quartz, YouTube’s advertising earnings are currently at an all-time low—making it the official weakest link in Google’s conglomerate.

Advertisements aside, a greater problem is currently plaguing the fledgling feature as users and creators alike begin to make noise about the downsides of the platform’s algorithm: most significantly, its tendency to promote transphobic and misogynistic content.

On 22 October, author, creator, and YouTube OG Hank Green tweeted: “TikTok definitely remains better than Shorts at not randomly showing me transphobic / misogynistic content just because I like science stuff and video game clips. It’s like ‘We’ve noticed you like physics, might I interest you in some men’s rights?’”

According to Mashable, despite the platform’s overwhelming initial success—reaching more than 1.5 billion monthly users in June 2022—numerous users have now reported being shown transphobic Shorts. And to make matters worse, this type of content is spreading across other social media websites too.

Regarding the growing problem within YouTube Shorts, some have taken to Reddit to air their frustrations. One user emphasised how, due to the nature of the algorithm, you can be served transphobic or harmful content despite having never consumed videos of that kind before—unlike the TikTok FYP, which delivers you content specifically based on your preferences.

“It can happen if you watch a non-transphobic video from someone who’s made transphobic videos. Not that I ever do that deliberately but it’s easily done, you know? I even remember I watched a regular, non-bigoted YouTube reviewer react to the Batwoman trailer and I got days of recommendations of incels screaming about the series,” they wrote.

A second Redditor stated: “I get ads for stuff like Ben Shapiro, Matt Walsh, Jordan Peterson—the list goes on… I watch political stuff sometimes so I get that’s why I’m bombarded with their crap. I report and say I don’t like it every time but nothing happens.”

The thread in question, titled “Has anyone else been getting transphobic recommendations on YouTube CONSTANTLY?”, was posted by u/theblindbunny in r/lgbt.

It should also be noted that TikTok has faced similar criticism in the past. In February 2021, Insider reported that a number of trans creators had warned other members of the LGBTQIA+ community that the app, while allowing them to find a sense of togetherness, was also designed in a way that perpetuated a culture of transphobia and harassment.

Specifically, creators have explained how, unlike other social media platforms where users post content that is often only ever viewed by their list of followers, TikTok broadcasts videos to entire legions of users who may have similar interests but who have never searched for or seen their content before. This can then lead to extreme levels of abuse and harassment if viewers decide to voice their disagreement with the videos in question.

That being said, we know all too well that there remains a serious problem regarding the vicious spread of online hate within these platforms. Often, sites such as YouTube will publish detailed policies condemning malicious online harassment, abuse, and bullying—only to blur the lines later when ‘engagement’ gets mentioned.

Nevertheless, it’s crucial that these conversations continue to happen. Most importantly, because we have seen first-hand how effective hate campaigns such as transphobia and the manosphere’s lethal love child, incelism, are at both radicalising young individuals and turning that radicalisation into real-life harm.