
Thinking about: how the metaverse puts kids in danger

From deepfake porn to TikTok sleuthing, 'Thinking about' is a monthly column in which Amy Rose Everett deep-dives into the biggest issues facing gen Z.

Chances are that someone in your family got a virtual reality (VR) headset last Christmas. And if they did, they’re using it to play video games, communicate and explore an immersive new world (broadly known as the metaverse). Before long, this world will come to mirror the planet we live on, layered with augmented elements we can hardly imagine yet.

Of course, that’s all super exciting. Couldn’t get a ticket to Harry Styles’ latest world tour? You’ll be able to watch in the metaverse (and won’t have to queue for the bathroom). Imagine walking your avatar through virtual shopping centres, trying on outfits. Can’t afford flights abroad? Fear not, a VR headset will help you explore anywhere in the world.

Technological advances help us live our best lives, but they make things complicated, too. It’s a sad fact that exploitative, abusive and illegal activity will exist in this expanding virtual world—and already does.

Just last month, the National Society for the Prevention of Cruelty to Children (NSPCC) published evidence that children are being abused in the metaverse, after obtaining police data following a Freedom of Information request. Child abuse image offences have skyrocketed, with 30,000 reported in 2022.

The scariest part? This isn’t only happening in perceived ‘dark corners’ of the virtual world. Massively popular platforms like Fortnite and Roblox make it all too easy for predators to approach vulnerable people, and children. It’s happening on Snapchat, WhatsApp, Instagram and Facebook.

These spaces are basically unregulated, because it’s all so new. Last year, a reporter posed as a 13-year-old girl in a VR app, as part of a BBC investigation. Within minutes, they’d been exposed to explicit content, racist insults and the threat of rape.

The pressure is on for big tech companies to do more to protect their users. Rani Govender, Senior Child Safety Online Policy Officer at the NSPCC, told SCREENSHOT: “Sadly, we know children are contacting Childline about grooming and child abuse images while crimes are rising on the most popular social media sites. We cannot allow VR and the metaverse to become the next tools of choice for offenders to target children for abuse while Silicon Valley executives mouth empty platitudes about safety.”

Govender continued: “Big tech has failed to get its house in order which is why we need a tough Online Safety Bill which includes an expert child safety advocate that can stand up for children and families.”

But online regulation is tricky, and it’s going to take time. Take age verification, for example. You’d think this was a simple way to safeguard young users, but it also throws up a ton of concerns around anonymity and personal privacy, and the potential for ID documents and sensitive data to be leaked. Some apps require a selfie to unlock potentially problematic features; Roblox asks for one to activate voice chat, but that hasn’t stopped slurs and lewd comments once users are approved.

Meanwhile, problems are multiplying. In February 2023, SCREENSHOT covered the impact deepfake porn has had on the life of female streamer QTCinderella. Her likeness was used to make hyper-realistic fake porn videos which racked up millions of views within hours. With minimal protection or laws in place, there’s little she can do to gain justice.

It’s becoming increasingly difficult to determine what is real in the virtual world. Online romance scams like pig butchering are more common than ever. AI-generated photographs threaten to create false narratives about the past. In fact, AI is getting so good that it can beat humans at board games.

So what can we do about all this? I asked the NSPCC for its latest advice. The charity explained that you can protect yourself and the younger or more vulnerable people in your life by learning as much as possible—try out your little brother’s new headset; ask your cousin which games they’re playing and who they’re chatting to. Your parents might not know that online games and virtual environments offer parental controls to minimise exposure to inappropriate content. Why not find out how to access them and help set them up?

The key is to prioritise life in the real world as much as possible, taking regular breaks from the virtual one. Use strong passwords, never reveal personal information, and block and report any suspicious accounts or activity. Pass these practices down to younger siblings, family members and friends, explaining why they are important and warning them of the dangers they might face.

If you know any parents or guardians worried about kids’ safety online, you can help by sharing the NSPCC’s tips:

1. Being screen-time savvy

Spending less time on screens is a great way to improve well-being, both online and offline. Try setting some limits and boundaries surrounding screen time and make use of the well-being settings on apps such as Instagram and TikTok, or on your devices.

2. Navigating the negative

Children can see things online that make them feel upset, angry, or that cause low self-esteem. If this happens, encourage them to mute or block accounts that do this. They can also use settings that prevent words, phrases, or posts they don’t want to see cropping up. Encourage them to make their online space a positive one.

3. Seeing isn’t believing

Remind your child that not everything they see or hear online is true. Encourage them to question what they’re viewing on a regular basis. This includes content that might be making them feel bad about themselves or like they’re missing out. If this does happen, you could talk to your child about filters, Photoshopped images or the fact that people usually post about the best bits of their lives rather than the boring, everyday bits.

4. Mistakes happen

If your child makes a mistake online, such as getting into an argument or sharing personal information, be understanding. You could use this mistake as a learning opportunity! This is a part of building digital resilience, which will help you all feel better about being online.

5. Take the lead

Your children look to you as an example. Make sure you’re acting on your own online wellbeing advice by doing things like taking breaks and not engaging with negative content.

6. Get chatting

Have regular chats with your child about what they like doing online and how it makes them feel. Don’t forget to talk about the positives of being online as much as the negatives, and really listen to what they’re telling you. Children use the internet in a different way from adults and if you show them that you understand the importance of their time online, it might help with more difficult conversations further down the line.

The NSPCC has made a quiz, designed to be taken as a family, which includes more tips. Need advice? You’ll find information on the NSPCC’s online safety page.

How did a 16-year-old boy become radicalised through ISIS-themed Roblox servers?

If you search Roblox on Google, one of the top queries featured in the search engine’s ‘People also ask’ box reads: “Is Roblox appropriate for a 7-year-old?” Far be it from me to present myself as a gaming expert, but up until today, I would probably have answered this question affirmatively. How naive I was.

To put it simply, Roblox is an online game platform and game creation system that lets users play creations made by its community. Within these games, players can also chat with each other, which I should probably have spotted as the platform’s first red flag, considering that over half of its users are under the age of 13. Unfiltered online socialising for underage individuals? Rarely a good thing. But wait, it gets worse.

A 16-year-old Singaporean boy has been detained by the country’s authorities under strict new terror laws after he was found to have been playing on “multiple Islamic State-themed servers on Roblox.” The teenager, who remains anonymous because he is still a minor, “was issued with a restriction order in January, limiting his movements and preventing him from issuing public statements,” the South China Morning Post (SCMP) reported on Tuesday 21 February 2023.

While the restriction order was issued this year, it wasn’t the first time that the young boy had caught the attention of Singaporean authorities. In November 2020, when he was only 14 years old, the country’s Internal Security Department (ISD) decided to keep a close watch on the boy after it was discovered that he had been spending a worrying amount of time role-playing as an ISIS combatant on Roblox. It seems that his online radicalisation has only escalated over the past two and a half years.

Releasing a statement at the time, the ISD explained that the boy had used the social gaming platform to replicate ISIS conflict zones such as Syria and Marawi city in the southern Philippines, and regarded himself as an ISIS member after taking the ‘bai`ah’ (allegiance) to an in-game “ISIS leader.”

He played out his fantasies on the game, where he would shoot and kill enemies and undertake roles as the “spokesperson” and “chief propagandist” for his virtual ISIS faction, the ISD further revealed in its statement.

Things kept on escalating from there, with Channel News Asia (CNA) reporting that “the teen was also attracted to Islamic eschatological prophecies after watching YouTube videos and had come across Islamic State songs from online music streaming platforms.”

Like countless other young and impressionable individuals online, the boy was found to have “an interest in far-right extremist content, including those which were anti-semitic and supportive of neo-Nazi groups whose ideologies promoted a ‘race war’.”

The boy was also alleged to have been in contact with Muhammad Irfan Danyal Bin Mohamad Nor, an 18-year-old who was arrested in December 2022 under Singapore’s sweeping (and highly controversial) Internal Security Act (ISA) laws, which allow the government to imprison terror suspects for up to two years without trial. Irfan had been planning “to set up an Islamic caliphate on Singapore’s Coney Island.”

Another teenage boy—a 15-year-old who is the youngest person to be held under the country’s new law—has been detained since November 2022 after he was arrested for planning to carry out multiple knife attacks across Singapore.

The ISD also stated that this 15-year-old had even thought about beheading non-Muslims in popular tourist areas and becoming a suicide bomber. “At the point of his arrest, the youth was deeply entrenched in his radical views, but had yet to undertake any steps towards actualising his attack ideations,” it added.

According to the SCMP, a total of 11 people under the age of 21 have been punished under the ISA since 2015. Seven were detained and four given restriction orders.