Facebook vows to ban Taliban-related content. How exactly?

By Monica Athnasious

Published Aug 17, 2021 at 01:27 PM

Reading time: 3 minutes

Facebook has told BBC News that it has now officially banned Taliban-related content from its platform. The news comes after a devastating week in Afghanistan, during which the extremist organisation took the country's capital, Kabul, unopposed and seized the presidential palace. The Taliban, like other radical groups, has often used social media to spread its messages and propaganda. Now, in the wake of this swift takeover, questions are being raised about the challenges social media giants will face in dealing with content made by the Taliban.

In the BBC report, Facebook explained that it follows the “authority of the international community” when making such decisions. A spokesperson for the social media giant told the BBC, “The Taliban is sanctioned as a terrorist organisation under US law and we have banned them from our services under our Dangerous Organisation policies. This means we removed accounts maintained by or on behalf of the Taliban and prohibit praise, support, and representation of them.”

The spokesperson added that the team dedicated to this moderation effort would comprise Afghanistan experts, namely native speakers with in-depth knowledge of the local context who would be better placed to spot problematic content on the platform.

Facebook’s ‘Dangerous Organisation’ policies

Facebook’s Dangerous Organisation moderation policies state that entities which proclaim a “violent mission” are assessed under three tiers. Tier 1 focuses on those that engage in violence, hatred and human rights violations offline, including “terrorist, hate and criminal organisations.” Under this tier, the platform also prohibits praise of content that showcases violence, terrorism, murder or racist ideologies, as well as praise of the perpetrator of such hatred.

Its second tier “focuses on entities that engage in violence against state or military actors but do not generally target civilians—what we call ‘Violent Non-State Actors.’ We remove all substantive support and representation of these entities, their leaders, and their prominent members. We remove any praise of these groups’ violent activities.”

Under the third and final tier, any content containing hate speech or demonstrating an intent to engage in violence is also removed. This includes “militarised social movements, violence-inducing conspiracy networks and individuals and groups banned for promoting hatred.”

Moderation failures

In spite of these regulations, social media platforms, and Facebook especially, have a long history of failing to enforce such policies. An obvious example is the racist abuse England players suffered online after the Euro 2020 final. Recent studies have also suggested that the use of emojis in abusive posts can shield them from being taken down due to flaws in moderation algorithms. Even in the Taliban’s case, enforcement has yet to prove effective: Reuters reported that WhatsApp, Facebook’s end-to-end encrypted messaging service, is still being used by Taliban members to communicate with Afghans, despite the group’s prohibition under the Dangerous Organisation policies.

Facebook’s Dangerous Organisation policies are not free of criticism either. Although Facebook claims the policies recognise “neutral content”, where users comment on, report or discuss these subjects, and allow room for such discussions to take place, the company has wrongfully taken down posts like these in the past. Critics argue that its definitions are too broad and have called on the company to spell them out in more detail.

Not only this, but Facebook has yet to mention or address the impact that monitoring Taliban content will have on its employees. The individuals being recruited for the job (as mentioned earlier) will be Afghans themselves, and proximity to this content could be incredibly detrimental to their mental health. It was only last year that the company agreed to pay $52 million to its content moderators over post-traumatic stress disorder (PTSD) claims, as first investigated by The Verge.

No uniformity

Other social media platforms have also been questioned about how they will approach the issue, as many are split on the proper solution and have not followed in Facebook’s footsteps. Taliban spokesmen have reportedly been using Twitter to communicate direct updates on the takeover to hundreds of thousands of followers. When asked about the group’s use of the platform by Reuters, Twitter pointed to its policies on violent content but declined to disclose how the group is classified or defined.

In its investigations, Reuters was unable to get YouTube to confirm whether it had banned the Taliban outright. The video platform stated only that it relies on governments’ definitions of what constitutes a “Foreign Terrorist Organisation” (FTO) in order to make any enforcement decisions. As YouTube pointed out, the US does not currently designate the Taliban as an FTO; instead, the group is categorised as a “Specially Designated Global Terrorist.” Does the platform genuinely require these specific definitions, or is it a get-out-of-jail-free card?
