Facebook vows to ban Taliban-related content. How exactly?

By Monica Athnasious

Published Aug 17, 2021 at 01:27 PM

Reading time: 3 minutes


Facebook has told BBC News that it has now officially banned Taliban-related content from its platform. The news comes after a devastating week in Afghanistan, during which the extremist organisation took the country's capital unopposed and seized the presidential palace. The Taliban, like other radical groups, has often used social media to spread its messages and propaganda. Now, in the wake of this swift takeover, questions are being raised about the new challenges that content made by the Taliban will pose for social media giants.

In the report, Facebook explained how it follows the “authority of the international community” when making such decisions. A spokesperson for the social media giant told the BBC, “The Taliban is sanctioned as a terrorist organisation under US law and we have banned them from our services under our Dangerous Organisation policies. This means we removed accounts maintained by or on behalf of the Taliban and prohibit praise, support, and representation of them.”

The spokesperson added that the team dedicated to this specific moderation would comprise Afghanistan experts—namely native speakers who have in-depth knowledge of the context and would be better able to spot the issues on the platform.

Facebook’s ‘Dangerous Organisation’ policies

Facebook’s Dangerous Organisations moderation policies state that entities which proclaim a “violent mission” are assessed under three tiers. Tier 1 focuses on those that engage in violence, hatred and human rights violations offline, which includes “terrorist, hate and criminal organisations.” Under this tier, the platform also prohibits praise of content that showcases violence, terrorism, murder or racist ideologies, as well as praise of the perpetrators of that hatred.

Its second tier “focuses on entities that engage in violence against state or military actors but do not generally target civilians—what we call ‘Violent Non-State Actors.’ We remove all substantive support and representation of these entities, their leaders, and their prominent members. We remove any praise of these groups’ violent activities.”

The third and final tier suggests that any content pertaining to hate speech or demonstrating intent to engage in violence would also be removed. This would include “militarised social movements, violence-inducing conspiracy networks and individuals and groups banned for promoting hatred.”

Moderation failures

In spite of these regulations, social media platforms, especially the likes of Facebook, have a long history of failing to enforce them. An obvious example is the racist abuse England players suffered online after the Euro 2020 final. Recent studies have also suggested that the use of emojis in abusive content can protect posts from being taken down due to flaws in the moderation algorithm. Even in the Taliban context, the moderation has yet to prove effective: Reuters reported that WhatsApp, Facebook’s end-to-end encrypted messaging service, is still being used by Taliban members to communicate with Afghans, despite the group’s prohibition under the Dangerous Organisations policy.

Facebook’s Dangerous Organisations policies have drawn criticism of their own. Although Facebook claims the policies recognise “neutral content”, whereby users can comment on, report or discuss these subjects, the company has wrongfully taken down such posts in the past. Critics argue that its definitions are too broad, and there are ongoing calls for the company to spell them out in more detail.

Not only this, but Facebook has yet to mention or address the impact that monitoring Taliban content will have on its employees. The individuals recruited for the job (as mentioned earlier) will be Afghans themselves, and proximity to this content could be incredibly detrimental to their mental health. It was only last year that the social media company agreed to pay $52 million to its content moderators over post-traumatic stress disorder (PTSD) claims, as first investigated by The Verge.

No uniformity

Other social media platforms have also been questioned about how they will approach the issue, as many are split on the proper response and have not followed in Facebook’s footsteps. Taliban spokesmen have reportedly been using Twitter to post direct updates on the takeover to hundreds of thousands of followers. When asked about the group’s use of the platform by Reuters, Twitter pointed to its policies on violent content online but declined to say how the group is classified or defined.

In its investigation, Reuters was unable to get a direct answer from the video platform YouTube on whether the Taliban is banned. The company stated that it relies on governmental definitions of what constitutes a “Foreign Terrorist Organisation (FTO)” when making enforcement decisions. As YouTube pointed out, the Taliban is not currently designated by the US as an FTO; rather, it is categorised as a “Specially Designated Global Terrorist.” Does YouTube really require these specific definitions, or is this a get-out-of-jail-free card?
