According to a new report in The Wall Street Journal, Facebook has, for years now, operated a little-known programme called ‘XCheck’ or ‘cross check’—allowing celebrities, politicians, and other members of America’s elite to avoid the kinds of moderation policies that the average user is subject to. In other words, they’ve been receiving special treatment because of their fame and have been allowed to play by their own rules.
According to the publication’s investigation, the programme was created in order to avoid “PR fires,” the public backlash that occurs when Facebook makes a mistake affecting a high profile user’s account. Under XCheck, if one of these accounts breaks the rules, the violation is sent to a separate team for review by Facebook employees, rather than by the non-employee moderators who typically review rule-breaking content.
Although Facebook had previously divulged the existence of cross check, it neglected to mention that “most of the content flagged by the XCheck system faced no subsequent review,” as stated in The Wall Street Journal’s report. In one incident, Brazilian football star Neymar da Silva Santos Júnior posted a video containing nude photos of a woman who had accused him of sexual assault. In case you weren’t sure, such a post is a blatant violation of Facebook’s rule against non-consensual nudity, and rule-breakers are typically banned from the platform. And yet the cross check system “blocked Facebook’s moderators from removing the video,” and the post was viewed nearly 60 million times before it was eventually removed. Neymar’s account faced no other consequences.
Other recipients of such privileges have included former President Donald Trump (prior to his two-year suspension from the platform earlier this year), his son Donald Trump Jr., right-wing commentator Candace Owens, and Senator Elizabeth Warren, among others. In most cases, individuals who are ‘whitelisted’—given a pass on moderation enforcement—are unaware that it is happening.
Employees at Facebook seem to have been aware that XCheck is problematic for quite some time. “We are not actually doing what we say we do publicly,” company researchers said in a 2019 memo entitled ‘The Political Whitelist Contradicts Facebook’s Core Stated Principles’. “Unlike the rest of our community, these people can violate our standards without any consequences.”
Last year alone, the cross check system enabled rule-breaking content to be viewed more than 16 billion times before being removed, according to internal Facebook documents cited by The Wall Street Journal. The report also says Facebook ‘misled’ its Oversight Board, which pressed the company on its cross check system back in June when weighing in on how the company should handle Trump’s “indefinite suspension.” The company told the board at the time that the system only affected “a small number” of its decisions and that it was “not feasible” to share more data.
“The Oversight Board has expressed on multiple occasions its concern about the lack of transparency in Facebook’s content moderation processes, especially relating to the company’s inconsistent management of high-profile accounts,” the Oversight Board shared on Twitter. “The Board has repeatedly made recommendations that Facebook be far more transparent in general, including about its management of high-profile accounts, while ensuring that its policies treat all users fairly.”
‘What does Facebook have to say about all that?’ you might be wondering. The social media giant told The Wall Street Journal that its reporting was based on “outdated information” and that the company has been trying to improve the cross check system. “In the end, at the center of this story is Facebook’s own analysis that we need to improve the program,” Facebook spokesperson Andy Stone wrote in a statement. “We know our enforcement is not perfect and there are tradeoffs between speed and accuracy.”
These recent revelations could lead to new investigations into Facebook’s content moderation policies. Already, some information related to cross check has been “turned over to the Securities and Exchange Commission and to Congress by a person seeking federal whistleblower protection,” according to The Wall Street Journal. It’s not a good look, given that just a few weeks ago Facebook received backlash after one of its AI recommendation systems asked users if they wanted to “keep seeing videos about primates” in reference to a newspaper video featuring Black men.
Watch out, Zucko, the cracks are showing.