Vaccine misinformation on Facebook is nothing new—remember in 2018, when parents in the US knowingly chose not to vaccinate their children against measles, eventually leading to outbreaks of the disease across the country? Now, almost a year into the coronavirus pandemic, Facebook is finally taking action against vaccine misinformation by banning it entirely from the platform.
The ban won’t just apply to COVID-19 vaccine misinformation. Posts promoting broader false claims, such as that vaccines cause autism or that measles can’t kill people, are no longer allowed on Facebook either. On top of that, the platform also announced plans to encourage Americans to get vaccinated by directing users to accurate information about when exactly they’re eligible for a COVID-19 vaccine and how to find an available dose.
The move, as simple as it may sound, is bound to have a significant impact on curbing vaccine misinformation. As reported by Vox, “With nearly 3 billion users, Facebook is one of the most influential social media networks in the world.” As vaccinations roll out around the world, Facebook remains one of the internet’s biggest hotspots for fake news. As a result, many are concerned that misinformation about the coronavirus could exacerbate some people’s refusal or hesitancy to get vaccinated.
In a statement published on Monday, 8 February, the company further explained that these changes are part of what it’s calling the “largest worldwide campaign” to promote authoritative information about COVID-19 vaccinations. Before its rollout, the effort was developed in consultation with health authorities such as the World Health Organization (WHO), and will include elevating reputable information from trustworthy organisations like the United Nations (UN) and various health ministries.
A list of banned vaccine claims, drawn up with the help of health authorities, is also available. “The new approach seems similar to Facebook’s US voter registration initiative, which the company claims helped sign up several million people to participate in the November election,” adds Vox.
“We’ve helped health authorities reach billions of people with accurate information and supported health and economic relief efforts,” wrote Kang-Xing Jin, Facebook’s head of health, on Monday. “But there’s still a long road ahead, and in 2021 we’re focused on supporting health leaders and public officials in their work to vaccinate billions of people against Covid-19.”
Now here comes the ‘but’—just because Facebook is saying its guidelines about vaccine misinformation are changing doesn’t mean that vaccine misinformation won’t end up on the platform anyway. In order to truly tackle anti-vaxxer propaganda, the company will need to put some serious effort into enforcing its new rules.
So far, Facebook has yet to confirm whether it will increase its investment in content moderation, given the expanded scope of its vaccine misinformation policy. What is certain, however, is that expanding enforcement will take time, as Facebook’s content moderators and systems will need to be trained on the new rules.
Still, this announcement comes as a pleasant surprise to many, considering that Facebook’s CEO Mark Zuckerberg has repeatedly invoked principles of free expression to justify the platform’s inaction on many occasions. Zuckerberg now says that the company will pay particular attention to pages, groups, and accounts on both Facebook and Instagram (which Facebook owns) that regularly share vaccine misinformation, and may remove them entirely. It will also adjust its search algorithms to reduce the prominence of anti-vaxxer content.
Back in November 2020, experts warned that social media platforms would be walking a delicate line when it comes to the global vaccine effort. “While social networks should promote accurate information about Covid-19 inoculations,” they said, “platforms must also leave room for people to express honest questions about these relatively new vaccines.”
Like other enforcement actions Facebook has taken on everything from QAnon and Frazzledrip conspiracy theories to incitements of violence posted by Donald Trump, many say the company’s move is too little, too late. For years, researchers have repeatedly flagged Facebook as a platform where misleading information about vaccines proliferates. The pushback against COVID-19 vaccines is bound to be on an even bigger scale. Let’s just hope Facebook’s new commitment is actually enforced, not just proclaimed.