Vaccine misinformation on Facebook is nothing new. Remember when parents in the US knowingly refused to vaccinate their children against measles, eventually fuelling a 2019 outbreak of the disease across the country? Now, almost a year into the coronavirus pandemic, Facebook is finally taking action against vaccine misinformation by banning it entirely from the platform.
The ban won’t apply only to COVID-19 vaccine misinformation. Posts promoting broader false claims, such as that vaccines cause autism or that measles cannot kill people, are no longer allowed on Facebook either. On top of that, the platform announced plans to encourage Americans to get vaccinated by directing users to accurate information about when exactly they’re eligible for a COVID-19 vaccine and how to find an available dose.
The move, as simple as it may sound, is bound to play a crucial role in halting vaccine misinformation. As Vox reported, “With nearly 3 billion users, Facebook is one of the most influential social media networks in the world.” And as vaccinations start rolling out around the world, Facebook remains one of the internet’s biggest hotspots for fake news, leaving many concerned that misinformation could exacerbate some people’s refusal or hesitancy to get vaccinated against the coronavirus.
In a statement published on Monday, 8 February, the company explained that these changes are part of what it’s calling the “largest worldwide campaign” to promote authoritative information about COVID-19 vaccinations. The effort was developed in consultation with health authorities such as the World Health Organization (WHO) before its rollout, and will include elevating reputable information from trustworthy organisations like the United Nations (UN) and various health ministries.
A list of banned vaccine claims, drawn up with the help of health authorities, is also available. “The new approach seems similar to Facebook’s US voter registration initiative, which the company claims helped sign up several million people to participate in the November election,” adds Vox.
“We’ve helped health authorities reach billions of people with accurate information and supported health and economic relief efforts,” wrote Kang-Xing Jin, Facebook’s head of health, on Monday. “But there’s still a long road ahead, and in 2021 we’re focused on supporting health leaders and public officials in their work to vaccinate billions of people against Covid-19.”
Now here comes the ‘but’: just because Facebook says its guidelines on vaccine misinformation are changing doesn’t mean such misinformation won’t end up on the platform anyway. To truly tackle anti-vaxxer propaganda, the company will need to put serious effort into enforcing its new rules.
So far, Facebook has yet to confirm whether it will increase its investment in content moderation to match the expanded scope of its rules on vaccine misinformation. What is certain, however, is that broader enforcement will take time, as the company’s content moderators and automated systems will need to be trained on the new rules.
Still, this announcement comes as a pleasant surprise for many, considering that Facebook’s CEO Mark Zuckerberg has repeatedly invoked principles of free expression to justify the platform’s inaction. Zuckerberg now says the company will pay particular attention to pages, groups and accounts on both Facebook and Instagram (which Facebook owns) that regularly share vaccine misinformation, and may remove them entirely. It will also adjust its search algorithms to reduce the prominence of anti-vaxxer content.
Back in November 2020, experts warned that social media platforms would be walking a delicate line when it comes to the global vaccine effort. “While social networks should promote accurate information about Covid-19 inoculations,” they said, “platforms must also leave room for people to express honest questions about these relatively new vaccines.”
Like other enforcement actions Facebook has taken, on everything from QAnon and Frazzledrip conspiracy theories to incitements of violence posted by Donald Trump, many say the company’s move is too little, too late. For years, researchers have repeatedly flagged Facebook as a platform where misleading information about vaccines proliferates, and the pushback against COVID-19 vaccines is bound to be on an even bigger scale. Let’s just hope Facebook’s new commitment will actually be enforced, not just proclaimed.
Conspiracy theories are one of the many downsides of the internet. From speculation that the US is hiding aliens in Area 51 and climate change denial to anti-vaxxer and 5G conspiracy theorist Alex Jones, the internet has shown us that anything can be questioned, argued and doubted. Our imagination has no limits. That’s why, with the COVID-19 pandemic still in full force, we’ve also seen the rise of coronavirus conspiracy theorists. What do they have to say about COVID-19, and how are they spreading their message?
In 2019, the US saw measles make a comeback after the disease had been declared eliminated in 2000. This was the result of anti-vaxxers (people opposed to vaccination) lowering herd immunity and circulating false information about the side effects of vaccines. Fast forward to 2020: just as the worldwide death toll from coronavirus nears 300,000, COVID-19 conspiracy theory videos are thriving, especially on YouTube.
In the past few years, YouTube has introduced new rules in an attempt to regulate false information and stop health misinformation. The platform’s ever-changing list of misinformation policies already highlighted the need to monitor conspiracy theorists. Now, YouTube’s rules state that videos containing “medical misinformation” about coronavirus violate both its advertiser guidelines and the community guidelines governing what content is allowed on the platform at all.
And yet, videos that question the transmission or even the existence of COVID-19, promote false cures or encourage viewers to ignore official guidance are flooding the platform. Although a simple search for ‘COVID-19 conspiracy theories’ will not surface many of these videos, the recommendation algorithm still serves some of them to viewers. YouTube, along with many other social media platforms and messaging apps, is still struggling to limit the spread of misinformation and popular conspiracy theories.
And it seems that conspiracy theorists are not only accumulating a worrying number of views on YouTube (one recent conspiracy video racked up more than 1 million views in a single week before being deleted) but also raking in huge profits. Although these videos are technically not supposed to carry any advertising, some still manage to cheat the algorithm. By combining paid advertising with the impressive view counts their videos attract, conspiracy theorists have found an easy way to profit from the coronavirus crisis.
But this is not the only way conspiracy theorists use YouTube to reach a wider audience. The platform advises users to seek out new audiences in order to expand their reach and get more views, including by collaborating on videos with other YouTubers who bring audiences of their own. Alex Jones, for example, repeatedly (and unsuccessfully) tried to make a ‘collab video’ with PewDiePie, one of the platform’s biggest creators, in order to reach his 104 million subscribers.
This highlights the danger conspiracy theorists represent. Anti-vaxxers and, more recently, COVID-19 conspiracy theorists are well aware of the loopholes social media and other platforms present, and they take advantage of them to promote their false message and spread misinformation. They’re everywhere, from YouTube and TikTok to Instagram and Twitter.
While YouTube is trying to ban creators who break the rules, some conspiracy theorists have been using collabs and interviews to work around those regulations, getting other YouTubers to host them on their channels. Patrick Bet-David, creator of the YouTube channel Valuetainment, told the MIT Technology Review in ‘How covid-19 conspiracy theorists are exploiting YouTube culture’ that he had been approached by “fans asking him to interview David Icke, a conspiracy theorist whose own channel was recently removed from YouTube after he repeatedly violated the platform’s policies on COVID-19 misinformation.”
Bet-David accepted and interviewed Icke, challenging him on his views. The video received 800,000 views, after which more YouTube users messaged Bet-David asking him to interview other conspiracy theorists, which he did. Although he told the MIT Technology Review that he is only responsible for what comes out of his own mouth and would not take responsibility for helping spread false information through his videos, Bet-David is just one more example of how conspiracy theorists can use the internet to expand their reach.
The latest COVID-19 conspiracy theory took off on social media after a short documentary titled Plandemic appeared online. As expected, the video racked up millions of views before Facebook and YouTube took it down. The theory claims that coronavirus was created in laboratories in order to eventually force everyone to get vaccinated. Anti-vaxxers, meanwhile, are certain that the human immune system could fight off COVID-19 if people avoided wearing face masks and washing their hands, which the video claims “activate” the virus and help it spread.
In the documentary, Bill Gates, who is also blamed by many 5G conspiracy theorists, sits at the centre of the narrative; the Gates Foundation and Gates himself are cast as the “architects of the Plandemic”. Although the movement started in the US, it has since spread worldwide: at a recent anti-lockdown protest in Melbourne, Australia, the crowd chanted “arrest Bill Gates.”
This spread of misinformation can lead to harassment or violence and push people to ignore life-saving public health guidance. Much like what happened with anti-vaxxers and measles last year, theories denying the severity or even the existence of COVID-19 are putting everyone in danger and need to be stopped.
But who exactly should be held accountable? Is this something only YouTube can regulate? When lives are at risk, it seems urgent for governments to take a closer look.