“I’ve read a lot of posts on Facebook about how many died in other countries because of vaccines, and how that’s being concealed,” said Gerry Casida, a 43-year-old construction worker from Manila, Philippines, in an interview with Bloomberg. “My mom also consulted a folk healer, who said the vaccines could affect my heart,” he added.
In Southeast Asia, millions of people like Casida are delaying inoculation or simply saying no, despite the region’s recent struggle with its many virus hotspots, swayed by disinformation on social media from both local sources and anti-vaccination movements in the US.
Those false—and false is the key word here—claims are fueling vaccine hesitancy in some pockets of the region, undermining efforts to immunise some of the most vulnerable people in Asia and end a pandemic that has stalled the world, along with its economy.
“Despite some of the highest rates of new cases in the world, recent surveys show vaccine resistance is prevalent in the region,” Bloomberg noted. According to polling company Social Weather Stations (SWS), in the Philippines, 68 per cent of people are either uncertain or unwilling to take a COVID-19 shot. A third of Thais have doubts or refuse to be vaccinated, according to a poll shared by the Bangkok Post, while a separate survey in Indonesia showed that nearly a fifth of the country’s population were hesitant.
So far, less than 10 per cent of the population in Thailand and the Philippines have received even one shot. In countries already struggling with limited supplies, anti-vaccination propaganda is an additional reason for such small numbers.
“It is a polluted media landscape,” Melissa Fleming, the United Nations’ under-secretary-general for global communications, said at a virtual forum in May. “This infodemic has shifted now, and the focus is misinformation on vaccines. It’s about instilling fear in people.”
More than 86 per cent of the Philippines’ population is Roman Catholic, with 6 per cent belonging to various nationalised Christian cults, and another 2 per cent belonging to well over 100 Protestant denominations. In other words, it is a heavily Catholic country, and this plays a part in its current vulnerability to anti-vaxxer misinformation.
Among the many Facebook discussion groups focused on anti-vaccination theories trawled by Bloomberg, one video in Filipino claimed COVID-19 shots will brand people with the “mark of the beast,” alluding to the Antichrist in Christian eschatology. It got more than a thousand views. The video, along with much other coronavirus-related propaganda, was originally shared online by US evangelical Christian groups and then filtered through church and family networks.
Another English-language video with hundreds of views said the vaccine makes recipients magnetic. Meanwhile, in Malaysia, misinformation ranging from exaggerated risks to life and organs to claims of genetic alteration is spreading on the Facebook-owned messaging service WhatsApp. Many of these messages twist and amplify arguments made by US politicians and by Michael Yeadon, a former Pfizer scientist and COVID-vaccine sceptic who has become an icon for anti-vaxxers.
Other popular conspiracy theories being spread on social platforms across the region include a claim that microchips in COVID-19 vaccines are being used to collect biometric data, a false theory that originated in the US and was previously linked to Bill Gates.
For governments keen to get as much as 80 per cent of their populations protected against the deadly virus, the resistance is challenging to say the least. As most countries in Southeast Asia struggle to contain outbreaks driven by more transmissible variants, with vaccine rollouts lagging because rich nations dominated supply, the last thing they need is for the public to doubt the safety of vaccines.
“Even in Singapore, which has largely contained the spread of the virus, the young and educated succumb to fake news,” Leong Hoe Nam, an infectious disease physician at Singapore’s Mount Elizabeth Novena Hospital, told Bloomberg.
But there’s another major reason for hesitancy—with wealthier Western nations getting the super-effective mRNA vaccines, poorer countries have to contend with limited supplies and fewer available brands. When a country offers only one vaccine, many people would rather wait until they can get a higher-efficacy shot. And that’s understandable: some vaccines have helped nations exit the pandemic faster than others.
Of course, educating vaccine recipients—and even medical practitioners—is the best tool for fighting hesitancy. Many also add that offering alluring prizes can help. And as we’ve seen in other countries, it works: China had a girl band, and the US offered free Krispy Kreme doughnuts and pre-rolled joints.
A district in northern Thailand started raffling off cows in mid-June as an incentive. In rural Indonesia, vaccinated residents got free chickens, while a city in the Philippines is giving away a house. But in the face of online misinformation, can a dozen cows, chickens and a single house win?
Conspiracy theories are one of the many downsides of the internet. From speculation about whether the US is hiding aliens in Area 51, to climate change deniers, to anti-vaxxer and 5G conspiracy theorist Alex Jones, the internet has shown us that anything can be questioned, argued and doubted. Our imagination has no limits. That’s why, with the COVID-19 pandemic still in full force, we’ve also seen the rise of coronavirus conspiracy theorists. What do they have to say about COVID-19, and how are they spreading their message?
In 2019, the US saw measles make a comeback after the disease had been declared eliminated in 2000. This was the work of anti-vaxxers—people opposed to vaccination—who lowered herd immunity and circulated false information about the side effects of vaccines. Fast-forward to 2020: with worldwide coronavirus deaths approaching 300,000, COVID-19 conspiracy theory videos are thriving, especially on YouTube.
In the past few years, YouTube has introduced new rules in an attempt to regulate false information and stop health misinformation. The platform’s ever-changing list of misinformation policies had already highlighted the need to monitor conspiracy theorists. Now, YouTube’s rules state that videos containing “medical misinformation” about coronavirus violate both advertiser guidelines and the community standards governing which content is allowed on the platform at all.
And yet, videos that question the transmission or even the existence of COVID-19, promote false cures or encourage viewers to ignore official guidance are flooding the platform. Although a simple search for ‘COVID-19 conspiracy theories’ will not surface many of these videos, the recommendation algorithm still suggests some of them. YouTube, along with many social media platforms and messaging apps, is still struggling to limit the spread of misinformation and popular conspiracy theories.
And it seems that conspiracy theorists are not only accumulating a worrying number of views on YouTube—a recent theory received more than 1 million views in one week before it was deleted—but they are also raking in huge profits. Although these videos are not technically supposed to carry any advertising, some still manage to cheat the algorithm. By combining paid advertising with the impressive view counts their videos receive, conspiracy theorists have found an easy way to profit from the coronavirus crisis.
But this is not the only way conspiracy theorists are using YouTube as a means to reach a wider audience. The platform advises any user to seek out new audiences in order to expand their reach and get more views. This includes collaborating on videos with other YouTubers who have their own audience. For example, Alex Jones repeatedly (and unsuccessfully) tried to make a ‘collab video’ with PewDiePie, one of the platform’s biggest creators, in order to reach his 104 million subscribers.
This highlights the danger conspiracy theorists represent. Anti-vaxxers and, more recently, COVID-19 conspiracy theorists are well aware of the loopholes that social media and other platforms present, and take advantage of them to promote their (false) message and spread misinformation. They’re everywhere, from YouTube and TikTok to Instagram and Twitter.
While YouTube is trying to ban creators who break the rules, some conspiracy theorists have been using collabs and interviews as a way to work around those regulations, getting other YouTubers to host them on their channels. Patrick Bet-David, creator of the YouTube channel Valuetainment, told the MIT Technology Review in ‘How covid-19 conspiracy theorists are exploiting YouTube culture’ that he had been approached by “fans asking him to interview David Icke, a conspiracy theorist whose own channel was recently removed from YouTube after he repeatedly violated the platform’s policies on COVID-19 misinformation.”
Bet-David decided to accept and interview Icke while challenging him on his views. The video received 800,000 views and resulted in Bet-David receiving more messages from YouTube users asking him to interview other conspiracy theorists, which he did. Although he told the MIT Technology Review that he is only responsible for what comes out of his mouth and would not take responsibility for helping spread false information through his videos, Bet-David is just another example of how conspiracy theorists can use the internet to expand their reach.
The latest COVID-19 conspiracy theory has taken off on social media after a short documentary titled Plandemic appeared online. As expected, the video racked up millions of views before Facebook and YouTube took it down. The theory claims that coronavirus was created in laboratories in order to eventually force everyone to get vaccinated. Anti-vaxxers are also certain that the human immune system could fight off COVID-19 if people avoided face masks and hand-washing, which, the film claims, “activate” the virus and help to spread it.
In this documentary, Bill Gates, who is also blamed by many 5G conspiracy theorists, is at the centre of the narrative. The Gates Foundation, and Gates himself, are considered the “architects of the Plandemic”. Although the movement started in the US, it has now spread worldwide: at a recent anti-lockdown protest in Melbourne, Australia, the crowd chanted “arrest Bill Gates.”
This spread of misinformation can sometimes lead to harassment or violence and push people to ignore life-saving public health guidelines. As with anti-vaxxers and measles last year, theories that try to refute the severity or existence of COVID-19 are putting everyone in danger and need to be stopped.
But who exactly should be held accountable? Is this something that only YouTube can regulate? When lives are put at risk, it seems urgent for governments to take a closer look.