How COVID-19 conspiracy theorists are making big bucks on Instagram – Screen Shot


Conspiracy theorists have reportedly found new ways to spread their false rumours—and make serious money while doing so. Instagram user and proud conspiracy theorist Steven Baker has, for a while now, been telling his followers that the coronavirus vaccine will change their DNA and possibly kill them. What does the anti-masker recommend they do instead? Spray themselves using a small bottle of “colloidal silver.” As it happens, his website conveniently sells it.

According to the Bureau of Investigative Journalism, Baker has been riding on the back of already popular conspiracy theories, such as those promoted by anti-vaxxers, to turn a profit. “Real immunity happens when you get an infection, not when you get a shot that changes your cellular makeup and creates what you call a spike protein, therefore forever changing your DNA and your cells and making you more likely to have an autoimmune disease… Listen, here’s the thing I couldn’t say a year ago: the vaccine kills people,” states the man in one of his many Instagram videos.

Alongside this video, which was posted last month, is a marketing pitch, a promo code for his website, which sells “health supplements,” and a disclaimer: “Dr Baker takes the same supplements that he is telling you about. Yes, Dr Baker does make money when you buy these supplements, this is not a charity. The money goes to help Dr Baker keep doing online videos to wake you up.”

Although most people might think they would never be swayed by a charlatan like Baker, many seem to have taken his claims at face value after an earlier post explaining where he gets all this (false) knowledge: “Dr Baker is a chiropractor. This means he knows more than medical doctors about helping people to actually heal. This video is not intended to give medical advice, treat, or diagnose.”

Right, next time I attend my monthly spine adjustment, I’ll make sure to interrogate my own chiropractor on why he didn’t feel the need to share his magical remedies with me.

Baker’s is one of more than 100 Instagram accounts identified by the Bureau of Investigative Journalism as using the platform to make money by spreading misinformation about COVID-19 and vaccines.

In total, the accounts reach almost 6 million people and promote a combination of false claims, some of which are potentially dangerous, as well as products, from health supplements to wellness courses and juicers.

Although Instagram (along with other major social media platforms such as Facebook and Twitter) has previously claimed it will enforce further restrictions and take action against health misinformation, Baker’s channels are consistently growing. “Over the first three months of this year, the accounts gained almost a million followers between them, according to data from Facebook-owned service CrowdTangle,” the Bureau of Investigative Journalism reports.

The publication’s investigation thus reveals that Facebook, which owns Instagram, remains in breach of a commitment made to the UK government last November, when it agreed that no one should profit from coronavirus vaccine misinformation online.

The Bureau previously found hundreds of pages on Facebook itself using monetisation tools to profit from false claims about COVID-19 and vaccines. “The Instagram accounts, many of which have received multiple flags from fact-checkers, are still posting two months after Facebook announced its latest tightening of rules.”

The accounts tracked by the Bureau include a group of Instagrammers who, in January, turned themselves into a formal network called the Health Freedom for Humanity (HFFH). The group’s founders, board and committee members are some of the loudest voices spreading COVID-19 misinformation online. Many are also making money out of it.

Alec Zeck, the group’s executive director and co-founder, has over 85,000 followers on Instagram. Besides promoting HFFH, his page points to his Linktree, which directs users to products for sale, such as bottles of a spray that claims to cleanse the body and brain of heavy metals for $95 a pop, whatever that means.

Meanwhile, Thomas Cowan, whom you might know for his false claim that COVID-19 is caused by 5G mobile signals, has directed visitors to several money-making ventures hosted on his own website, including “Marine Plasma Drinkable Sea Water” for $49.95 per bottle.

It should be noted that, while neither Facebook nor Instagram profits directly from these fraudulent schemes, the company’s business model relies on keeping audiences engaged—and you’ll rarely come across a more engaged audience than conspiracy theorists.

These same platforms, along with YouTube and forums like 4chan, have regularly been criticised because the systems they design to increase engagement draw users towards extremist or conspiracist content and views. We saw this happen on YouTube with the Frazzledrip conspiracy theory.

As a result, many still believe that those companies are doing “the absolute minimum they believe they can get away with,” as Lord Puttnam, who heads the Lords committee on democracy and digital technology, told the Bureau.

Why don’t they fix it, you might ask? The answer is not so simple, because the problem is baked into what makes these platforms so successful in the first place. To stop the rapid spread of misinformation, they would first have to alter their business model, and doing so would make them less profitable.

Why conspiracy theories are so addictive, explained by professor of psychiatry Joe Pierre, MD

It’s been said that QAnon is the mother of all conspiracy theories. Since its modest beginnings with Pizzagate in 2016, QAnon has grown from a fringe movement to a seemingly all-inclusive convergence of multiple right-wing conspiracy theories related to COVID-19, vaccines, 5G networks, Bill Gates, Donald Trump, and a “Deep State” cabal that harvests adrenochrome from children. Some have gone so far as to claim that QAnon has become a new American religion.

By the end of 2020, an NPR/Ipsos poll found broad support for some of the beliefs falling under the wide QAnon umbrella. Nearly 40 per cent of Americans endorsed the belief that the “Deep State” was working to undermine President Trump, and as many as 1 in 3 believed that voter fraud helped Joe Biden win the presidential election. The claim that “a group of Satan-worshipping elites who run a child sex ring are trying to control our politics and media” was rated as false by a mere 47 per cent of respondents. Yet ironically, 83 per cent voiced concern about the spread of false information.

Why do some people believe in conspiracy theories?

How can we explain why so many people believe in things not supported by empirical evidence? How did people come to believe in Pizzagate, the conspiracy theory that Hillary Clinton was running a child pornography ring out of the basement of a pizzeria that in fact had no basement? How did people believe something as obscure and lacking in evidence as “Frazzledrip,” another conspiracy theory claiming that Clinton was filmed tearing off the face of a child, wearing it as a mask, and then drinking the child’s blood?

Although a common temptation is to invoke mental illness or delusional thinking as an explanation, such accounts mostly fall flat. For one thing, surveys have consistently shown that about half of the population believes in at least one conspiracy theory. So conspiracy theory beliefs, like religious beliefs, are normal, unless we want to start proposing that half the population has a psychotic disorder.

Although psychology research has found some support for a kind of subclinical paranoia being related to conspiracy theory belief, most findings indicate that it is a matter of more quantitative differences in certain subtle ‘cognitive quirks’ that we all tend to have to some degree. For example, greater needs for closure, certainty, and control could explain why some tend to embrace conspiracy theories during times of crisis and chaos.

Lack of analytical thinking

Psychological needs for uniqueness can explain why belief in conspiracy theories is often rewarding, based on believers’ fantasy of “seeing the light” in a way that “uncritical sheep” do not. Lack of analytical thinking and ‘bullshit receptivity’ have also been implicated in conspiracy theory belief, but these are widespread liabilities of normal human brains and results of poor education, not deficits of psychopathology per se.

Simple mistrust of others and authoritative sources of information

A more convincing account of QAnon conspiracy theories like Pizzagate and Frazzledrip lies at the level of social interaction and information science as opposed to psychiatric disorder. Conspiracy theories involve a negation of conventional explanations in favour of an alternative account featuring shadowy forces with malevolent intent. Therefore, mistrust—not clinical paranoia—is often the central feature of conspiracy theory belief.

QAnon conspiracy theories that, at their core, demonise left-wing liberals and “globalists” were born out of an emerging right-wing American populism and have come to be widely adopted within the Republican party. Appearing around the time of Clinton’s campaign against Donald Trump during the 2016 presidential race, conspiracy theories like Pizzagate and Frazzledrip that maligned her were embraced because many viewed her as a literal enemy, just as the so-called “birthers” had endorsed conspiracy theories about Barack Obama’s citizenship eight years before. While mistrust can be earned, when it underlies conspiracy theory belief, it’s often fuelled by prejudices of “othering” in the form of hyper-partisanship, racism, or misogyny.

Mistrust of others, and especially of authoritative sources of information, results in a vulnerability to misinformation that is widely available within today’s media landscape. When Edgar Maddison Welch decided to “self-investigate” Pizzagate armed with a rifle, he was misinformed by sources like Reddit and InfoWars that had presented the supposed “evidence” of Clinton’s guilt. Only later, when Welch was arrested, did he concede that “the intel on this wasn’t 100%.” Likewise, the purported “evidence” for Frazzledrip was widely claimed within YouTube videos back in 2018, despite the fact that no actual video of Clinton cutting off a child’s face ever existed.

The democratisation of knowledge in the modern era of online media has resulted in a ‘post-truth’ world in which ‘fake news’ has become a household word, but no one can agree on which information sources are to be trusted. Consequently, trust has largely become a matter of confirmation bias and motivated reasoning aligned with partisan identities, with little agreement about what constitutes objective evidence anymore.

Human beings are fond of myths that revere our heroes and demonise our enemies. We’re also fond of the self-deception that we think rationally and are always right. The reality is that we come to hold beliefs based on intuition, subjective experience, and faith in trusted sources of transmitted information, independent of veracity. It’s no coincidence that Pizzagate, Frazzledrip, and QAnon have been largely online phenomena that have flourished in an era of hyper-partisanship where the other side is regarded as a mortal enemy and existential threat. So long as that continues, conspiracy theories will continue to flourish.

Joseph Pierre, MD is a Health Sciences Clinical Professor in the Department of Psychiatry and Biobehavioural Sciences, David Geffen School of Medicine at UCLA and the Acting Chief of Mental Health Community Care Systems at the VA Greater Los Angeles Healthcare System. His column titled Psych Unseen and published in Psychology Today draws from the perspectives of psychiatry, neuroscience, psychology, and evidence-based medicine to address timely topics related to mental illness, human behaviour, and how we come to hold popular and not-so-popular beliefs.