It’s been said that QAnon is the mother of all conspiracy theories. Since its roots in Pizzagate in 2016, QAnon has grown from a fringe movement into a seemingly all-inclusive convergence of multiple right-wing conspiracy theories related to COVID-19, vaccines, 5G networks, Bill Gates, Donald Trump, and a “Deep State” cabal that harvests adrenochrome from children. Some have gone so far as to claim that QAnon has become a new American religion.
By the end of 2020, an NPR/Ipsos poll found broad support for some of the beliefs falling under the wide QAnon umbrella. Nearly 40 per cent of Americans endorsed the belief that the “Deep State” was working to undermine President Trump, and as many as 1 in 3 believed that voter fraud helped Joe Biden win the presidential election. The claim that “a group of Satan-worshipping elites who run a child sex ring are trying to control our politics and media” was rated as false by a mere 47 per cent of respondents. Ironically, 83 per cent voiced concern about the spread of false information.
How can we explain why so many people believe in things not supported by empirical evidence? How did people come to believe in Pizzagate, the conspiracy theory that Hillary Clinton was running a child pornography ring out of the basement of a pizzeria that in fact had no basement? How did people believe something as obscure and lacking in evidence as “Frazzledrip,” another conspiracy theory claiming that Clinton was filmed tearing off the face of a child, wearing it as a mask, and then drinking the child’s blood?
Although a common temptation is to invoke mental illness or delusional thinking as an explanation, such accounts mostly fall flat. For one thing, surveys have consistently shown that about half of the population believes in at least one conspiracy theory. So conspiracy theory beliefs, like religious beliefs, are normal, unless we want to start proposing that half the population has a psychotic disorder.
Although psychology research has found some support for a kind of subclinical paranoia being related to conspiracy theory belief, most findings indicate that it is a matter of quantitative differences in certain subtle ‘cognitive quirks’ that we all have to some degree. For example, greater needs for closure, certainty, and control could explain why some people tend to embrace conspiracy theories during times of crisis and chaos.
Psychological needs for uniqueness can explain how belief in conspiracy theories is often rewarding, based on believers’ fantasy of “seeing the light” in a way that “uncritical sheep” do not. Lack of analytical thinking and ‘bullshit receptivity’ have also been implicated in conspiracy theory belief, but these are widespread liabilities of normal human brains and consequences of poor education, not deficits of psychopathology per se.
A more convincing account of QAnon conspiracy theories like Pizzagate and Frazzledrip lies at the level of social interaction and information science as opposed to psychiatric disorder. Conspiracy theories involve a negation of conventional explanations in favour of an alternative account featuring shadowy forces with malevolent intent. Therefore, mistrust—not clinical paranoia—is often the central feature of conspiracy theory belief.
QAnon conspiracy theories that, at their core, demonise left-wing liberals and “globalists” were born out of an emerging right-wing American populism and have come to be widely adopted within the Republican party. Appearing around the time of Clinton’s campaign against Donald Trump during the 2016 presidential race, conspiracy theories like Pizzagate and Frazzledrip that maligned her were embraced because many viewed her as a literal enemy, just as the so-called “birthers” had endorsed conspiracy theories about Barack Obama’s citizenship eight years before. While mistrust can be earned, when it underlies conspiracy theory belief it is often fuelled by prejudices of “othering” in the form of hyper-partisanship, racism, or misogyny.
Mistrust of others, and especially of authoritative sources of information, results in a vulnerability to misinformation that is widely available within today’s media landscape. When Edgar Maddison Welch decided to “self-investigate” Pizzagate armed with a rifle, he was misinformed by sources like Reddit and InfoWars that had presented the supposed “evidence” of Clinton’s guilt. Only later, when Welch was arrested, did he concede that “the intel on this wasn’t 100%.” Likewise, the purported “evidence” for Frazzledrip was widely claimed within YouTube videos back in 2018, despite the fact that no actual video of Clinton cutting off a child’s face ever existed.
The democratisation of knowledge in the modern era of online media has resulted in a ‘post-truth’ world in which ‘fake news’ has become a household word, but no one can agree on which information sources are to be trusted. Consequently, trust has largely become a matter of confirmation bias and motivated reasoning aligned with partisan identities, with little agreement about what constitutes objective evidence anymore.
Human beings are fond of myths that revere our heroes and demonise our enemies. We’re also fond of the self-deception that we think rationally and are always right. The reality is that we come to hold beliefs based on intuition, subjective experience, and faith in trusted sources of transmitted information, independent of veracity. It’s no coincidence that Pizzagate, Frazzledrip, and QAnon have been largely online phenomena that have flourished in an era of hyper-partisanship where the other side is regarded as a mortal enemy and existential threat. So long as that continues, conspiracy theories will continue to flourish.
Joseph Pierre, MD is a Health Sciences Clinical Professor in the Department of Psychiatry and Biobehavioural Sciences, David Geffen School of Medicine at UCLA and the Acting Chief of Mental Health Community Care Systems at the VA Greater Los Angeles Healthcare System. His column titled Psych Unseen and published in Psychology Today draws from the perspectives of psychiatry, neuroscience, psychology, and evidence-based medicine to address timely topics related to mental illness, human behaviour, and how we come to hold popular and not-so-popular beliefs.
YouTube is the gateway to almost any video content you can think of. From compilations of otters holding hands to DIY documentaries speculating on whether Michael Jackson is really dead, there are very few topics you won’t be able to explore on the video-sharing platform. And while such access has undoubtedly made everyone’s life easier, the rise of YouTube has also helped conspiracy theories spread rapidly. Among the most famous is Frazzledrip, a conspiracy theory that became so big it even got Sundar Pichai, Google’s CEO, in trouble. What is the Frazzledrip conspiracy theory about, and how did it take over YouTube?
The conspiracy theorists behind Frazzledrip believe that Hillary Clinton and former Clinton aide Huma Abedin were filmed ripping off a child’s face and wearing it as a mask before drinking the child’s blood in a Satanic ritual sacrifice. Supposedly, the Hillary Clinton video was later found on the hard drive of Abedin’s former husband, Anthony Weiner, under the code name ‘Frazzledrip’.
Frazzledrip is an offshoot of two other popular conspiracy theories: Pizzagate and QAnon. QAnon followers believe that a group of Satan-worshipping Democrats, Hollywood celebrities and billionaires run the world while engaging in paedophilia, human trafficking and the harvesting of a supposedly life-extending chemical from the blood of abused children. Of course, this satanic clique includes Hillary Clinton.
QAnon builds its beliefs on previously established conspiracy theories, some new and some a millennium old. One of them is Pizzagate, the conspiracy theory that went viral during the 2016 US presidential campaign, when right-wing news outlets and influencers promoted the idea that references to food and a pizza restaurant located in Washington DC in the stolen emails of Clinton’s campaign chairman John Podesta were actually a secret code for a child trafficking ring.
“The #Hillgramage 2018”. I’m shaking tonight with this drop. This would be an appropriate outcome. From #Epstein to #CometPingPong and everything in between, We’ve been waiting for this. We’re coming @HillaryClinton, I’m sharpening my pitchfork right now. #FRAZZLEDRIP #Pizzagate https://t.co/NFbCc4AZTt
— Hi, I’m Mike (@ImMikeRobertson) April 15, 2018
While many politicians have had questionable pasts, especially when it comes to paedophilia, it should be noted that the three conspiracy theories mentioned above are completely unfounded. There isn’t a video out there that depicts Hillary Clinton ripping off a child’s face or drinking blood, simply because no such thing ever happened. Yet multiple conspiracy theories of the Trump era paint Hillary Clinton as a secret paedophile and murderer. And, up until the end of 2018, Frazzledrip videos had taken over YouTube completely. So what happened?
More than one billion hours’ worth of content is viewed on YouTube every single day. About 70 per cent of those views come from YouTube’s recommendations, according to Algotransparency, a website that attempts to track “what videos YouTube’s recommendation algorithm most often recommends.”
If you had typed “Frazzledrip” into YouTube’s search bar at the beginning of 2018, you would have found thousands of videos about Hillary Clinton’s alleged murders and child trafficking. Users from Gab.ai, 4chan and 8chan had also flocked to YouTube to share their views, linking to YouTube more than to any other website, thousands of times a day, according to research from Data and Society as well as from the Network Contagion Research Institute, both of which track the spread of hate speech.
Now, while finding videos about this specific conspiracy theory remains possible if you’re willing to put some effort into it, it has become harder on the platform. That’s because, on 11 December 2018, while Google’s CEO Sundar Pichai was testifying before lawmakers in Washington about other matters, one lawmaker asked him how YouTube’s algorithms could be used to push conspiracy theories and highlighted how urgently the problem needed to be addressed.
Congressman Jamie Raskin said, “The point at which it becomes a matter of serious public interest is when your communication vehicle is being used to promote propaganda that leads to violent events.”
Raskin added, “Is your basic position that [Frazzledrip] is something you want to try to do something about, but basically there is just an avalanche of such material and there’s really nothing that can be done, and it should be buyer beware or consumer beware when you go on YouTube?”
In other words, Raskin shed light on the fact that YouTube, which Google purchased for $1.65 billion in 2006, had a major conspiracy theory problem. Technically it still does today, just less so with Frazzledrip. And at the time, it looked like neither Congress nor YouTube was anywhere near solving it.
YouTube’s content algorithms determine which videos show up in your search results, in the suggested videos stream, on the homepage, in the trending stream, and under your subscriptions. When you go to the platform’s homepage, algorithms dictate which videos you see and which ones you don’t. Same applies when you search for something.
As common as this sounds in today’s digital world, YouTube’s algorithms had an extremism problem. Whether you were previously watching a right-leaning, left-leaning or even non-political video, YouTube’s algorithm would recommend increasingly extreme videos in order to keep users’ attention and push them to watch as many videos as possible.
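To make that dynamic concrete, here is a deliberately oversimplified sketch in Python. It is not YouTube’s actual system, which is proprietary and vastly more complex; every name and number in it (the `predicted_watch_time` model, the ‘sensationalism’ scores, the ‘appetite’ signal) is invented for illustration. The point is only that a ranker which greedily maximises expected watch time, fed by a signal that grows as the user consumes sensational content, drifts step by step toward the most extreme item in the pool.

```python
# Minimal sketch of a greedy, watch-time-maximising recommender.
# NOT YouTube's real system: the data and the scoring model are
# invented purely to illustrate the feedback loop described above.

videos = [
    # (title, sensationalism score 0-1, baseline minutes watched)
    ("Local news recap",        0.1, 5.0),
    ("Opinionated commentary",  0.5, 4.0),
    ("Shocking 'hidden truth'", 0.9, 3.0),
]

def predicted_watch_time(video, appetite):
    """Toy engagement model: sensational videos hold attention longer,
    and the effect grows with how much similar content the user has seen."""
    _, sensationalism, base_minutes = video
    return base_minutes * (1 + sensationalism * appetite)

def recommend(videos, appetite):
    # Greedy ranking: surface whatever maximises expected watch time.
    return max(videos, key=lambda v: predicted_watch_time(v, appetite))

appetite = 0.0  # a fresh user who has watched nothing extreme yet
for step in range(12):
    title, sensationalism, _ = recommend(videos, appetite)
    print(f"step {step:2d}: recommend {title!r}")
    # Watching sensational content raises the appetite signal,
    # which in turn makes the next recommendation more extreme.
    appetite += sensationalism
```

Run this toy loop and the recommendations shift from the news recap to the commentary and finally settle on the ‘hidden truth’ video, with nothing in the objective ever asking whether any of it is true.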
As Zeynep Tufekci, an associate professor at the School of Information and Library Science at the University of North Carolina, wrote in The New York Times in March 2018, YouTube’s advertising model is based on you watching as many videos as it can show you (and the ads that appear before and during those videos).
Because of this algorithm, people who watched videos of Hillary Clinton and Bernie Sanders ended up being recommended videos featuring “arguments about the existence of secret government agencies and allegations that the United States government was behind the attacks of Sept. 11.”
if this is recommended to me, imagine what goes to people who actually watch this sort of content pic.twitter.com/tPcOsdbM0r
— John Ganz (@lionel_trolling) December 11, 2018
Back to 2020: while the problem has diminished slightly, it remains present. It isn’t easy to balance a platform that claims to stand for freedom of expression with societal responsibility. It’s not illegal to believe in conspiracy theories, to think that Michael Jackson didn’t die (he did), or that Hillary Clinton is a child-eating paedophilic cannibal (she’s not).
YouTube previously said in a statement, “False information is not necessarily violative, unless it crosses the line into hate speech, harassment, inciting violence or scams. We’ve developed robust Community Guidelines, and enforce these policies effectively.” Yes, YouTube’s algorithms have been tweaked slightly, but changing the way they work altogether would be bad for business.
While explaining that YouTube handles problematic videos on a case-by-case basis, Pichai said, “Freedom of speech is at the foundation of YouTube. As such, we have a strong bias toward allowing content on the platform even when people express controversial or offensive beliefs. That said, it’s not anything goes on YouTube.”
Yet today, Frazzledrip and Pizzagate on YouTube have simply been replaced by QAnon. Even though YouTube removes millions of videos on average each month, it is slow to identify troubling content and, when it does, is too permissive in what it allows to remain. We’ve seen the likes of 4chan, 8chan, 8kun and Voat make it to the top only to quickly crash and burn. What’s next after YouTube? Should we be worried?
Until we get an answer to this precise question, there is already something else we should all be worried about. The Pizzagate shooter had reportedly watched a YouTube video about the conspiracy days before heading to Washington from his home in North Carolina, telling a friend that he was “raiding a pedo ring… The world is too afraid to act and I’m too stubborn not to.” Let’s not forget that US citizens have guns, and most of them are not scared to use them, whether for the right reasons or not.