
Meet Patrick Cage, the man who makes money betting against QAnon followers

What is QAnon?

QAnon has probably become the most followed conspiracy theory out there. Its followers strongly believe that Democrats, government officials and celebrities are part of a cannibalistic, child-sex-trafficking cult, and that Donald Trump is the hero destined to stop them. Like many other ‘successful’ conspiracy theories, QAnon is built on, and therefore linked to, older theories such as Pizzagate, Frazzledrip and the long-running mythology surrounding the drug Adrenochrome.

As a result of its worrying influence and following, QAnon has allegedly resulted in kidnappings, car chases, attempted shootings and a murder. Some of you might remember when, in December 2016, Edgar Maddison Welch, a deeply religious father, travelled from his home in the small town of Salisbury, North Carolina, to a pizza restaurant in Northwest Washington, D.C., armed with three loaded guns: a 9-mm AR-15 rifle, a six-shot .38-caliber Colt revolver, and a shotgun.

Welch walked through the front door of the pizzeria called Comet Ping Pong expecting to discover children there in need of rescue. I know what you’re thinking right now; this sounds completely crazy. Where did he get this nonsense from? Welch had followed the conspiracy theory now famously known as Pizzagate, which claimed that Hillary Clinton was running a child sex ring out of Comet Ping Pong.

The idea originated in October 2016, when WikiLeaks made public a trove of emails stolen from the account of John Podesta, a former White House chief of staff and then the chair of Clinton’s presidential campaign; Comet was mentioned repeatedly in exchanges Podesta had with the restaurant’s owner, James Alefantis, and others. The emails were mainly about fundraising events, but high-profile pro-Trump figures such as Mike Cernovich and Alex Jones began advancing the claim—which originated in fake news corners of the internet such as 4chan and 8chan and then spread to more accessible platforms like Twitter and YouTube—that the emails were proof of ritualistic child abuse.

Some conspiracy theorists asserted that the abuse was taking place in Comet Ping Pong’s basement, even though the restaurant has never had one. References in the emails to ‘pizza’ and ‘pasta’ were interpreted as code words for ‘girls’ and ‘little boys’. Like many other internet users in the US and around the world, Welch had been binge-watching conspiracy theory videos on YouTube, which quickly eroded whatever common sense he had left.

“The intel on this wasn’t 100 percent,” Welch told The New York Times after his arrest. He sincerely believed that children were being held against their will at the pizzeria, and so he took action. This story alone should give you an idea of the impressive and worrying hold that QAnon (and the conspiracy theories linked to it) has on people. But what has 28-year-old Patrick Cage got to do with it?

Who is Patrick Cage?

According to The Atlantic, in 2018, Cage, a Californian who works in international environmental policy, discovered a gambling platform called PredictIt. It is an unusual betting site since its users don’t wager on card games or horse racing. Instead, they make predictions about politics. “People put money on questions like ‘Will Kanye run in 2020?’ and ‘How many times will Trump tweet this week?’,” writes The Atlantic. Its tag line: “Let’s Play Politics.”

Because Cage had been following politics closely since the 2016 US election, he thought PredictIt would be a good way to test his newfound political knowledge. So he bet that Kanye West would run for president. “That’s probably my proudest moment,” Cage told The Atlantic. “I dumped 20 cents in November 2018 and netted a dollar off that investment.”
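For readers new to prediction markets, the arithmetic behind a bet like this is simple: shares on PredictIt trade somewhere between $0.01 and $0.99 and pay out $1 each if the prediction comes true. Here is a minimal sketch of that payoff, in Python, using Cage’s reported 20-cent stake; the single-share assumption and the fee-free maths are simplifications for illustration, not details from the article.

# Minimal sketch of how a PredictIt-style contract pays out (illustrative only;
# real trades also carry fees, which are ignored here).

def contract_profit(price_per_share: float, shares: int, event_happened: bool) -> float:
    """Net profit on a 'Yes' position: shares cost their market price
    (between $0.01 and $0.99) and pay $1.00 each if the event occurs."""
    cost = price_per_share * shares
    payout = shares * 1.00 if event_happened else 0.0
    return payout - cost

# Cage's reported Kanye bet: roughly 20 cents staked in November 2018.
# The exact share count isn't public, so a single share is assumed here.
print(contract_profit(0.20, 1, event_happened=True))  # 0.8, i.e. an 80-cent gain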

After this win, Cage quickly realised that most of the time, his predictions weren’t much better than if he had chosen at random. But he kept on betting, and shortly after that, he noticed bets whose odds seemed completely off. According to him, in the spring of 2019, for example, the PredictIt market gave former FBI Director James Comey a 1-in-4 chance of being indicted in the next six months. Cage had never heard anything about a Comey indictment on the news.

He checked trustworthy sources as well as fake news websites such as Breitbart, but he still couldn’t figure out where this bet was coming from. So he bet that Comey would not be indicted, and won. Cage kept seeing these strange bets afterwards: would a federal charge against Hillary Clinton come by a certain date? What about one against Barack Obama? Unable to find the source of these predictions, Cage ended up in PredictIt’s comments section.

There, he read thousands of comments that made little sense to him at first but explained why such long-shot bets carried such surprisingly high odds. That’s how, years before most people had heard of QAnon, Cage learned that Q is an anonymous figure who claims to have a high-level security clearance and access to inside information about a devil-worshipping deep state.

The bets finally made sense, and Cage decided to delve into the world of QAnon followers in order to pick the right bets and make some serious money, easily. Believers were so convinced they were right that they were willing to “put hundreds or thousands on the line,” he said. “So I started shovelling more and more money in.”

He began scanning the betting platform for any market that looked far-fetched or carried abnormal odds, and bet against it every time he couldn’t find any supporting information from news sources. Cage even started keeping track of QAnon YouTube channels and other forums in order to predict where to bet against next.
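To make that screening process concrete, here is a hedged sketch, in Python, of the kind of filter Cage describes: flag markets that price an implausible, news-free event well above zero, then work out the return on a ‘No’ position. Only the Comey figure (a 1-in-4 implied chance) comes from the article; the other market names, probabilities and the 10 per cent cut-off are invented for illustration, and this is not a real PredictIt client.

# Illustrative screen in the spirit of Cage's strategy (not a real PredictIt client).
# A 'Yes' share's dollar price doubles as the market's implied probability.

markets = [
    # (question, implied 'Yes' probability, credible news coverage found?)
    ("Will Comey be indicted within six months?", 0.25, False),         # figure from the article
    ("Will Clinton be federally charged by a set date?", 0.20, False),  # invented example
    ("Will there be a Supreme Court vacancy this term?", 0.30, True),   # invented example
]

for question, p_yes, covered in markets:
    # Cage's filter, roughly: a non-trivial 'Yes' price with no news trail behind it.
    if p_yes >= 0.10 and not covered:
        no_price = 1.00 - p_yes  # cost of one 'No' share
        profit_if_no = p_yes     # $1 payout minus the 'No' share's cost
        print(f"{question} Buy 'No' at ${no_price:.2f} for a "
              f"{profit_if_no / no_price:.0%} return if the event never happens.")

On the Comey market, for instance, ‘No’ shares would cost $0.75 and return about 33 per cent once the indictment never materialised, which is roughly the trade Cage made.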

According to The Atlantic, “Cage has made money every time QAnon has been wrong—which they have been on every bet he’s made so far. He’s put about $800 in and made around $400 in profits.”

Although Cage started betting against QAnon followers back in 2018, this little hobby of his could well remain sustainable and profitable for the foreseeable future. After all, QAnon believers, like other conspiracy theorists, swap absurd ideas back and forth constantly. From anti-vaxxers and neo-Nazis to people who believe the moon landing was faked, or that Michael Jackson’s death was, conspiracy theories have now reached a point of no return.

No wonder Trump’s embrace of bizarre conspiracy theories about voter fraud has had similar effects: people are now almost conditioned to believe fake news and conspiracy theories. QAnon followers were convinced that Trump would win the election and start making mass arrests of Democrats. Cage probably bet against this idea too, and I’m sure he made more money. Meanwhile, other people lost real money for believing fake stories. You win some, you lose some.

What is Frazzledrip? Everything you need to know about the fake conspiracy theory

YouTube is the gateway to almost any video content you can think of. From compilations of otters holding hands to outdated DIY documentaries speculating on whether Michael Jackson is dead or not, there are very few topics you won’t be able to explore on the video-sharing platform. And while gaining such access has undoubtedly made everyone’s life easier, the rise of YouTube has also helped conspiracy theories spread rapidly. Among the most famous of those is Frazzledrip, one that became so big it even got Sundar Pichai, Google’s CEO, in trouble. What is the Frazzledrip conspiracy theory about, and how did it take over YouTube?

What is Frazzledrip?

The conspiracy theorists behind Frazzledrip believe that Hillary Clinton and former Clinton aide Huma Abedin were filmed ripping off a child’s face and wearing it as a mask before drinking the child’s blood in a Satanic ritual sacrifice. Supposedly, the Hillary Clinton video was later found on the hard drive of Abedin’s former husband, Anthony Weiner, under the code name ‘Frazzledrip’.

Frazzledrip is an offshoot of two other popular conspiracy theories: Pizzagate and QAnon. QAnon followers believe that a group of Satan-worshipping Democrats, Hollywood celebrities and billionaires run the world while engaging in paedophilia, human trafficking and the harvesting of a supposedly life-extending chemical from the blood of abused children. Of course, this satanic clique includes Hillary Clinton.

QAnon actually bases its beliefs on previously established conspiracy theories, some recent and some nearly a millennium old. One of them is Pizzagate, the conspiracy theory that went viral during the 2016 US presidential campaign, when right-wing news outlets and influencers promoted the idea that references to food and a pizza restaurant located in Washington, D.C. in the stolen emails of Clinton’s campaign chairman John Podesta were actually a secret code for a child trafficking ring.

While some politicians have had questionable pasts, it should be noted that the three conspiracy theories mentioned above are completely unfounded. There is no video out there depicting Hillary Clinton ripping off a child’s face or drinking blood, simply because no such thing ever happened. Yet multiple conspiracy theories of the Trump era rest on the claim that Hillary Clinton is a secret paedophile and murderer. And up until the end of 2018, Frazzledrip videos had taken over YouTube completely. So what happened?

Google’s CEO was questioned on Pizzagate and Frazzledrip by the House Judiciary Committee

More than one billion hours’ worth of content is viewed on YouTube every single day. About 70 per cent of those views come from YouTube’s recommendations, according to AlgoTransparency, a website that attempts to track “what videos YouTube’s recommendation algorithm most often recommends.”

If you had typed “Frazzledrip” into YouTube’s search bar at the beginning of 2018, you would have found thousands of videos about Hillary Clinton’s alleged murder and child trafficking. Users from Gab.ai, 4chan and 8chan had also flocked to YouTube to share their views, linking to YouTube more than to any other website, thousands of times a day, according to research from Data & Society as well as from the Network Contagion Research Institute, both of which track the spread of hate speech.

Now, while finding videos about this specific conspiracy theory remains possible if you’re willing to put some effort into it, it has become much harder on the platform. That’s because on 11 December 2018, while Google’s CEO Sundar Pichai was testifying before lawmakers in Washington on a range of matters, one lawmaker asked him about the way YouTube’s algorithms could be used to push conspiracy theories and highlighted how urgent it was for this problem to be regulated.

Congressman Jamie Raskin said, “The point at which it becomes a matter of serious public interest is when your communication vehicle is being used to promote propaganda that leads to violent events.”

Raskin added, “Is your basic position that [Frazzledrip] is something you want to try to do something about, but basically there is just an avalanche of such material and there’s really nothing that can be done, and it should be buyer beware or consumer beware when you go on YouTube?”

In other words, Raskin shed light on the fact that YouTube, which Google purchased for $1.65 billion in 2006, had a major conspiracy theory problem—and technically still does today, only not as much with Frazzledrip. And at the time, it looked like neither Congress nor YouTube were anywhere near solving it.

YouTube’s algorithm had an extremism problem

YouTube’s content algorithms determine which videos show up in your search results, in the suggested videos stream, on the homepage, in the trending stream, and under your subscriptions. When you go to the platform’s homepage, algorithms dictate which videos you see and which ones you don’t. The same applies when you search for something.

As common as this sounds in today’s digital world, YouTube’s algorithms had an extremism problem. Whether you had previously been watching right-leaning, left-leaning or even non-political videos, YouTube’s algorithm would recommend increasingly extreme videos in order to keep your attention and push you to watch as many videos as possible.
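That dynamic is easy to caricature in code. The toy below is a deliberately simplified sketch, in Python, of an engagement-maximising recommender: it always surfaces the candidate video with the highest predicted watch time, and because the toy assumes more extreme videos hold attention slightly longer, the recommendations drift toward the extreme. The numbers and the one-dimensional ‘extremeness’ score are invented to illustrate the feedback loop; none of this is YouTube’s actual code.

import random

# Toy model of an engagement-maximising recommender (invented numbers and logic,
# not YouTube's actual system). Each video is reduced to an 'extremeness' score in [0, 1].

def predicted_watch_time(extremeness: float) -> float:
    """Assumption baked into the toy: more extreme videos hold attention a
    little longer, plus some noise."""
    return extremeness + random.uniform(-0.05, 0.05)

def recommend_next(current: float, n_candidates: int = 10) -> float:
    """Greedy engagement objective: among random candidates similar to the
    current video, surface the one with the highest predicted watch time."""
    candidates = [min(1.0, max(0.0, current + random.uniform(-0.15, 0.15)))
                  for _ in range(n_candidates)]
    return max(candidates, key=predicted_watch_time)

random.seed(0)
video = 0.1  # start on a fairly mild video
for _ in range(10):
    video = recommend_next(video)
print(f"extremeness after 10 recommendations: {video:.2f}")  # drifts sharply upward

Even though the candidates are drawn symmetrically around the current video, picking whichever one promises the most watch time is enough to push the viewer steadily toward the extreme end of the catalogue.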

As Zeynep Tufekci, an associate professor at the School of Information and Library Science at the University of North Carolina, wrote in The New York Times in March 2018, YouTube’s advertising model is based on you watching as many videos as it can show you (along with the ads that appear before and during those videos).

Because of this algorithm, people who watched videos of Hillary Clinton and Bernie Sanders ended up being recommended videos featuring “arguments about the existence of secret government agencies and allegations that the United States government was behind the attacks of Sept. 11.”

Fast-forward to 2020: while the problem has diminished slightly, it remains present. It isn’t easy to balance a platform that claims to stand for freedom of expression with societal responsibility. It’s not illegal to believe in conspiracy theories, or to think that Michael Jackson didn’t die (he did) or that Hillary Clinton is a child-eating paedophilic cannibal (she’s not).

YouTube previously said in a statement, “False information is not necessarily violative, unless it crosses the line into hate speech, harassment, inciting violence or scams. We’ve developed robust Community Guidelines, and enforce these policies effectively.” Yes, YouTube’s algorithms have been tweaked slightly but changing the way they work altogether would be bad for business.

While explaining that YouTube handles problematic videos on a case-by-case basis, Pichai said, “Freedom of speech is at the foundation of YouTube. As such, we have a strong bias toward allowing content on the platform even when people express controversial or offensive beliefs. That said, it’s not anything goes on YouTube.”

Yet today, Frazzledrip and Pizzagate on YouTube have simply been replaced by QAnon. Even though YouTube removes millions of videos on average each month, it is slow to identify troubling content and, when it does, is too permissive in what it allows to remain. We’ve seen the likes of 4chan, 8chan, 8kun and Voat make it to the top only to quickly crash and burn. What’s next after YouTube? Should we be worried?

Until we get an answer to this precise question, there is already something else we should all be worried about. The Pizzagate shooter had reportedly watched a YouTube video about the conspiracy days before heading to Washington from his home in North Carolina, telling a friend that he was “raiding a pedo ring… The world is too afraid to act and I’m too stubborn not to.” Let’s not forget that US citizens have guns, and most of them are not scared to use them, whether for the right reason or not.