It all started on 4chan, an anonymous English-language imageboard website created in 2003 by Christopher Poole, who served as the site’s head administrator for more than 11 years before stepping down. At the time, neither Reddit nor Voat existed yet.
This led 4chan to host a wide variety of ‘boards’ on all sorts of topics. From anime and manga to video games, music, literature, fitness, politics, and sports, 4chan offered boards focused on anything and everything, and people loved it!
The platform was conceived as an unofficial counterpart to the Japanese imageboard Futaba Channel, also known as 2chan, which is why its first boards were dedicated to posting images and discussion related to anime.
Before 4chan became the subject of media attention as a source of controversy, it was often described as the hub of internet culture. Its community was credited with an important role in the formation of prominent memes, such as lolcats, Rickrolling, and rage comics, as well as hacktivist movements such as Anonymous. But then, questionable activities started taking place.
First, the site was blamed for leaking the stolen nude photographs of dozens of female celebrities, including Jennifer Lawrence and Rihanna, in an incident that later became known as The Fappening. After that, 4chan users (who always remain anonymous, since registration is not possible on the site) invented Ebola-Chan, a ‘mascot’ for the Ebola virus, and encouraged iPhone owners to microwave their phones.
The website had around 22 million monthly users in 2014. It still exists today, but many users have since moved on to other platforms like Voat or Reddit. But what made 4chan so special in the first place?
For one thing, users never need to make an account or pick a username. That means that they can say and do virtually anything they want with only the most remote threat of accountability. It also means that they can’t message other users or establish any kind of social relationship with them, unless they reveal their identity in some way. For a social network, that’s a pretty strange way to work.
On top of that, 4chan threads expire after a certain amount of time (R-rated boards, for example, expire faster than G- or PG-rated ones), which means that users rarely see the exact same thing twice. Few posts last more than a few days before they’re deleted from 4chan’s servers. Posts are ‘organised’ reverse-chronologically, and the site’s interface is deliberately minimalist, which can make it difficult for newcomers to find their bearings.
In other words, 4chan is a forum with no names, very few rules, and even fewer consequences. No wonder things went south. In late 2010, a 4chan user conducted a survey of other site users and found that most respondents didn’t discuss the site offline and wouldn’t let their kids join it.
In response, the site’s founder Poole urged readers to take the survey with “a massive grain of salt.” Since the website thrives on anonymity, he pointed out, there’s ultimately no way to know who uses it with any certainty. Whoever its users are, though, the controversies they went on to create are well documented:
1. Celebgate, also known as The Fappening: the leak of stolen celebrity nude photos, which still exist as downloadable torrents across the internet.
2. #Leakforjlaw: a similar social media prank that encouraged women to post their nude photos in support of Jennifer Lawrence.
3. Google-bombing and poll-bombing: searching for the same terms or voting en masse, either to sabotage an online vote or to make a topic trend artificially. According to The Washington Post, 4chan once successfully got a swastika to trend on Google.
4. #Cutforbieber: a Twitter hashtag that encouraged Justin Bieber fans to cut themselves to demonstrate their love for the performer.
5. Gamergate: a movement created by 4chan users, ostensibly to expose ‘corruption’ in video game journalism. Gamergate has since wrecked the lives of several female gamers and commentators and spawned a larger discussion about the way the industry treats women.
6. Many fake bomb threats: a vast number of hoaxers have posted mass bomb and shooting threats to 4chan, prompting several arrests and evacuations.
7. The cyberbullying of Jessi Slaughter: one of the earliest high-profile incidents of cyberbullying, in which 4chan members sent death threats and harassing calls to an 11-year-old girl who would later make multiple suicide attempts.
8. Apple wave: an alleged feature of the iPhone 6, promoted by 4chan users on Twitter, that supposedly let people charge their phones by microwaving them. Needless to say, it was nothing but a hoax.
In November 2018, it was announced that 4chan would be split into two, with the work-safe boards moved to a new domain, 4channel.org, while the NSFW boards would remain on the 4chan.org domain.
Technically, 8chan didn’t replace 4chan. As mentioned previously, 4chan still exists, but after the platform played a considerable role in the Gamergate controversy, it banned the topic altogether, which resulted in many Gamergate affiliates migrating to 8chan instead.
Just like 4chan, 8chan, also called Infinitechan or Infinitychan (stylised as ∞chan), is an imageboard website composed of user-created message boards. Here again, anonymous users moderate their own boards, with minimal interaction from the site’s administration.
8chan was first launched in 2013 by Fredrick Brennan, who then handed the keys over to Jim Watkins, an American businessman who made his money hosting Japanese pornography, in 2016. The platform quickly became notorious for its links to white supremacist communities, as well as boards promoting neo-Nazism, the alt-right, racism and antisemitism, hate crimes, and mass shootings. 8chan was also known for hosting child pornography; as a result, it was removed from Google’s search results.
Although 8chan had hundreds of topic areas, the site was most notorious for its /pol/ board, short for ‘politically incorrect’.
Then, in August 2019, it was revealed that the man responsible for the mass shooting at a Walmart in El Paso, Texas on 3 August had posted a racist, anti-immigrant screed on 8chan an hour before the attack, in which he wrote: “Do your part and spread this brothers!”
It was the third instance that year of a shooter posting a manifesto to 8chan, but this time, the anonymous imageboard faced blowback from its service providers, including Cloudflare, which the site used for security, and Tucows, its domain host. Both dropped 8chan as a client. The platform went offline in August 2019, but things weren’t over yet.
In November 2019, 8chan rebranded itself as 8kun, and the extreme free speech and anonymity started all over again. While the new 8kun looks identical to 8chan, there are a few differences. 8kun is currently only accessible via the dark web, meaning that to reach it you need software like Tor, which allows users to browse the web anonymously and reach unindexed websites.
It’s not complicated to download the Tor browser, but it’s still an extra step that makes 8kun harder to find and likely to have fewer visitors than its predecessor, which had millions of users. Unlike 8chan, 8kun doesn’t have a /pol/ board, which is where the alleged perpetrators of the El Paso and Christchurch, New Zealand, massacres (as well as a shooter in Poway, California) first posted their manifestos before opening fire.
One evil simply replaced another: Q, the mysterious online figure at the centre of the QAnon conspiracy theory, picked up where the now-dead /pol/ board left off.
Q posted many messages on 8chan, which believers in the QAnon theory then spread across other social networks. QAnon believers are certain that the Mueller investigation was really a covert effort by President Donald Trump to arrest a vast Democratic paedophile ring.
When 8chan went down, Q’s breadcrumbs stopped too. Without 8chan, Q had no method of proving that its messages were coming from the same Q who had been posting all along. But as soon as 8kun was launched, Q started posting again.
When Watkins testified to Congress in September 2019, he wore a Q pin, making it clear that he is a supporter of the conspiracy theory. Brennan, who initially founded 8chan but is now a vocal critic of the website, alleged in a tweet that “the point of 8kun is Q, full stop. Every other board migrated is just for show.”
The point of 8kun is Q, full stop.

Every other board migrated is just for show. Tom Reidel, President, N. T. Technology admitted to me that their main concern is the QAnon traffic.

Jim doesn't care about you or your board, unless you are QAnon and your board is /qresearch/. https://t.co/qEIOAtTdMi

— Fredrick Brennan 🦝🔣📗 (@fr_brennan) November 5, 2019
And although it seems obvious why Q had nowhere else to go, many wondered whether the community of white-nationalist trolls that had gathered on 8chan for years would also move to 8kun. Many first flocked to Discord, the chat app for gamers that has also become a popular tool for hate groups to organise, chat, and indoctrinate new followers.
They shared links and invited interlocutors to join other groups across the internet, including on the encrypted messaging app Telegram. Others migrated to new online boards and forums that fit more specifically with their ideological leanings, like NeinChan, which attracts a particularly antisemitic crowd, or JulayWorld, which has a small fascist board that attracted some former 8channers.
But while some gave 8kun a try, many users had started to see anonymous message boards like 4chan and 8chan as outdated. 2019 marked a new era, one where extremists no longer felt the need to hide as much. They simply moved to Voat, Reddit and YouTube and created more cesspools of hate.
YouTube is the gateway to almost any video content you can think of. From compilations of otters holding hands to outdated DIY documentaries speculating on whether Michael Jackson is dead or not, there are very few topics you won’t be able to explore on the video-sharing platform. And while such access has undoubtedly made everyone’s life easier, the rise of YouTube has also accelerated the spread of conspiracy theories. Among the most famous is Frazzledrip, one that became so big it even got Sundar Pichai, Google’s CEO, in trouble. What is the Frazzledrip conspiracy theory about, and how did it take over YouTube?
The conspiracy theorists behind Frazzledrip believe that Hillary Clinton and former Clinton aide Huma Abedin were filmed ripping off a child’s face and wearing it as a mask before drinking the child’s blood in a Satanic ritual sacrifice. Supposedly, the Hillary Clinton video was later found on the hard drive of Abedin’s former husband, Anthony Weiner, under the code name ‘Frazzledrip’.
Frazzledrip is an offshoot of two other popular conspiracy theories: Pizzagate and QAnon. QAnon followers believe that a group of Satan-worshipping Democrats, Hollywood celebrities and billionaires runs the world while engaging in paedophilia, human trafficking and the harvesting of a supposedly life-extending chemical from the blood of abused children. Of course, this satanic clique includes Hillary Clinton.
QAnon in fact bases its beliefs on previously established conspiracy theories, some recent and some a millennium old. One of them is Pizzagate, the theory that went viral during the 2016 US presidential campaign, when rightwing news outlets and influencers promoted the idea that references to food and to a pizza restaurant in Washington DC, found in the stolen emails of Clinton’s campaign manager John Podesta, were actually a secret code for a child trafficking ring.
“The #Hillgramage 2018”. I’m shaking tonight with this drop. This would be an appropriate outcome. From #Epstein to #CometPingPong and everything in between, We’ve been waiting for this. We’re coming @HillaryClinton, I’m sharpening my pitchfork right now. #FRAZZLEDRIP #Pizzagate https://t.co/NFbCc4AZTt
— Hi, I’m Mike (@ImMikeRobertson) April 15, 2018
While some politicians do have questionable pasts, it should be noted that the three conspiracy theories mentioned above are completely unfounded. There is no video out there that depicts Hillary Clinton ripping off a child’s face or drinking blood, simply because no such thing ever happened. Yet believers in multiple conspiracy theories of the Trump era remain convinced that Hillary Clinton is a secret paedophile and murderer. And, up until the end of 2018, Frazzledrip videos had taken over YouTube completely. So what happened?
More than one billion hours’ worth of content is viewed on YouTube every single day. About 70 per cent of those views come from YouTube’s recommendations, according to Algotransparency, a website that attempts to track “what videos YouTube’s recommendation algorithm most often recommends.”
If you had typed ‘Frazzledrip’ into YouTube’s search bar at the beginning of 2018, you would have found thousands of videos about Hillary Clinton’s alleged murders and child trafficking. Users from Gab.ai, 4chan and 8chan had also flocked to YouTube to share their views, with users of those sites linking to YouTube more than to any other website, thousands of times a day, according to research from Data & Society and the Network Contagion Research Institute, both of which track the spread of hate speech.
Today, while it remains possible to find videos about this specific conspiracy theory if you’re willing to put in some effort, doing so has become much harder on the platform. That’s because, on 11 December 2018, while Google’s CEO Sundar Pichai was testifying before lawmakers in Washington about various matters, one lawmaker asked him how YouTube’s algorithms could be used to push conspiracy theories and stressed how urgently the problem needed to be addressed.
Congressman Jamie Raskin said, “The point at which it becomes a matter of serious public interest is when your communication vehicle is being used to promote propaganda that leads to violent events.”
Raskin added, “Is your basic position that [Frazzledrip] is something you want to try to do something about, but basically there is just an avalanche of such material and there’s really nothing that can be done, and it should be buyer beware or consumer beware when you go on YouTube?”
In other words, Raskin shed light on the fact that YouTube, which Google purchased for $1.65 billion in 2006, had a major conspiracy theory problem, one it technically still has today, only less so with Frazzledrip. And at the time, it looked like neither Congress nor YouTube was anywhere near solving it.
YouTube’s content algorithms determine which videos show up in your search results, in the suggested videos stream, on the homepage, in the trending stream, and under your subscriptions. When you go to the platform’s homepage, algorithms dictate which videos you see and which ones you don’t. The same applies when you search for something.
As common as this sounds in today’s digital world, YouTube’s algorithms had an extremism problem. Whether you were previously watching a right-leaning, left-leaning or even non-political video, YouTube’s algorithm would recommend increasingly extreme videos in order to keep your attention and push you to watch as many videos as possible.
As Zeynep Tufekci, an associate professor at the School of Information and Library Science at the University of North Carolina, explained in The New York Times in March 2018, YouTube’s advertising model is based on you watching as many videos as it can show you (along with the ads that appear before and during those videos).
Because of this algorithm, people who watched videos of Hillary Clinton and Bernie Sanders ended up being recommended videos featuring “arguments about the existence of secret government agencies and allegations that the United States government was behind the attacks of Sept. 11.”
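This feedback loop is easier to see in miniature. Below is a deliberately toy sketch of an engagement-maximising recommender of the kind critics describe: the catalogue, the weights and the numeric ‘extremity’ scale are all assumptions invented for this illustration, not a description of YouTube’s actual system.

```python
# A toy engagement-maximising recommender. Purely illustrative:
# the catalogue, weights and 'extremity' scale are invented for
# this sketch and do not describe YouTube's real algorithm.
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    extremity: float            # 0.0 = mainstream, 1.0 = fringe
    expected_watch_time: float  # minutes an average viewer stays

CATALOGUE = [
    Video("Mainstream news recap", 0.1, 3.0),
    Video("Heated political commentary", 0.5, 8.0),
    Video("Fringe conspiracy 'expose'", 0.9, 14.0),
]

def score(video: Video, last_extremity: float) -> float:
    # Objective: maximise watch time, with a bonus for content a notch
    # more extreme than what the user just watched (novelty keeps them
    # hooked) and a penalty for jumps that feel too abrupt.
    step_up = video.extremity - last_extremity
    drift = 10.0 * step_up - 40.0 * max(0.0, step_up - 0.45)
    return video.expected_watch_time + drift

def recommend_next(last_extremity: float) -> Video:
    return max(CATALOGUE, key=lambda v: score(v, last_extremity))

# Simulate a short session that starts on a mainstream video.
current = 0.1
for step in range(3):
    nxt = recommend_next(current)
    print(f"step {step}: '{nxt.title}' (extremity {nxt.extremity})")
    current = nxt.extremity
```

Even this crude model, greedily maximising a watch-time score that rewards ‘one notch more extreme’, walks a viewer from mainstream news to fringe conspiracy content in two recommendations, which is precisely the ratchet effect Tufekci describes.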
if this is recommended to me, imagine what goes to people who actually watch this sort of content pic.twitter.com/tPcOsdbM0r
— John Ganz (@lionel_trolling) December 11, 2018
Fast-forward to 2020, and while the problem has diminished somewhat, it remains present. It isn’t easy to balance a platform that claims to stand for freedom of expression with societal responsibility. It’s not illegal to believe in conspiracy theories, or to think that Michael Jackson didn’t die (he did) or that Hillary Clinton is a child-eating paedophilic cannibal (she’s not).
YouTube previously said in a statement, “False information is not necessarily violative, unless it crosses the line into hate speech, harassment, inciting violence or scams. We’ve developed robust Community Guidelines, and enforce these policies effectively.” Yes, YouTube’s algorithms have been tweaked slightly, but changing the way they work altogether would be bad for business.
While explaining that YouTube handles problematic videos on a case-by-case basis, Pichai said: “Freedom of speech is at the foundation of YouTube. As such, we have a strong bias toward allowing content on the platform even when people express controversial or offensive beliefs. That said, it’s not anything goes on YouTube.”
Yet today, Frazzledrip and Pizzagate on YouTube have simply been replaced by QAnon. Even though YouTube removes millions of videos on average each month, it is slow to identify troubling content and, when it does, is too permissive in what it allows to remain. We’ve seen the likes of 4chan, 8chan, 8kun and Voat make it to the top only to quickly crash and burn. What’s next after YouTube? Should we be worried?
Until we get an answer to that question, there is already something else we should all be worried about. The Pizzagate shooter had reportedly watched a YouTube video about the conspiracy days before driving to Washington from his home in North Carolina, telling a friend that he was “raiding a pedo ring… The world is too afraid to act and I’m too stubborn not to.” Let’s not forget that US citizens have guns, and most of them are not scared to use them, be that for the right reason or not.