YouTube’s recommendation system is not set in stone. Every year, the company makes small changes to its ‘Up Next’ sidebar to reinforce its quicksand algorithm. In 2019, a research team from Google Brain, Google’s AI division, began testing a new algorithm that incorporated ‘reinforcement learning’ to build a ‘long-term addiction machine’. The new AI, called Reinforce, was designed to maximise user engagement over time by steering viewers into different parts of YouTube and gradually expanding their tastes, instead of simply suggesting videos similar to the ones they had already watched.
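The core idea behind reinforcement learning of this kind, the REINFORCE policy gradient, can be sketched in a few lines. The toy below is purely illustrative: the categories, watch times and learning rate are all invented, and YouTube’s actual system is vastly more complex. It shows only the mechanism the article describes: a policy that recommends, observes how long people watch (the ‘reward’), and shifts future recommendations towards whatever kept them watching longest.

```python
import random
import math

# Hypothetical video categories and the average watch time (the "reward")
# each recommendation earns. These numbers are invented for illustration.
CATEGORIES = ["gaming", "film", "news", "music"]
WATCH_TIME = {"gaming": 9.0, "film": 6.0, "news": 2.0, "music": 4.0}

def softmax(scores):
    # Turn raw preference scores into recommendation probabilities.
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def train(steps=5000, lr=0.05, seed=0):
    rng = random.Random(seed)
    scores = [0.0] * len(CATEGORIES)  # learnable preference per category
    for _ in range(steps):
        probs = softmax(scores)
        # Sample a recommendation from the current policy.
        i = rng.choices(range(len(CATEGORIES)), weights=probs)[0]
        reward = WATCH_TIME[CATEGORIES[i]]
        # REINFORCE update: nudge the policy so the chosen action becomes
        # more likely, in proportion to the reward it earned.
        for j in range(len(scores)):
            grad = (1.0 if j == i else 0.0) - probs[j]
            scores[j] += lr * reward * grad
    return softmax(scores)

probs = train()
best = CATEGORIES[probs.index(max(probs))]
print(best, [round(p, 3) for p in probs])
```

Run long enough, the policy concentrates almost all of its probability on the category with the highest watch time, which is exactly the dynamic critics worry about: whatever holds attention gets recommended more, regardless of what it is.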
Reinforce was YouTube’s most successful launch in two years. At an AI conference, Minmin Chen, a Google Brain researcher, admitted that sitewide views increased by nearly 1 per cent—which on YouTube’s scale amounts to millions more hours of watch time and additional advertising revenue. She added that the new algorithm was already starting to alter users’ behaviour.
Enter YouTube’s far-right creators. Specialising in content that invites cross-genre exploration, these YouTubers benefitted tremendously from the algorithm changes, which unwittingly facilitated far-right radicalisation. They cleverly called out supposed left-wing biases in videos reviewing the latest Star Wars episode and ranted about feminism while streaming Call of Duty—both moves attempting to ‘red-pill’ (internet slang for converting someone to far-right beliefs) movie buffs and gamers through niche content.
On a platform where recommendations drive around 70 per cent of watch time, these entertainers perfected the art of tumbling viewers down a rabbit hole of far-right content. They built their audiences with a subversive yet satirical take on leftist issues—fostering an addictive experience that shuts out all other views.
An investigation by The New York Times found the algorithm to be a useful recruiting tool for far-right extremist groups. Bellingcat, an investigative journalism website, found YouTube cited as the most frequent cause of members’ red-pilling in a far-right chat room. An analysis of around 30,000 Twitter accounts affiliated with the alt-right by VOX-Pol, a European research group, found accounts linked to YouTube more often than to any other site.
“They’re not selling politics as politics but conservatism as a lifestyle brand,” stated Ian Danskin, creator of Innuendo Studio, in a video titled ‘The Alt-Right Playbook: How to Radicalize a Normie’. He says that the practice of abandoning progressive principles and embracing conservatism is sold to increasingly pigeonholed viewers as “something that will make them happy.”
In his video, Danskin breaks down the steps taken by these creators into five main actions: identify the audience, establish a community, isolate, raise their power and give them a mission. Comments like “This was me two years ago,” “Can confirm this is how it works,” and “Damn dude, this hits hard” suggest his point lands. Danskin is an active member of BreadTube—a collective of left-wing YouTubers united by a shared goal of combating the far-right and deradicalising viewers.
The label is often used interchangeably with LeftTube. Videos made by these crowdfunded creators mimic the aesthetics of right-wing YouTube by mixing politics with other mainstream interests like films, video games, popular culture and philosophy. Rather than reacting to far-right ideals with outrage, creators in this movement adopt a theatrical yet didactic style to convey leftist thought, countering far-right propaganda with a laid-back, eye-rolling approach.
Coined on Reddit, the term BreadTube comes from Peter Kropotkin’s The Conquest of Bread, a book on anarcho-communism. The community is highly decentralised, mostly running on mutual cameos and shout-outs between creators. Major figures associated with BreadTube are ContraPoints, Philosophy Tube, Lindsay Ellis and Hbomberguy. The label imposed on these creators is highly debated, however, with these YouTubers identifying with it to varying degrees.
The core of BreadTube’s strategy is to hijack the YouTube algorithm, bursting its political bubble and fostering a space for deradicalisation. BreadTubers use the same titles, tags and descriptions as far-right YouTubers so that their content is recommended to the same audience. In some cases, these creators comment directly under far-right videos to increase their own exposure and redirect traffic.
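Why does copying titles and tags work? Recommendation systems routinely use metadata overlap as one signal of relatedness. The sketch below is a toy, with invented tags and a simple Jaccard similarity standing in for YouTube’s opaque and far more sophisticated ranking; it only illustrates why a BreadTube video sharing most of its tags with a far-right video looks, to a naive similarity measure, like a close neighbour.

```python
# Toy illustration: Jaccard similarity between tag sets as a stand-in
# for a recommender's relatedness signal. All tags are invented.

def jaccard(a, b):
    """Share of tags in common, out of all tags used by either video."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

far_right_tags = {"gaming", "sjw", "feminism", "star wars", "red pill"}
breadtube_tags = {"gaming", "sjw", "feminism", "star wars", "philosophy"}
unrelated_tags = {"cooking", "recipes", "baking"}

# The BreadTube video overlaps heavily with the far-right one...
print(round(jaccard(far_right_tags, breadtube_tags), 2))  # 0.67
# ...while an unrelated video shares nothing.
print(round(jaccard(far_right_tags, unrelated_tags), 2))  # 0.0
```

Under any metadata-similarity signal of this kind, the two politically opposed videos end up in the same recommendation pool, which is precisely the effect BreadTubers are exploiting.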
The success of BreadTubers can be gauged by the likes and comments amassed under their videos, and the movement has been cited by academics as a case study in decentralisation. With contested claims that the YouTube algorithm increasingly promotes far-right ideals, BreadTube is definitely a step towards beating the far-right at their own game. As Danskin himself puts it, “One thing we have that the alt-right doesn’t is hope.”
China’s interference in Western democracies through propaganda has been a recurrent subject of discourse over the last few years, often labelled a major worldwide threat. And now, social media and the internet are both argued to be big contributors to that threat.
In recent years, there has been much speculation about whether the Chinese government is using internet personas and content creators to produce pro-China content, specifically on YouTube. But what incentive would the Chinese government have for this, and just how common is political propaganda within internet culture?
In June 2019, Canadian YouTuber and The Washington Post’s Global Opinions Columnist J.J. McCullough posted a video on YouTube, titled They tried to get me to post Chinese propaganda. In the video, he claims to have received an email from a man named Franco, saying the following:
Just watched your videos, and we thought it would be a great to place our content, we wonder if you want to help us upload this video to your YouTube channel?
And for that, we will support your YouTube channel for $500
Pls feel free to email us back if you have any questions or requests.
McCullough goes on to explain that the video attached to the email featured propaganda against Falun Gong, a religious movement that the Chinese Communist Party has been trying to eliminate since the late 90s. What was most shocking for McCullough, though, was the lack of research behind Franco’s email: McCullough has previously written a number of articles for The Washington Post criticising the Chinese government.
McCullough followed up on the email to investigate the story further, but explained to his viewers that he had no intention of actually taking part, criticising the “endless cycle of disinformation” created by political propaganda.
Screen Shot spoke to Hannah Bailey, DPhil student in Social Data Science at the Oxford Internet Institute, whose research specialises in China’s use of state-sponsored digital disinformation. When asked about what may motivate a government like China’s to seek content collaborations and agreements with influential internet personas globally, she explains that “a lot of authoritarian states have a vested interest in shifting international narratives.”
“They’re trying to really position themselves in a positive light, because they are quite aware of the fact that international audiences don’t have a particularly favourable opinion of China,” Bailey continues.
A recent investigation conducted by The Times claims that British YouTuber duo Lee and Oli Barret, known for their pro-China content, have had some of their videos funded by China Radio International, a state-owned broadcaster. Some of the duo’s videos include titles like Western media LIES about China, There is SO MUCH we can Learn from China // 中国有太多值得我们学习的地方, and We Can Live a Much Better Life in China // 在中国我们可以过上更好的生活, to name a few.
They have since responded to The Times’ investigation on their channel, denying funding from China Radio International but admitting to making branded sponsored content, as most influencers do, which they say is not affiliated with the government.
In one particular video, titled Xinjiang – Let’s Talk About It, Lee Barret says he suspects the Xinjiang camps are used as re-education facilities, although it is important to highlight that he discloses he has no evidence of this and that it is purely speculation. The video, however, was published at a time when the Xinjiang camps were receiving global media attention over the treatment of Uighur Muslims, with multiple reports disclosing information about forced labour.
The speculation about whether these YouTubers are promoting pro-China content and propaganda is therefore not uncalled for; it is an accusation they face frequently. Lee and Oli Barret’s case is also not unique: they are part of a much larger group of influencers known to create distinctly ‘pro-China’ content.
When asked what motives these influencers may have behind the creation of their content, Bailey explains that “maybe these YouTubers are more unaware of the problems in engaging with and promoting the Chinese government in this way” but instead “just see this as something to achieve monetary outcome.”
“The content of our channel is completely independent. We choose what content we will produce, we decide what we are going to say in that content, we decide what we are going to publish on our channel,” Lee Barret explains in one of their videos. It is, of course, difficult to pinpoint for certain whether they are being funded, and by whom, as these are accusations they frequently dispute. But there is a lot of existing evidence suggesting that the Chinese government is using the internet to spread propaganda on a global scale.
“China’s kind of new to the international influence operations. They only recently started using a lot of international social media platforms like YouTube and Twitter in the last 5 or so years. But they definitely made their presence on their own platforms through various forms,” clarifies Bailey.
She goes on to explain how the tools used to propagate this “have become really sophisticated.” China, for example, is notorious for its 50 Cent Army, a group of state-backed internet commentators (whose numbers are estimated to range from 500,000 to two million) reportedly hired to manipulate public opinion online for the benefit of the Chinese Communist Party. Similarly, just last year, Twitter reportedly purged 170,000 accounts linked to a Chinese influence campaign. The idea of using YouTubers to spread propaganda online is therefore not that surprising.
But why would anyone try to cultivate propaganda through YouTube? Perhaps it is an attempt to reach a younger demographic. It also highlights just how influential the internet can be (especially when it comes to YouTube and its infamous conspiracy theories) and how many of us are swayed by it.
We live in a digital age, and more precisely in a time of severe online misinformation, driven largely by social media and the speed at which content spreads. So remember to be critical of everything you consume online, be that Chinese propaganda or not.