YouTube’s new algorithm wants to keep us ethically addicted

By Laura Box

Published Oct 2, 2019 at 09:52 AM

Reading time: 3 minutes


YouTube’s newest algorithm is set to make the platform even more addictive—an exciting prediction for investors who expect the changes to increase profit by tens of millions—while simultaneously promising to reduce the echo chambers the platform has created. But can YouTube balance increasing addiction with its self-imposed ethical responsibilities? Probably not.

YouTube’s magnitude is almost inconceivable; it would take a month to watch the amount of content uploaded to the platform every two minutes. Since the service grew into the second largest search engine on the planet, inferior only to its parent company Google, public demand has increased for it to address the negative implications of its algorithm.

YouTube has come under fire for its tendency to create echo chambers; that is, once a user clicks on one type of video, the platform is more likely to recommend similar videos, reinforcing biases and making it likely to “push users into an immersive ideological bubble,” as academic Derek O’Callaghan suggests. This phenomenon has led some to credit the platform with a significant role in the rise of populism.
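To picture how that feedback loop works, here is a minimal, hypothetical sketch of a similarity-based recommender—not YouTube’s actual system, and the catalogue, topics and function names are invented for illustration. Ranking unwatched videos purely by how closely they match a user’s history means one click on a topic quickly fills the feed with more of the same:

```python
# Hypothetical sketch of an echo-chamber feedback loop (NOT YouTube's real algorithm).
from collections import Counter

# Toy catalogue: each video is tagged with a single topic.
catalogue = {
    "v1": "cooking", "v2": "cooking", "v3": "politics",
    "v4": "politics", "v5": "politics", "v6": "music",
}

def recommend(watch_history, k=1):
    """Rank unwatched videos by how often their topic appears in the user's history."""
    topic_counts = Counter(catalogue[v] for v in watch_history)
    unwatched = [v for v in catalogue if v not in watch_history]
    return sorted(unwatched, key=lambda v: topic_counts[catalogue[v]], reverse=True)[:k]

history = ["v3"]  # a single click on a politics video...
for _ in range(3):
    history.append(recommend(history)[0])

print(history)  # ...and the history fills up with more of the same topic first
```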

In 2014, O’Callaghan pointed out that these echo chambers can quickly lead users into bubbles of far-right extremism. As extremist content often contravenes hate laws, videos expressing these views are technically illegal in some of the countries where YouTube has recommended them. Despite these warnings, which emerged five years ago, research from Swansea University earlier this year found that YouTube is still significantly more likely to prioritise extremist recommendations for users who have previously interacted with such content, showing that little has changed.

Prolific author and YouTuber Hank Green argues that the echo chamber isn’t necessarily bad for content creators, writing that “our channels are our homes on the internet, and we need to figure out how to make them safe.” He explains that this environment allows YouTube communities to feel welcomed and engaged. But the unnerving flip side is that the communities of those expressing highly controversial views also feel welcomed and engaged. Alt-right YouTubers, previously believed to be confined to the shadowy corners of the internet, have been given a safe space by the platform and have amassed huge followings. Carl Benjamin (Sargon of Akkad), for example, has consistently made homophobic, sexist and racist comments, and has sympathetically interviewed people facing rape allegations, calling them “victims”.

Now, YouTube is claiming that the latest algorithm will reduce this echo chamber.

While this appears positive on the surface, there’s evidence that YouTube’s attempts to rectify past public frustrations have merely been band-aid solutions. After studies indicated that the majority of climate content on YouTube was made by climate deniers, public outrage pushed YouTube to prioritise videos from legitimate climate sources. Now, when users search “Climate change is a hoax”, legitimate, factual information and videos embarrassing climate deniers are among the first results to appear. Despite this, climate change denial videos still quickly rack up not only views but positive engagement. Within a week, one denial video (avoid the comment section if you value your mental health) received over 100K views and thousands of likes, showing that the echo chamber around these videos is truly alive and well.

This is the suspicious and irksome nature of YouTube. On the surface, it appears ethically conscious by hiding videos that spread false information, yet it’s apparent that alt-right and scientifically misinformed content makers are still somehow rewarded by its algorithm.

YouTube’s business model has been built around incentivising creators of shocking and controversial content. The platform has not only allowed misogynist, racist and homophobic content to stay online, but it has monetised this content. The platform rewards YouTubers, regardless of the harm their belief systems create, as long as their content is shocking and controversial enough to garner significant views.

YouTube’s CEO Susan Wojcicki recently declared that the company is taking steps to improve its platform in an acknowledgement of social responsibilities. She outlined four “Rs” it will use to do so: Remove (harmful content), Raise (authoritative voices), Reduce (recommendations of harmful content), and Reward (those with videos at a certain standard).

But she forgot YouTube’s fifth and most important R: Revenue.

For a platform that has built its success on the addictive nature of its current algorithm, it seems pretty unlikely that YouTube will jeopardise this model. It’s likely that the changes will simply provide surface-level solutions to placate the concerned public, while any genuine self-imposed social responsibility will be outweighed by the capital its algorithm craves.
