When rioters stormed key government sites in the Brazilian capital on Sunday 8 January 2023, media outlets and netizens alike couldn’t help but recognise the stark similarities between the chaos they were witnessing and the violence put on display during the storming of the US Capitol in January 2021. Widely deemed the blueprint for future political insurrection, the Capitol riot’s origin, escalation and materialisation all came down to one thing: social media. Now, watching the situation in Brazil unfold, we’re all asking ourselves the same question: how do we combat extreme radicalisation online?
Much like the far-right political unrest that followed former President Donald Trump’s defeat in the 2020 election, the Brazilian rioters’ primary motivation was to take back power for their chosen leader, former President Jair Bolsonaro. A number of citizens have been less than happy with the current president, Luiz Inácio Lula da Silva, more commonly referred to as Lula, who defeated Bolsonaro in a highly contested election towards the end of 2022.
According to The Guardian, thousands of Bolsonaro supporters targeted the nation’s Congress, Presidential Palace and Supreme Court. These rioters were then removed after security forces intervened. Leftist President Lula announced a federal security intervention in Brasília—bringing policing under the control of the central government—lasting until 31 January after capital security forces were initially overwhelmed by the invaders.
In an eerily accurate echo of Trump, far-right former President Bolsonaro somewhat discouraged the violence without explicitly condemning the mob. Rather than using his political clout to shut down the riots, he instead took the opportunity to make a divisive dig, tweeting that while the day’s events were wrong, the leftist uprisings of 2013 and 2017 had not been peaceful either.
In regard to global response, numerous national politicians as well as world leaders have categorically denounced the insurrection and expressed dismay at this blatant attack on democracy. Colombian President Gustavo Petro wrote on Twitter: “All my solidarity to @LulaOficial and the people of Brazil. Fascism has decided to stage a coup. It is urgent for the OAS (Organisation of American States) to meet if it wants to continue to live as an institution.”
US President Joe Biden also made his opinion clear, condemning the riots in a post on Twitter:
I condemn the assault on democracy and on the peaceful transfer of power in Brazil. Brazil’s democratic institutions have our full support and the will of the Brazilian people must not be undermined. I look forward to continuing to work with @LulaOficial.
— President Biden (@POTUS) January 8, 2023
Bolsonaro is currently residing in Florida—another thing the Brazilian conservative has in common with Trump, aka the tycoon of chaos. After facing a number of investigations which pertain to his time in office, Bolsonaro seemingly sought refuge in sunny Orlando. Presumably, the former President fancied a ride on Space Mountain before he faced any political repercussions back home.
Shortly after the violence began, US representative Alexandria Ocasio-Cortez (AOC) took to Twitter, demanding that the US cease granting Bolsonaro refuge:
Nearly 2 years to the day the US Capitol was attacked by fascists, we see fascist movements abroad attempt to do the same in Brazil.
We must stand in solidarity with @LulaOficial’s democratically elected government. 🇧🇷
The US must cease granting refuge to Bolsonaro in Florida. https://t.co/rzsZl9jwZY
— Alexandria Ocasio-Cortez (@AOC) January 8, 2023
It’s evident that the far-right took umbrage at the recent transfer of power and, having potentially taken note of the events of January 2021, took it upon themselves to try and force the government’s hand, a strategy that very rarely yields results. The question that still stands: how does political radicalisation and unrest online manifest itself as real-life insurrection?
It’s no surprise that social media plays a gargantuan role in online movements, cultural discussions and political radicalisation. However, it’s only now that we’re seeing its sheer influence regularly played out on the world stage.
In the most recent case of Brazil, The Washington Post explored the ways in which social media directly drove the far-right mayhem that took place. Analysing social media insights from Brazilian researchers, the news outlet reported that a “war cry party” sentiment had been circulating on a number of platforms such as TikTok and Twitter, as well as other far-right-specific channels.
According to the outlet, Brazilian researchers said that a counter-narrative had begun to circulate among Bolsonaro supporters on Sunday 8 January, blaming the Lula government and members of his party for infiltrating peaceful, democratic demonstrations in order to turn the country against Bolsonaro’s supporters. The message echoed the 6 January insurrection, during which many Trump supporters blamed left-wing activists for the violence.
In fact, Brazilian analysts had even caught wind of disinformation dominating social media in the months leading up to Bolsonaro’s election defeat. On TikTok, researchers found that five out of eight of the top search results for the keyword “ballots” were for terms such as “rigged ballots” and “ballots being manipulated,” while the Portuguese translation of “Stop the Steal” flooded people’s timelines.
TikTok has proven itself to be a platform often complicit in promoting and propping up radicalisation, particularly among young boys and men. You only have to consider the popularity of hyper-masculine misogynist and alleged sex offender Andrew Tate to recognise the significance of the app’s reach.
Another key element of this conversation is the omnipresent role of Twitter. While it may be far more entertaining to concentrate on the petty and laughable aspects of Elon Musk’s recent social media takeover, a far more sinister theme is beginning to emerge.
The billionaire and Tesla CEO recently announced that he’d be lifting the ban on political advertisements on Twitter, thereby aligning the platform with Meta’s Facebook which also allows paid political ads. As reported by Politico, it’s the latest in a series of Musk moves that have reversed policies that were put in place under former CEO and co-founder Jack Dorsey.
Dorsey previously banned all political ads in November 2019, saying in a Twitter thread that paying for political reach “has significant ramifications that today’s democratic infrastructure may not be prepared to handle.”
Far-right discourse nested itself on Twitter a long time ago and seemingly has zero plans of changing course. With political extremism and radicalisation at an all-time high, it’s beyond worrying to consider how social media may have officially become a catalyst for terror.
TikTok, a place of algorithmic beauty. A social media site that knows exactly what you want, when you want it. Barely any searching is done on the platform, all you’ve got to do is simply lean back and let your For You Page (FYP) take the reins. But what happens when TikTok’s AI-powered feed starts pushing you down the wrong path?
What was once a service for watching cats go viral and content creators lose their shirts is now a place that pushes toxic behaviours, fake news and problematic right-wing rhetoric. Without even realising it, you’ll find yourself radicalised.
This is one of the very real dangers of TikTok, an app that takes care of everything for its users. All you need to do is prepare for a thumb cramp from scrolling too much. While it’s undeniable that the platform has a lot to offer, it can also promote problematic views and ideas targeted at impressionable minds. To shed light on just how easy it is for someone to end up on the wrong side of TikTok, I took it upon myself to investigate, armed with nothing more than a burner account and some hours to kill.
So, how willing is TikTok to show me more radicalising, inflammatory content just to keep me scrolling? Short answer? Very.
Let’s start with some basic ground rules. A fresh account is a must; I’m using one that should carry very little information about me from cross-site data collection and cookies. Next, I’m planning on scrolling through my FYP only, seeing how far I can get purely by interacting with this feature’s algorithm. That means searching for toxic or alt-right phrases in the app’s search bar is off-limits, as I want to keep the process as organic as possible.
My ideal end result is to be fed something on my FYP that’s either incredibly inflammatory or potentially dangerous. I’d also like to see if the algorithm does in fact show me a consistent stream of right-wing content: more specifically, offensive content that incites hate and perpetuates a very narrow and exclusive worldview.
If I can get to a spot like that on TikTok, I’ll call it a job well done. So let’s get into it.
My first step into this experiment consists primarily of an unfiltered FYP with an algorithm that doesn’t know me yet—that doesn’t quite understand what it is I’m looking for. ‘Satisfying’ sand-cutting videos, stitched with other people’s content, and daytime television clips plague my bottomless feed. I can’t seem to find a hook yet. An anime blind unboxing video perhaps?
Side note: anime is often co-opted online by alt-right communities and suspect individuals—so maybe this is a good way in. I press like and watch the unboxing clip more times than I’d like to admit.
@joey_blindbox_shop Replying to @Toy DreamWorks so cool and cute#joeydiy #blindbox #unboxing #asmr #asmrunboxing #unboxingtoys #legend #traditional #dogfigure
♬ original sound - JOEYDIY
I make sure to scroll deliberately so that I can curate the page a bit more with the “not interested” option. TikTok seems to think I’m very interested in police clips. I feel like I’m onto something with this content, so I start chasing it even more.
Almost immediately, bingo. I’m shown very shallow British military propaganda followed by a short sound bite from Piers Morgan’s Donald Trump interview. This is probably going to be easier than I thought. I make sure to engage with the comments—liking and saving the video as I hover over the clip. I want to show the algorithm that I’m interested and engaged in the content it’s feeding me.
A new day begins and I’m fired up and ready to get scrolling. The first thing I see? An esoteric montage of nihilistic, poignant clips that begins with a segment from Joe Rogan’s podcast. Rogan, of course, is renowned both for his personal controversies and for his by-proxy endorsement of the overtly problematic individuals he features on his Spotify podcast. This video is surprisingly intense, but I think it’s a step in the right direction.
The next thing on the feed is an equally doom-laden montage highlighting the dangers of mass media consumption. I’m getting warmer.
As I continue scrolling, I’m hit with an essay video about Ted Kaczynski, also known as the Unabomber, a former mathematics professor turned domestic terrorist who was responsible for the deaths of three people before his capture in 1996. His manifesto has become something of a left-wing obsession in recent years, thanks to the man’s distaste for the advancement of society and his desire to return to a simpler, more primitive life.
Oh dear, it seems like we’re going the wrong way now. This is radicalising content, sure, but from the other end of the spectrum. TikTok now knows I’m ready to take a metaphorical leap off the edge, but it’s not sure which building yet. I was looking for right-wing propaganda and brainwashing; instead I’ve found myself bathing in radical leftism. I suppose this is an early success of sorts for the experiment, but not exactly where I wanted to be. I notify the app that I’m not interested and move on.
Here we go—my FYP has organically shown me an Andrew Tate video. I never thought I’d be so happy to see this bald bastard’s face on my feed. The clip shows the former kickboxer-turned-phoney philosopher talking about how dangerous London is, a typical talking point of Tories. With 100,000 likes on the clip, I engage with it as much as I can and move on.
What follows is a slew of high-profile US Marine compilation videos, possible Russian psychological operations (PSYOPs) inflaming the current unstable global climate, more Tate clips (one from an account called MasculineInfluence), another Morgan appearance, this time talking about fat shaming, and a video titled “pure Brexit tackles.”
I take my time watching a clip of the overtly sketchy Hustlers University founder talking about how great England used to be. He stokes the old colonial flame as much as he can—it’s quite sad to watch honestly. But it’s all part of the mission here.
About 70 per cent of my FYP is now some form of Tate video. The former kickboxer took the world by storm over the summer as his problematic opinions and thoughts reached an impressionable generation, and he continues to do so with his reinstated Twitter account following Technoking Elon Musk’s takeover of the platform. But is this enough of a radicalisation?
It’s close, but I’m looking for something stronger, something that could push someone a step too far.
Sigma male content follows, a North Korean propaganda video next. Wait, what? I’m actually in awe of the mix of content I’m being fed on this account. On what basis is North Korea getting through in the first place? This video has 2.4 million likes, by the way.
Swiping on, I’m starting to get subtle hints of racism. A football video appears comparing female footballers’ celebrations to their male counterparts’, the joke being that the male celebration shown is someone doing a Nazi salute after scoring. Comments supporting the salute can be found underneath the video. I save and interact.
I seem to be getting closer to a darker, more extreme end of the platform as more ambiguous, racially charged and edgy jokes begin to surface. It’s getting exhausting to sift through this much trash, but I’m determined.
I’m also seeing videos imploring Britain to return to the way it was as a colonial power. Clearly nobody did too well in history class, given that we were one of the most oppressive powers at play during the time of colonialism. It’s obvious that the people engaging with this content are only steps away from a radical edge.
Steven Crowder clips from his ‘Change My Mind’ series show up. Then religious content begins to creep through. TikTok thinks I’m a young, right-wing, religious conservative at this point. The red flags are firmly planted. I’m now desperately searching for the next clip that will bring the experiment home.
A video claiming Israel is the “one group you shouldn’t talk about,” praising Kanye West for speaking up. This upsets me, but I’m at the door of anti-semitism now and I’m ready to see myself in.
Looks like I’ve made it. Videos talking about a global cabal that controls the world order are slowly seeping in—a cabal that wants to take away your personal freedoms and warp your perceptions. The next clip is one pointing out the fallacies of the LGBTQ+ movement. Comments are turned off on this one. Next up is an inflammatory video calling left-leaning liberal people the real fascists. Accounts promoting an “escape from the matrix” are more abundant than ever.
I do come across one that actually makes me laugh out loud. It’s a conspiracy video talking about nuclear explosions on Mars millions of years ago. It’s pretty out there and possibly taking me off track but I can’t help but watch it through.
Finally, a genuine nationalist and racist appears on my feed. It’s an account that looks a lot like a PSYOP, honestly: someone masquerading as a nationalist’s ideal woman. The TikTok that ended up on my FYP is a slideshow that opens with a blurry selfie, followed by screenshots of right-wing news and world happenings. “Europe is finally waking up,” she says. Her account is filled with videos warning of a “great replacement,” a conspiracy theory peddled by fascists and racists. I’ve finally seen something truly disgusting, and it feels awful.
After two days of consistent scrolling, I can successfully consider myself radicalised on TikTok. Every video on my feed is some form of right-wing media or targeted harassment aimed at the LGBTQ+ community and other minority groups—many of the clips crossing the personal boundaries I’d set myself. A fresh account with a scrolling habit leaning towards right-wing, male media has been completely flooded with this type of dangerously inflammatory content. I can’t help but worry for anyone who has fallen into a hole like this in earnest. It’s a dark, scary, and worrisome place to be.
Something I’ve neglected to mention is how much of this content is curated and published with one specific target in mind: lonely and insecure men. This is particularly evident when we consider creators such as Tate or Hamza, who are also hyper-critical of femininity in men. It’s easy to see how many men can gravitate towards this kind of content, especially given societal expectations or their own insecurities. I struggled a lot with my own masculinity and femininity growing up, and may very well have been swayed by these types of creators had I been exposed to them at an impressionable age.
Things are always changing and progress is being made, but it’s clear that so many of these people feel alone. That loneliness leads them to seek out community among like-minded individuals, a community that quickly radicalises itself when no other voices are allowed in.
The Nazis and racists are one thing, but it’s hard not to feel bad for the men being targeted by so much of this hyper-masculine, Tate-esque content. From what I’ve witnessed in those clips’ comments sections, these are people who have been let down in life, who have faced difficulties, and who have carried resentment and grudges through their youth and into adulthood.
And this only highlights the platform’s insidious nature. Individuals are essentially at the mercy of an algorithm that prioritises the attention economy, and at the mercy of their own loneliness and resentment.
It also makes me worry about impressionable minds, the children being exposed to this platform. It’s hard to take a step back and make assessments for yourself when it feels like the whole world is projecting the same thing you’re already thinking. The world your FYP curates for you offers very little space for critical thought or the freedom to challenge the supposed ideals. There are glimmers of hope in the comment sections, but they’re often buried among the trash.
A quick search shows that, though TikTok has support in place for guardians, it does little to acknowledge the problem it poses to younger generations. The video-sharing platform allows parents to set up restrictions and boundaries, but content manages to slip through all the time. If I ever feel brave enough to repeat this experiment, I think I’d try it with an age-restricted account, to really dig into how problematic content filters through all the same.
In just two days, I’d managed to completely change the face of my FYP simply by guiding the algorithm in a certain direction. In the process, I’d also subjected myself to some truly disgusting content, with TikTok showing no signs of slowing the pump unless I decided to steer away from it myself. The algorithm picks up on something you favour and drip-feeds it to you, regardless of how negative or hateful that content is. I think I’ve shown just how easy it is to be subjected to this sort of material: if you’re above the age of 16, your feed is practically unfiltered and your account unprotected.
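To make that feedback loop concrete, here’s a minimal, purely illustrative Python sketch of an engagement-weighted recommender. Everything in it, from the category labels to the `boost` factor and the `simulate_feed` helper, is an assumption invented for this example rather than anything drawn from TikTok’s real system; it simply shows how a feed that boosts whatever you linger on can collapse towards one type of content within a couple of days of scrolling.

```python
import random

# Toy content categories standing in for the kinds of clips described above.
# The labels, the boost factor and the update rule are all invented for
# illustration; this is NOT TikTok's actual recommendation system.
CATEGORIES = ["cat videos", "unboxing", "military", "tate-style", "conspiracy"]

def simulate_feed(days=2, videos_per_day=200, boost=1.3, seed=42):
    """Simulate a naive engagement-weighted feed over a few days of scrolling.

    Every category starts with equal weight. Whenever the simulated user
    'engages' with a video (likes, saves, rewatches), that category's weight
    is multiplied by `boost`, so the next draw is skewed further towards it.
    """
    random.seed(seed)
    weights = {c: 1.0 for c in CATEGORIES}

    # Mimic the experiment: the user deliberately engages with two categories.
    engages_with = {"military", "tate-style"}

    for day in range(1, days + 1):
        shown = []
        for _ in range(videos_per_day):
            # Draw the next video in proportion to the current weights.
            pick = random.choices(CATEGORIES,
                                  weights=[weights[c] for c in CATEGORIES])[0]
            shown.append(pick)
            if pick in engages_with:
                weights[pick] *= boost  # engagement feeds straight back into ranking
        share = sum(1 for c in shown if c in engages_with) / len(shown)
        print(f"Day {day}: {share:.0%} of the feed came from the engaged-with categories")

simulate_feed()
```

Even in this crude model, the two favoured categories swallow most of the simulated feed by the second day, roughly the “70 per cent Tate” tipping point described earlier: no malice required, just a ranking signal that rewards whatever keeps you watching.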
Anyone can tumble down the rabbit hole I found myself in. This experiment was a strange, dark and scary journey into a side of TikTok I honestly don’t want to revisit. Now, it’s time to nuke my dummy account and never set foot in this world again.