Internet trolls are not a new phenomenon; in fact, they’ve been around since the late 1980s. However, in an era where cancel culture has become synonymous with cyber-bullying, we’re witnessing the rise of a new genre of trolls altogether: ones backed by a company’s payroll to infiltrate, manipulate and control online conversations surrounding its rivals. Welcome to the nasty little business of professional trolling.
Professional trolling can be summed up as the coordinated effort to spread online ‘disinformation’: a subset of misinformation that is deliberately deceptive and misleading. The practice offers governments, political parties and tech firms a fast and cheap way to weaken their rivals, and it does so by employing people whose sole job is to carry out such trolling activities.
Typically recruited from developing countries, these employees are responsible for posting troll comments across social media forums. They work in organised groups, setting up call centre-like operations and following the companies and influencers targeted by their employers. Masquerading as ‘one of us’, they then infiltrate the comments section and manipulate the conversation by inundating social media with conspiracy theories.
Subjecting other users to a kind of social media conditioning, these professional trolls distort the truth by copy-pasting curated talking points countless times until they pass as truth. These fabrications are then picked up by other users, who end up liking and even sharing them within their own social circles.
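To make that mechanic concrete: one crude way copy-paste amplification can be surfaced is to normalise each comment and count how many distinct accounts push the exact same text. The Python sketch below assumes nothing more than a dump of (account, comment) pairs; the normalisation rules and the five-account threshold are illustrative assumptions, not a method described by any of the sources cited here.

```python
import re
from collections import defaultdict

def copy_paste_clusters(comments, min_accounts=5):
    """Group comments that are identical after light normalisation.

    `comments` is an iterable of (account, text) pairs. A talking point
    posted verbatim by many distinct accounts is a rough signal of
    coordinated copy-pasting rather than organic conversation.
    """
    clusters = defaultdict(set)
    for account, text in comments:
        # Lowercase, strip punctuation and collapse whitespace so that
        # trivially edited copies still land in the same bucket.
        key = re.sub(r"[^\w\s]", "", text.lower())
        key = re.sub(r"\s+", " ", key).strip()
        clusters[key].add(account)

    # Keep only talking points pushed by an unusually large set of accounts.
    return {text: accounts for text, accounts in clusters.items()
            if len(accounts) >= min_accounts}
```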
In many ways, professional trolling can be compared to what is happening in the US’ news landscape with the infiltration of ‘pink slime’ journalism into the country’s local news.
Comment sections can often be the best part of a news story, adding a layer of social dialogue onto traditional journalism. It is for this reason that many readers head over to engage with other like-minded individuals. However, that same space can also prove devastating.
A study published in The Journal of Computer-Mediated Communication asked participants to first read a news post on a fictitious blog explaining the potential risks and benefits of a new product and then head over to the comments section to engage with other readers. The sample comments posted under the article then exposed participants to both civil and uncivil opinions. The results were surprising and disturbing at the same time—uncivil comments not only polarised readers but often impacted a participant’s interpretation of the story itself.
A digital analyst at The Atlantic conducted a further study to analyse this polarisation, only to arrive at much the same findings. Participants who were exposed to negative comments were more likely to judge the quality of the article harshly and, regardless of its content, to doubt the truth of what it stated. These findings essentially make one believe that we are all just one comment away from losing our faith in humanity.
Last month, Facebook uncovered a massive ‘troll farm’ in Albania, linked to an Iranian militant group. The operations of the group had the “hallmarks of a typical troll farm,” which Facebook defines as “a physical location where a collective of operators share computers and phones to jointly manage a pool of fake accounts as part of an influence operation.”
“It looked like a team of trolls hot-desking,” tweeted Ben Nimmo, Facebook’s Global Influence Operations Threat lead. Nimmo noted that the trolling operation resembled a full-time job, running from 6 am to 11 pm “with a break around lunchtime.”
Just last week, Digital Africa Research Lab and BuzzFeed News uncovered a large troll operation in Nigeria. Run as a collaboration between a Nigerian PR firm and a UK-based nonprofit organisation, the operation paid social media influencers in Nigeria to tweet twice a week in support of Alex Saab, a Colombian businessman accused of money laundering in the US. Following the report, Twitter went on to suspend more than 1,500 accounts for manipulating #FreeAlexSaab.
“Operations like these tend to be about making noise,” tweeted Nimmo. “They create the impression that a viewpoint is more popular than it is.” Although secretive in nature, such professional troll farms tend to share key attributes, which help researchers and tech platforms sniff them out almost instantaneously.
The first attribute is a shared physical location. “Troll farms are often propped up by a party that will pay for high-speed internet and computers that together power the network,” noted Axios. “It’s easier to finance and monitor operations that physically sit close together.”
Next up is the time frame: content from troll farms tends to be posted during work hours, with breaks for lunch, as Nimmo noted earlier. The last attribute that gives away the presence of such operations is ‘hyper-targeted messaging’. “Posts from troll farms tend to zero in on a certain political message whereas most ordinary users post about an array of topics,” Axios concluded.
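As a rough illustration of how the second and third attributes might translate into detection signals, here is a minimal Python sketch that flags accounts whose posts fall almost entirely inside a shared ‘shift’ and revolve around a single hashtag. The field names and thresholds are assumptions made for the example; they are not the criteria Facebook or DFRLab actually publish.

```python
from collections import Counter, defaultdict

def troll_farm_signals(posts, shift=(6, 23), min_shift_share=0.95, min_topic_share=0.8):
    """Flag accounts showing troll-farm-like posting patterns.

    `posts` is a list of dicts with an 'account' name, a 'timestamp'
    (a datetime object) and a list of 'hashtags'. Thresholds are
    illustrative, not empirically derived.
    """
    by_account = defaultdict(list)
    for post in posts:
        by_account[post["account"]].append(post)

    flagged = {}
    for account, items in by_account.items():
        # Signal 1: activity confined to a shared work shift (e.g. 6 am to 11 pm).
        hours = [p["timestamp"].hour for p in items]
        shift_share = sum(shift[0] <= h < shift[1] for h in hours) / len(hours)

        # Signal 2: hyper-targeted messaging, i.e. one hashtag dominates the output.
        tags = Counter(tag for p in items for tag in p["hashtags"])
        topic_share = tags.most_common(1)[0][1] / sum(tags.values()) if tags else 0.0

        if shift_share >= min_shift_share and topic_share >= min_topic_share:
            flagged[account] = {"shift_share": shift_share, "top_topic_share": topic_share}
    return flagged
```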
Professional trolling fosters a “symbiotic relationship between companies eager to weaken rivals and developing nations eager for cash.” In an interview with Axios, Carroll, a 20-year veteran of the FBI, admitted to seeing such trolls being employed from places like Vietnam, the Philippines, and Malaysia. “Where there’s a lot of cheap labor and little oversight,” he added.
“We’re also seeing a lot more troll operations being picked up in Africa,” said Jean le Roux, a researcher at the Atlantic Council’s Digital Forensic Research Lab, in the interview. “More people in Africa are going online on social media,” he continued. “At the same time, Africa is one of the poorer continents, which creates an easy recipe for countries like Russia to step in and pay someone to sit behind a computer all day.” These organisations go to great lengths to set up ‘cut-outs’, systems that pay trolls without going through a bank that would get them noticed. Experts at covering their tracks, they usually distribute the money via a third party on the ground.
With all this being said, professional trolling is seen as a double-edged sword: terrifying for some, favourable to others. While a good number of critics perceive it as a practice that puts “the integrity of the internet at stake,” some companies praise it for bringing revenue-producing traffic to their websites. For these organisations, all publicity is good publicity.
“The best way to slow down professional trolls is to make it more expensive for them to carry out disinformation campaigns,” le Roux added. A Twitter campaign periodically reminds followers to ignore anonymous comments on social media platforms. “You wouldn’t listen to someone named Bonerman26 in real life. Don’t read the comments,” a viral tweet reads.
While transparent comment systems and forums could help regulate the practice, the veil of anonymity given to users can’t be completely removed from such platforms, breeding in turn a space where professional trolls influence users while flying under the radar. Whether or not you choose to read or engage with the comments of these payroll-backed trolls, it’s always a good idea to treat them with scepticism. After all, as le Roux mentioned, trolling requires very few technical skills to carry out, leaving almost every emerging economy susceptible to the practice.