On 1 July 2016, British entrepreneur Tim Stokely launched OnlyFans, a subscription-based platform that allows creators to charge fans for the content they share. Fast-forward to 2021, and OnlyFans has been dubbed “the hottest social media platform in the world,” with over 1 million creators and 100 million users. While sign-ups from the likes of DJ Khaled and Fat Joe cemented OnlyFans’ growth, a recent BBC investigation revealed the company’s failure to keep a particular demographic off its platform: underage users, with children as young as 13 setting up accounts using the identity documents of older relatives.
As part of the investigation, BBC News spoke to several child protection experts and police forces across the UK and US, and obtained anonymised extracts from child counsellors at various schools. The BBC also set up an underage account using a 26-year-old’s identification to show how easily the platform’s age-verification process could be cheated. From ‘co-authoring’ explicit material with older creators to spotting missing children in videos, the investigation surfaced a number of shocking insights into underage experiences on OnlyFans.
According to Hertfordshire Police, a 14-year-old girl managed to trick OnlyFans’ age-verification system by using her grandmother’s passport and bank details. She then transferred the money made from selling a plethora of explicit images from her grandmother’s account into her own.
Another case reported by the BBC followed Leah, a 17-year-old girl in England, who was able to set up an underage account and sell explicit videos using a fake driving licence. She told her mother, Caitlyn, that she had originally intended to post pictures of her feet on OnlyFans after making money selling them on Snapchat. However, she quickly escalated to sharing explicit videos on the platform, which raked in as much as £5,000 in under a week. The 17-year-old reportedly spent the money on presents for her boyfriend, including more than £1,000 splurged on designer clothes. The BBC also found comments like “beautiful” and “sexy” under tweets advertising her OnlyFans account. These tweets sometimes included teaser videos, with some users even asking the underage creator to meet up offline.
Leah’s underage account was reported to OnlyFans by an anonymous user in January 2021. This led to a moderator reviewing the account and double-checking her identification document (ID). According to the company, her ID appeared legitimate and no further action was taken. In a later statement to the BBC, OnlyFans described Leah’s ability to access the platform as an “oversight” that evaded a red flag. The company added that her account was approved during “a transition from one effective ID and age verification system to a new exceptionally effective one.” OnlyFans also highlighted that it checks social media when verifying accounts, even though websites are under no particular obligation to investigate. Leah’s mother, however, stated that her daughter’s age was listed on every social media account she had.
Although Leah stopped posting on OnlyFans, her account remained active on the platform for four months after the initial report in January. After being contacted by BBC News, OnlyFans shut down Leah’s page. The platform has also refunded all active subscriptions to her account. But images she previously shared have reportedly been leaked on the internet, leaving the 17-year-old anxious about leaving the house for fear of being recognised.
According to her mother, Leah has had “big issues growing up and missed a lot of education.” The girl also once had explicit images of her shared around school without her consent. The present situation has compounded those harrowing experiences, and Leah has put her plans for college on hold altogether. “She won’t go out at all,” her mother said. “She doesn’t want to be seen.”
In terms of ‘co-authored’ content, OnlyFans requires creators to submit documentation proving that all contributors are over 18 years of age. All contributors must also be registered creators on the platform. The BBC investigation, however, unveiled a shocking case in which an underage user was repeatedly cast in explicit videos on an account run by an older creator.
Aaron, a teenager based in Nevada, was 17 when he started making videos on the platform with his older girlfriend. According to his friend Jordan, Aaron didn’t have an independent account on the platform but “got sucked into” appearing in explicit videos posted by his girlfriend, Cody. The duo made as much as $5,000 for a single video, which they split between them.
“Aaron was elated that they were making such an amazing amount of money for just having sex on camera for other people to watch,” said a woman who has known the 17-year-old for many years. She added that Aaron had had a tough childhood and was “very vulnerable to exploitation.” The teen also encouraged an underage Jordan to make videos on OnlyFans. “He used to say: ‘Bro, you can do it. We make so much money a week. It’s easy, you don’t have to work ever’,” Jordan admitted. Although his girlfriend’s account was reported to the police in October 2020, it wasn’t removed until the BBC contacted OnlyFans about the case in May 2021. Aaron, who is now 18, has broken up with Cody and is currently planning to start his own OnlyFans account.
As part of the investigation, BBC News also approached police forces in the UK and US about complaints they had received involving children appearing on the platform. The complaints lodged included a claim of revenge porn involving a 17-year-old in South Wales, who was blackmailed into continuing her OnlyFans account under the threat of having photographs from the platform shared with her family. Three other children complained that their images had been uploaded to the platform without their consent, while another 17-year-old claimed her face had been edited onto someone else’s body.
Then there is the case of missing children being linked to videos on the platform. “In 2019 there were around a dozen children known to be missing being linked with content on OnlyFans,” said Staca Shehan, the Vice President of the National Center for Missing and Exploited Children (NCMEC). In her statement to the BBC, Shehan explained that these numbers have nearly tripled over the last year. While much of the explicit content was self-generated by the children themselves, NCMEC also reported finding evidence of sexual exploitation and child trafficking on the platform. The BBC further noted that a couple based in Florida were charged with human trafficking after selling a topless photograph of a missing 16-year-old girl on OnlyFans.
In response to the findings of the investigation, OnlyFans said that it would work with online exploitation agencies like NCMEC to raise any potential issues with the relevant authorities. The platform also plans to take swift action and disable accounts when notified. It further claimed to have since updated its age-verification system to reduce the chances of such cases happening again.
BBC News, however, tested the platform’s “new exceptionally effective system” in April 2021. The updated age-verification system requires applicants to upload a picture of themselves holding their ID card next to their face. While a fake ID did not work, the BBC investigators were able to set up an account for an underage creator using her 26-year-old sister’s passport. OnlyFans’ verification system, in other words, failed to distinguish between the sisters despite the age gap.
After setting up an account, applicants must provide bank details to receive payment through OnlyFans. This step, however, is not a deterrent to posting videos and images on the platform. The investigation further revealed how creators can monetise their content by arranging payments through third-party apps. One of the most popular alternatives found was Cash App, with scores of accounts advertising the payment method on the platform, all in violation of the company’s guidelines, which prohibit any mention of Cash App and its variants.
On 12 May 2021, the UK government published the Online Safety Bill, which aims to moderate explicit content online and protect children from being exposed to it. The bill is set to introduce an age-verification process for accessing porn in the UK and to impose fines of up to £18 million or 10 per cent of a company’s global turnover on platforms that fail to keep children safe. While concerns have been raised about how long the bill will take to come into force, some wonder whether the law would suffice as a deterrent for wealthy tech companies.
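To put that penalty in perspective, a minimal sketch (assuming, as is typical of such legislation, that the applicable cap is whichever of the two figures is greater) shows why the turnover-based limb is the one large platforms would actually feel:

```python
def max_fine(global_turnover_gbp: float) -> float:
    """Maximum fine under the bill: the greater of a flat £18 million
    or 10% of the company's global annual turnover (assumed reading)."""
    return max(18_000_000, 0.10 * global_turnover_gbp)

# A small firm turning over £50m hits the flat £18m cap;
# a giant turning over £80bn faces up to £8bn.
print(max_fine(50_000_000))      # 18000000
print(max_fine(80_000_000_000))  # 8000000000.0
```

For a company of Facebook's scale, the 10 per cent limb dwarfs the flat cap by orders of magnitude, which is precisely the deterrent question critics are raising.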
As critics argue that the government should have acted sooner, child safety experts are increasingly voicing concerns over the mental health risks children expose themselves to by appearing on such platforms. According to notes shared with the BBC by a Childline counsellor, underage users and creators on the platform include children with traumatic experiences of prior sexual abuse, mental health issues and suicidal thoughts. Although the majority of the accounts are self-initiated, the long-term remedy lies in stronger moderation, reducing the chances of underage users “accidentally stumbling upon” such explicit content.
Innovations like OnlyFans may have “changed internet culture and social behaviour forever,” but they have also blurred the line between influencer culture and sexualised content on social media platforms. And while OnlyFans has had a tremendous impact on many lives over the pandemic, its impact on such young, impressionable users may prove far more lasting.
On 16 March 2021, Instagram announced a major security update aimed at making the platform safer for younger audiences. The measure introduced restrictions on Direct Messages (DMs) between teens and adults they don’t follow, regular prompts urging teens to be cautious in online interactions, and encouragement to make their accounts private. Barely two days later, BuzzFeed News obtained an internal Facebook post confirming the company’s plans to launch a separate version of Instagram altogether for children under the age of 13. What could possibly go wrong, right?
“I’m excited to announce that going forward, we have identified youth work as a priority for Instagram and have added it to our H1 priority list,” Vishal Shah, Instagram’s vice president of product, wrote on the employee board. The message outlined two ‘youth pillars’ the platform would be focusing on: “accelerating integrity and privacy work to ensure the safest possible experience for teens” along with “building a version of Instagram that allows people under the age of 13 to safely use the platform for the first time.”
“Kids are increasingly asking their parents if they can join apps that help them keep up with their friends,” Joe Osborne, a Facebook spokesman, said in a supporting statement. Osborne further highlighted the absence of child-friendly social networking apps in the current market, hence the company’s work on additional products like Messenger Kids to fill the gap.
“We’re exploring bringing a parent-controlled experience to Instagram to help kids keep up with their friends, discover new hobbies and interests and more,” he added. Given the app’s popularity among teens, Instagram ultimately seeks to tap into children under the age of 13 as a viable growth segment. Shifting this potential segment onto a separate platform not only helps Instagram regulate its main app but also expands its user base and ‘future-proofs’ the app’s demand in the lifestyle of the next generation.
However, child safety experts and health advocates were quick to jump onto the scene, digging up the demographic’s brushes with predators on the social media platform and urging Facebook to scrap all plans for ‘Kidstagram’.
In a letter coordinated by the non-profit youth advocacy group Campaign for a Commercial-Free Childhood, more than 20 groups and dozens of individual researchers labelled ‘Instagram for kids’ a tool that will “put young users at great risks.” Citing a “growing body of research” demonstrating the negative effects of social media on the youth, the letter implored Mark Zuckerberg to scrap the project.
“Instagram, in particular, exploits young people’s fear of missing out and desire for peer approval to encourage children and teens to constantly check their devices and share photos with their followers,” the letter read, adding how the platform’s “relentless focus on appearance, self-presentation, and branding presents challenges to adolescents’ privacy and wellbeing.”
The letter further highlighted how the effects of Instagram, already shown to be negative for teens, will be even graver for those under the age of 13. “Young children are highly persuadable by algorithmic prediction of what they might click on next, and we are very concerned about how automated decision making would determine what children see and experience on a kids’ Instagram platform,” the letter said.
Although the groups agree that “something needs to be done to protect the millions of children who have lied about their age to create Instagram accounts,” they outlined how “launching a version of Instagram for children under 13 is not the right remedy.” The letter thereby urged the tech giant to abandon its plans that are still “in the early planning stages.”
“Doing so would send a strong message that Facebook understands the vast concerns about the effects its business model is having on young people and is open to solutions that truly benefit children and teens—not just Facebook’s market share,” the letter concluded.
Facebook is yet to comment on the letter, but at a hearing on the company’s antitrust concerns earlier this year, Zuckerberg shrugged off criticism of the plans, stating that “there is clearly a large number of people under the age of 13 who would want to use a service like Instagram” to “stay connected with friends.”
Let’s be honest here, the backlash that ‘Instagram for kids’ is getting isn’t surprising, especially given the case studies of Messenger Kids and YouTube Kids.
The tech giant’s previous attempt to dip its toes into this coveted market segment, Messenger Kids in 2017, was quick to run into problems. Two years after its launch, Facebook uncovered a major design flaw that made it possible for kids to enter group chats with strangers without their parents’ authorisation. In the following weeks, Facebook quietly shut down those group chats and alerted users, without making any public statement disclosing the issue.
YouTube is yet another platform that has run into trouble after launching its child-friendly alternative. Launched in 2015, YouTube Kids has had to crack down on inappropriate videos being shown to its users. Earlier this month, the House Subcommittee on Economic and Consumer Policy hammered the service for its low-quality content, high degree of product placement and insufficient content moderation. Just last week, Viacom, Disney and ten advertising technology firms settled a lawsuit accusing them of embedding tracking software in child-focused apps without parental consent.
While Adam Mosseri, the head of Instagram, swears by the upcoming platform’s transparency and control features, stating that it will carry no ads at all, a plethora of researchers accuse the tech giant of attempting to “normalise the idea that social connections exist to be monetised.”
“From a privacy perspective, you’re just legitimising children’s interactions being monetised in the same way that all of the adults using these platforms are,” said Priya Kumar, a PhD candidate at the University of Maryland. In an interview with BuzzFeed News, Kumar mentioned how a lot of the children using YouTube Kids often end up migrating to the main platform either by choice or by accident—a bane for parents but a boon for companies. “Just because you have a platform for kids, it doesn’t mean the kids are going to stay there,” she added.
Although Messenger Kids and YouTube Kids have raised concerning issues, some independent entertainment companies have successfully tapped into this coveted market segment. Moonbug, for example, is a media network delivering developmentally appropriate and accessible content to children all over the world. Using the data it collects to “understand the needs, wants and current trends” of its young audience, the on-demand video platform distributes fun and safe content via its kid-friendly app.
While Instagram is constantly introducing new security and anti-bullying tools, they may be far from solving the problem altogether. After all, if kids under 13 can lie about their age on Instagram, what stops adults from lying about theirs on such ‘under-13’ platforms? Maybe ‘Kidstagram’ should remain a hashtag for the time being.