On Tuesday 27 July, Facebook reaffirmed its intention to build an Instagram for children under 13, despite pressure from lawmakers to back down on the plan. On the same day, the company also announced new updates to address concerns about the safety of young users on its platforms, particularly Instagram.
In a blog post, Facebook said it is developing “a new Instagram experience for tweens” managed by parents and guardians as part of its efforts to “reduce the incentive for people under the age of 13 to lie about their age.”
“The reality is that they’re already online, and with no foolproof way to stop people from misrepresenting their age, we want to build experiences designed specifically for them, managed by parents and guardians,” the post read.
In March, BuzzFeed News obtained an internal Instagram memo stating the company had “identified youth work as a priority” and was planning to build a version specifically intended for kids. In May, 44 attorneys general signed a letter addressed to Facebook CEO Mark Zuckerberg, urging him to scrap plans for an Instagram intended for younger users, citing mental health and privacy concerns. The letter came less than a month after child safety groups and Congress expressed similar concerns.
Facebook’s confirmation that it plans to press ahead with the development of an Instagram for kids, reportedly called Instagram Youth according to Bloomberg, was tucked into an announcement about new safety measures for the popular photo-sharing platform. These will include setting the accounts of users under the age of 16 to private by default in order to cut down on unwanted interactions with strangers.
The company is also introducing changes to how advertisers can target users under the age of 18. Previously, any user could be targeted based on their interests and activity: information that Facebook collects from across the web, not just its own properties, by analysing individuals’ web browsing history, app usage, and more. Now, advertisers will only be able to target under-18 users based on their age, gender, and location. This applies to users on Instagram, Messenger, and Facebook.
Facebook has long been criticised for how it enforces age restrictions across its platforms. Prior to 2019, it merely asked users to confirm they were over the age of 13, only later requiring a date of birth during registration. As you can imagine, at the time, most underage users simply lied.
In a conversation on the Breakfast Club radio show Tuesday 27 July, head of Instagram Adam Mosseri said he knew its Instagram for kids efforts would “get a lot of heat” but called it “the right thing to do, so we gotta do it.” Let’s wait and see, I guess.
On 16 March 2021, Instagram announced a major security update aiming to make the platform safer for younger audiences. The measure introduced restrictions on Direct Messages (DMs) between teens and adults they don’t follow, regular prompts urging caution in online interactions, and encouragement for teens to make their accounts private. Barely two days later, BuzzFeed News obtained an internal Facebook post confirming the company’s plans to launch a separate version of Instagram altogether for children under the age of 13. What could possibly go wrong, right?
“I’m excited to announce that going forward, we have identified youth work as a priority for Instagram and have added it to our H1 priority list,” Vishal Shah, Instagram’s vice president of product, wrote on the employee board. The message outlined two ‘youth pillars’ the platform would be focusing on: “accelerating integrity and privacy work to ensure the safest possible experience for teens” along with “building a version of Instagram that allows people under the age of 13 to safely use the platform for the first time.”
“Kids are increasingly asking their parents if they can join apps that help them keep up with their friends,” Joe Osborne, a Facebook spokesman, said in a supporting statement. Osborne further highlighted the absence of child-friendly social networking apps in the current market, hence the company’s work on additional products like Messenger Kids to fill the gap.
“We’re exploring bringing a parent-controlled experience to Instagram to help kids keep up with their friends, discover new hobbies and interests and more,” he added. Given the app’s popularity among teens, Instagram ultimately seeks to tap into an audience of children under the age of 13 as a viable growth segment. Shifting this potential segment onto a separate platform not only helps Instagram regulate underage use of the main app but also expands its user base and ‘future-proofs’ demand for the app among the next generation.
However, child safety experts and health advocates were quick to jump onto the scene, digging up the demographic’s brushes with predators on the social media platform and urging Facebook to scrap all plans for ‘Kidstagram’.
In a letter coordinated by the non-profit youth advocacy group Campaign for a Commercial-Free Childhood, more than 20 groups and dozens of individual researchers labelled ‘Instagram for kids’ a tool that will “put young users at great risks.” Citing a “growing body of research” demonstrating the negative effects of social media on the youth, the letter implored Mark Zuckerberg to scrap the project.
“Instagram, in particular, exploits young people’s fear of missing out and desire for peer approval to encourage children and teens to constantly check their devices and share photos with their followers,” the letter read, adding how the platform’s “relentless focus on appearance, self-presentation, and branding presents challenges to adolescents’ privacy and wellbeing.”
The letter further highlighted how the effects of Instagram—while proven to be negative on teens—will be even more grave for those under the age of 13. “Young children are highly persuadable by algorithmic prediction of what they might click on next, and we are very concerned about how automated decision making would determine what children see and experience on a kids’ Instagram platform,” the letter said.
Although the groups agree that “something needs to be done to protect the millions of children who have lied about their age to create Instagram accounts,” they outlined how “launching a version of Instagram for children under 13 is not the right remedy.” The letter therefore urged the tech giant to abandon plans that are still “in the early planning stages.”
“Doing so would send a strong message that Facebook understands the vast concerns about the effects its business model is having on young people and is open to solutions that truly benefit children and teens—not just Facebook’s market share,” the letter concluded.
Although Facebook has yet to comment on the letter, at a hearing on the company’s antitrust concerns earlier this year Zuckerberg shrugged off all criticism of the plan, stating that “there is clearly a large number of people under the age of 13 who would want to use a service like Instagram” to “stay connected with friends.”
Let’s be honest here, the backlash that ‘Instagram for kids’ is getting isn’t surprising, especially given the case studies of Messenger Kids and YouTube Kids.
The tech giant’s previous attempt to dip its toes into this coveted market segment, Messenger Kids in 2017, was quick to run into problems. Two years after its launch, Facebook uncovered a major design flaw that made it possible for kids to enter group chats with strangers without their parents’ authorisation. In the following weeks, Facebook quietly shut down those group chats and alerted users, without making any public statement disclosing the issue.
YouTube is yet another platform that has run into trouble after launching its child-friendly alternative. Launched in 2015, YouTube Kids has had to crack down on inappropriate videos being displayed to its users. Earlier this month, the House Subcommittee on Economic and Consumer Policy hammered the service for its low-quality content, high degree of product placement, and insufficient content moderation. Just last week, Viacom, Disney, and 10 advertising technology firms came to a settlement in a lawsuit accusing them of deploying tracking software in child-focused apps without parental consent.
While Adam Mosseri, the head of Instagram, swears by the platform’s upcoming transparency and control features, insisting the curated version will carry no ads at all, a plethora of researchers accuse the tech giant of attempting to “normalise the idea that social connections exist to be monetised.”
“From a privacy perspective, you’re just legitimising children’s interactions being monetised in the same way that all of the adults using these platforms are,” said Priya Kumar, a PhD candidate at the University of Maryland. In an interview with BuzzFeed News, Kumar mentioned how a lot of the children using YouTube Kids often end up migrating to the main platform either by choice or by accident—a bane for parents but a boon for companies. “Just because you have a platform for kids, it doesn’t mean the kids are going to stay there,” she added.
Although Messenger Kids and YouTube Kids have raised worrying issues, some independent entertainment companies have successfully tapped into this coveted market segment. Moonbug, for example, is a media network delivering developmentally appropriate and accessible content to children all over the world. Using the data it collects to “understand the needs, wants and current trends” of its young audience, the on-demand video platform distributes fun and safe content via its kid-friendly app.
While Instagram is constantly introducing new security and anti-bullying tools, they might just be far from solving the problem altogether. After all, if kids below 13 can lie about their age on Instagram, what stops adults from lying about their age on such ‘under-13’ platforms? Maybe ‘Kidstagram’ should remain a hashtag for the time being.