Less than two years after Facebook hired Frances Haugen, the data scientist and product manager decided she had seen enough. Testifying before a Senate panel on 5 October 2021, Haugen laid out a string of concerning insights, rivalled only by the company's resistance to change.
Chief among her concerns about Facebook "putting profits ahead of people's safety" was the mental health impact of Instagram on its youngest users. Internal research showing that the platform exacerbates body-image issues in teenage girls and that its engagement-first algorithm amplifies harmful content made the case for change. A recent announcement by Nick Clegg, Facebook's vice president of global affairs, seems to be a step in that direction.
“We’re going to introduce something which I think will make a considerable difference,” Clegg told CNN’s State of the Union, less than a week after Haugen’s testimony. “Which is where our systems see that the teenager is looking at the same content over and over again and it’s content which may not be conducive to their well-being, we will nudge them to look at other content.” With a new mantra of “more friends, less politics,” the social media giant ultimately plans to reduce the amount of political content on feeds, instead focusing on content from friends on the platform. “We’re going to introduce new controls for adults of teens on an optional basis, obviously, so adults can supervise what their teens are doing online,” Clegg said, expressing an openness to the idea of letting regulators have access to Facebook algorithms that are used to amplify content.
In addition to pausing its controversial plans to build an Instagram for kids and working on a set of 'opt-in' parental supervision tools for teenagers, Clegg also unveiled a feature called 'Take a Break', where the platform "will be prompting teenagers to simply take a break from using Instagram."
Although Clegg didn't provide an exact timeline for either feature, a Facebook spokesperson said the company is "not testing yet but will soon." In response to an email from The Verge seeking further details, the spokesperson pointed to a blog post by Instagram head Adam Mosseri which mentioned that the company was exploring the tools. "Recent reporting from the Wall Street Journal (WSJ) on our research into teens' experiences on Instagram has raised a lot of questions for people," the blog post by Mosseri read. "To be clear, I don't agree with how the Journal has reported on our research." Addressing the claims that its algorithm promotes negative body images among teenage girls, he mentioned both the 'nudge' and 'Take a Break' as new features in the platform's push for youth safety.
However, as Mosseri himself puts it, the former "encourages people to look at other topics if they're dwelling on content that might contribute to negative social comparison," while the latter allows users to "put their account on pause and take a moment to consider whether the time they're spending is meaningful." In short, breaks and nudges are being introduced to help users manage their own exposure to harmful content on the platform, but the company is not planning to remove such content in the first place. Facebook is once again inviting regulation, but only the kind it's comfortable with.
With China limiting its youth's online presence through restrictions on gaming and TikTok, several US lawmakers have argued for stronger regulation of tech giants like Facebook. "I'm just tired of hearing 'trust us', and it's time to protect those moms and dads that have been struggling with their kids getting addicted to the platform and being exposed to all kinds of bad stuff," Senator Amy Klobuchar told CNN after Clegg's interview.
According to her, the US needs a new privacy policy under which people must opt in before their online data can be shared. The country should also update children's privacy laws and its competition policy, and require tech companies to make their algorithms more transparent. None of these recommendations is new, which makes it all the more concerning that they are still not a given for social media companies and the laws governing them in 2021.
Now that the world beyond Facebook knows about its capitalist priorities and misconduct, the platform is on a deadline to deliver on its promise of a safe online space for generations to come. User-initiated solutions may be a step along the way, but they do little to nothing in this broader quest. Time is ticking, Zuck. It's only a matter of time before everyone unfriends your giant social network.