On 16 March 2021, Instagram announced a major security update aimed at making the platform safer for younger audiences. The measure introduced restrictions on Direct Messages (DMs) between teens and adults they don't follow, recurring prompts urging teens to be cautious in their online interactions, and encouragement to make their accounts private. Barely two days later, BuzzFeed News obtained an internal Facebook post confirming the company's plans to launch a separate version of Instagram altogether for children under the age of 13. What could possibly go wrong, right?
“I’m excited to announce that going forward, we have identified youth work as a priority for Instagram and have added it to our H1 priority list,” Vishal Shah, Instagram’s vice president of product, wrote on the employee board. The message outlined two ‘youth pillars’ the platform would be focusing on: “accelerating integrity and privacy work to ensure the safest possible experience for teens” along with “building a version of Instagram that allows people under the age of 13 to safely use the platform for the first time.”
“Kids are increasingly asking their parents if they can join apps that help them keep up with their friends,” Joe Osborne, a Facebook spokesman, said in a supporting statement. Osborne further highlighted the absence of child-friendly social networking apps in the current market, adding that the company is working on additional products, like Messenger Kids, to fill the gap.
“We’re exploring bringing a parent-controlled experience to Instagram to help kids keep up with their friends, discover new hobbies and interests and more,” he added. Given the app’s popularity among teens, Instagram ultimately seeks to tap into children under the age of 13 as a viable growth segment. Shifting this potential segment onto a separate platform not only helps Instagram regulate underage use of the main app but also expands its user base and ‘future-proofs’ the app’s place in the lifestyle of the next generation.
However, child safety experts and health advocates were quick to respond, digging up the demographic’s brushes with predators on the platform and urging Facebook to scrap all plans for ‘Kidstagram’.
In a letter coordinated by the non-profit youth advocacy group Campaign for a Commercial-Free Childhood, more than 20 groups and dozens of individual researchers labelled ‘Instagram for kids’ a tool that will “put young users at great risks.” Citing a “growing body of research” demonstrating the negative effects of social media on the youth, the letter implored Mark Zuckerberg to scrap the project.
“Instagram, in particular, exploits young people’s fear of missing out and desire for peer approval to encourage children and teens to constantly check their devices and share photos with their followers,” the letter read, adding how the platform’s “relentless focus on appearance, self-presentation, and branding presents challenges to adolescents’ privacy and wellbeing.”
The letter further highlighted how the effects of Instagram—while proven to be negative on teens—will be even more grave for those under the age of 13. “Young children are highly persuadable by algorithmic prediction of what they might click on next, and we are very concerned about how automated decision making would determine what children see and experience on a kids’ Instagram platform,” the letter said.
Although the groups agree that “something needs to be done to protect the millions of children who have lied about their age to create Instagram accounts,” they outlined how “launching a version of Instagram for children under 13 is not the right remedy.” The letter therefore urged the tech giant to abandon its plans while they are still “in the early planning stages.”
“Doing so would send a strong message that Facebook understands the vast concerns about the effects its business model is having on young people and is open to solutions that truly benefit children and teens—not just Facebook’s market share,” the letter concluded.
Although Facebook has yet to comment on the letter, at a hearing on Facebook’s antitrust concerns earlier this year Zuckerberg shrugged off criticism of the plan, stating that “there is clearly a large number of people under the age of 13 who would want to use a service like Instagram” to “stay connected with friends.”
Let’s be honest here, the backlash that ‘Instagram for kids’ is getting isn’t surprising, especially given the case studies of Messenger Kids and YouTube Kids.
The tech giant’s previous attempt to dip its toes into the coveted market segment, Messenger Kids in 2017, was quick to run into problems. Two years after its launch, Facebook uncovered a major design flaw that made it possible for kids to enter group chats with strangers without their parents’ authorisation. In the following weeks, Facebook quietly shut down those group chats and alerted users, without making any public statement disclosing the issue.
YouTube is yet another platform that has run into trouble after launching its child-friendly alternative. Launched in 2015, YouTube Kids has had to crack down on inappropriate videos being shown to its users. Earlier this month, the House Subcommittee on Economic and Consumer Policy hammered the service for its low-quality content, high degree of product placement, and insufficient content moderation. Just last week, Viacom, Disney, and 10 advertising technology firms settled a lawsuit accusing them of deploying tracking software in children-focused apps without parental consent.
While Adam Mosseri, the head of Instagram, swears by the upcoming platform’s transparency and control features, pointing to the absence of ads altogether, a plethora of researchers accuse the tech giant of attempting to “normalise the idea that social connections exist to be monetised.”
“From a privacy perspective, you’re just legitimising children’s interactions being monetised in the same way that all of the adults using these platforms are,” said Priya Kumar, a PhD candidate at the University of Maryland. In an interview with BuzzFeed News, Kumar noted that many children using YouTube Kids end up migrating to the main platform either by choice or by accident—a bane for parents but a boon for companies. “Just because you have a platform for kids, it doesn’t mean the kids are going to stay there,” she added.
Although Messenger Kids and YouTube Kids have raised concerning issues, some independent entertainment companies have successfully tapped into the coveted market segment. Moonbug, for example, is a media network delivering developmentally appropriate and accessible content to children all over the world. Using the data it collects to “understand the needs, wants and current trends” of its young audience, the on-demand video platform distributes fun and safe content via its kid-friendly app.
While Instagram is constantly introducing new security and anti-bullying tools, they might just be far from solving the problem altogether. After all, if kids below 13 can lie about their age on Instagram, what stops adults from lying about their age on such ‘under-13’ platforms? Maybe ‘Kidstagram’ should remain a hashtag for the time being.
Forget street corners, dodgy cars, or the dark web. According to the trailblazing report DM for Details: Selling Drugs in the Age of Social Media published over the weekend by the think-tank Volteface, social media platforms are the new marketplace for selling and buying drugs, particularly among young people.
Drugs being sold on the internet is not news in itself, but the report reveals how far the phenomenon has come since the days when drugs were bought with Bitcoin on Silk Road. The proliferation of drug dealing accounts on the most popular platforms, such as Snapchat, Instagram, and Facebook, speaks of a grand-scale, hashtag-driven trend, one that regulators are evidently struggling to keep up with.
Through evidence-based policy and reform, Volteface aims to reduce the harm drugs pose to individuals and society. When the organisation began the study, it was not expecting the issue of online drug dealing to be this extensive. Scarlett Furlong, a policy advisor at Volteface and co-author of the report, told Screen Shot, “When we started this research, we weren’t really sure about how big of an issue this was, particularly in the U.K. context. When finding that 1 in 4 young people have seen drugs advertised for sale we realised the range was quite abnormal,” adding that, “We were meant to publish this report last February, but after seeing how relevant all data were, we realised it was necessary to publish all our findings.” Alongside interviews and focus groups with children, the police, and youth workers, the report’s results are mainly based on polls of 2,006 young people, aged 16 to 24 years old, as well as observational trawls of Facebook, Snapchat, and Instagram, where the researchers went undercover to observe how online drug dealing operates.
According to the report, it seems that social media is not only a place for customers to find and purchase drugs, but also an arena for targeting different demographics to become new customers, including new dealers. Young people who otherwise might have never ended up within an environment where drugs are sold and bought offline, could now find themselves just a few hashtags and scrolls away from online communities of drug dealers. One of the main concerns raised by the report is the role that social media platforms are playing in normalising drug use. During its study, Volteface interviewed an anonymous young person who admitted that, “it romanticises it. People think that nothing can go wrong. It really overshadows any drug education people had in the past, like in PSHE lessons or anything like that.”
Personal relationships between dealers and customers have always existed, but social media has narrowed the gap between the two parties, creating a sense of closeness before any transaction takes place. It creates the same feeling of familiarity we might have with strangers we follow on Instagram. As quoted in the report, potential buyers “stumble upon numerous dealers showing what young people perceive to be their ‘authentic’ self, or, as is the case with all social media, a side of themselves that the dealer wants the social media user to see. Drug dealers posting about going to college, talking about their family and going to comedy shows.” Dealers suddenly seem like everyone else, dispelling any connotations of dangerous and illicit activity that previously existed.
The design of our favourite online platforms, as well as their algorithms, plays a crucial role in the spread of online drug dealing. Via features such as the ‘suggested friends’ bar and by looking at other people’s ‘following’, ‘follower’ or ‘friends’ lists, young people can find dealers extremely easily. Compared to the dark web, for instance, social media platforms are more user-friendly and far less complicated to navigate, and features such as screenshots, hashtags, swipe-ups, and saved Instagram stories allow dealers to be as creative as they want when advertising their products.
Drug dealing has always had specific codes attached to it, including slang that develops over time. Within this context, emojis have become the ultimate alphabet for communicating prices and offers. With an entire visual vocabulary at their disposal, dealers have found the perfect way to describe their products clearly without using explicit descriptions or posting graphic images of actual drugs—although this happens quite frequently too. Eventually, like every digital phenomenon, drug dealing has picked up the dynamics intrinsic to social media platforms: dealers doing shout-outs to other dealers, calling out scammers, and even growing a considerable number of followers.
The report goes beyond revealing the shift from offline to online dealing; it also sheds light on current drug trends more broadly. “One of the things I was most surprised by was the prescription medications that were sold online. Xanax is the fourth most seen drug that young people have seen online,” Furlong said. The rise of prescription drugs and their success on social media is an indication of today’s tendencies in drug consumption: Xanax ranks fourth among the drugs most commonly seen on social media, just after weed, cocaine, and ecstasy—a finding that hasn’t yet been properly acknowledged by institutions, schools, and experts.
The relentless proliferation of online drug dealing and the explicitness with which dealers use social media platforms clearly speak of a delay in regulation and effective controls. As with most digital phenomena, regulators are struggling to keep up with the pace and the codes of online drug dealing. “We want to share all the information we have collected with legislators and companies alike to make sure we all work together to prevent this to grow exponentially,” Furlong adds.
According to Volteface, the government should introduce a regulatory requirement for social media companies to monitor activity on their platforms and to ensure that they are aware of how language, emojis, and design features are used to facilitate drug dealing. Moreover, the report recommends that this research be used to inform the existing algorithms that monitor and remove dealers’ accounts, and that “Snapchat, Facebook and Instagram should be included within the scope of the Government’s Online Harms regulatory framework”. Consistent with its long-held position, Volteface is confident that, even in this context, cannabis legalisation would be the most effective policy to alleviate the problems outlined in the study.
This phenomenon isn’t simply the migration of drug dealing from real life to the internet; it also seems to be a brand new way of dealing illegal substances altogether—a way of selling and purchasing drugs infused with the same social media dynamics young people relate to on a daily basis.