Instagram expands its ban on self-harm to cartoons, illustrations, and memes – Screen Shot


A little over six months have passed since Instagram and fellow social media platforms implemented the Sensitivity Screens feature in an attempt to censor images and videos depicting and promoting self-harm. While the initiative initially covered only photographic and video material, it has now been extended to censor drawings, illustrations, cartoons, and memes depicting or describing methods of self-harm and suicide.

The announcement followed the ongoing call for action by Ian Russell, father of 14-year-old Molly Russell—the British teenager who took her own life in 2017 and whose Instagram and Pinterest accounts contained graphic material featuring self-harm. Throughout his campaign, Ian Russell has openly linked his daughter’s tragic death to Instagram’s lack of regulation of harmful content. He has stated on several occasions that Instagram played an evident role in Molly’s spiralling mental health by letting her enter an echo chamber of hashtags, images, and groups that encouraged self-harm and linked it to suicide.

By now, we all know how the algorithm works: one post about depression might be only a few hashtags away from others about suicide. As we naively scroll down our feeds, it can be easier than expected to spiral into a hole of graphic imagery of cutting and bruising. This normalisation of self-harm could easily trigger a particularly vulnerable person. According to Russell, who has researched his daughter’s digital presence in search of answers, Pinterest was sending Molly automated emails featuring disturbing material. To Russell, this proves that social media platforms were not doing enough to block the spread and sharing of disturbing content.

In an attempt to stem the proliferation of toxic content and the unforeseen consequences that images of self-harm are having on users, Adam Mosseri, head of Instagram, is stepping up the company’s effort to tackle the issue by expanding the existing ban to drawings and illustrations as well. Earlier this year, Instagram changed its policies and invested in new technologies that allowed the platform to remove twice as much content, with 77 per cent of it taken down before even being reported. “To help us stay aware of new trends or cultural nuances, we meet every month with academics and experts on suicide and self-harm. (…) In the UK, we are working with the Samaritans on an industry-wide effort to shape new guidelines to help people in distress. Outside of Europe, we also have additional technology that helps us proactively find people who might be in need. We want to bring this to Europe but there are important legal considerations under EU law, so we’re working with our European regulator,” Mosseri said in an official statement.

But the ban also comes with some side effects. The Sensitivity Screen feature has also affected accounts promoting mental health awareness and fighting the stigma surrounding the issue. Many profiles that use illustrations to educate and spread information on mental health ended up having their posts banned under the new regulations. As necessary as it is to prevent people from being exposed to harmful content, it is equally fundamental to let users share their experiences and have a medium through which to reach out for help.

Once again, the factors responsible for mental health issues and suicides among teenagers are countless, and solely blaming Instagram would be a feeble attempt at simplifying an incredibly complex issue. That said, social media platforms do have a responsibility when it comes to self-regulation, and the work of people such as Molly’s father shows how crucial it is to push past social networks’ agendas and force platforms to regulate themselves more carefully. But Adam Mosseri and his team are not the only ones obliged to make Instagram a safe place—users must scrutinise the platforms they use as well, and report harmful content as soon as they come across it.

Instagram is thinking of removing the ‘like’ feature, what does this mean?

Like most people, I check Instagram before going to sleep and again as soon as I wake up. Posting on the platform wouldn’t be that big of a deal if likes weren’t such a big part of the process. Likes take hold of us the moment we press the ‘post’ button—after the long ritual of picking a good picture, filtering it, and so on. What would happen if this social media standard of measurement were taken out of the equation?

Last week, Head of Instagram and former Facebook executive Adam Mosseri announced that the company would be running tests in Canada on a new version of the app in which users could still like posts, but only the owner of a post would be able to see how many likes it received. It looks like the company wants the platform to go back to its roots—focusing on the content we share instead of the number of likes we receive. As nice as this sounds coming from a social media company, it also seems too good to be true.

With apps like Instagram, Twitter, and Facebook, among others, likes do more than feed our constant attention-seeking and our obsession with comparison. Likes help the algorithms that effectively control those platforms decide which content to show first, or which ads a user is most likely to click on. That kind of data is not easy to let go of. Even though likes are not set to be removed entirely, just hidden from other users, this new way of consuming social media content is bound to affect the way we show our appreciation for certain posts.

Social media fosters a herd mentality: when a picture that already has a lot of likes shows up on your timeline, you’re more inclined to double-tap it than one that has few or none. This not only reinforces the problem of how we seek validation online but also affects our mental health. Even Kanye West said as much last year in one of his Twitter rants—social networks are damaging people’s mental health, and we should be shielded from knowing how many likes and followers we have.

For some of Instagram’s younger users, the pressure to post often and to like their friends’ photos quickly is part of growing up with the technology. Millennials’ social status is based on how many likes, comments, and followers they have. Changing this could be a first step towards a ‘digital detox’, although comments could simply become the new likes.

This test could raise concern among celebrities and influencers, who have monetised their popularity through sponsored posts, other types of ads and, obviously, likes. Hiding likes would make it harder for them to ‘go viral’ and to gauge how much engagement a post receives. For now, Instagram would only benefit from making it harder for businesses and influencers to thrive on its platform, because people would praise it for trying to make the app a safer environment.

What about in the long run? If users can’t tell how influential you are because your like count is hidden, then advertisers and influencers will probably just find or create another platform where more money can be made by perpetuating this herd mentality.

Our relationship with social media, and as a result with likes, has slowly turned into something bordering on unhealthy. Even though this possible new version might not be as dramatic as it sounds, it could still change a few things—for the app and for our mental health. We could go back to posting pictures simply to share them with our friends and families (and, for celebrities and influencers, with fans) for the fun of it. Today, social media is more about winning at life—let’s make it enjoyable again.