Instagram and other social media platforms have long struggled with the rise of communities encouraging self-harm and suicide. Following the death of 14-year-old British teenager Molly Russell in 2017, the social media giant, under its new CEO Adam Mosseri, announced this week that it will hide images of self-harm under its ‘Sensitivity Screens’ feature.
There have been a number of deaths and incidents of self-harm in recent years linked to the spread of social media groups influencing teens to harm themselves. The most infamous are groups promoting and encouraging members to sustain harmful, and sometimes deadly, eating disorders. But after Russell’s parents reported that they believe she took her own life following her exposure to disturbing posts on Instagram and Pinterest, social media platforms were forced to confront the dark and sometimes harmful turn their platforms have taken.
“I have been deeply moved by the tragic stories that have come to light this past month of … families affected by suicide and self-harm,” Mosseri wrote in an op-ed published by the Telegraph on Monday, February 4. “We are not yet where we need to be on the issues of suicide and self-harm. We need to do everything we can to keep the most vulnerable people who use our platform safe,” he added.
Under the new Sensitivity Screens feature, content that clearly promotes self-harm, such as images of cuts or other forms of self-abuse, will be removed from the platform immediately. This process is currently heavily reliant on community flagging, so the more vigilant and active we become as users, the safer the Instagram terrain will become, in theory at least. Of course, AI is also used to detect this type of content, but we are all well aware of the shortcomings of AI image recognition; recent removals of what AI flagged as ‘pornographic’ images come to mind. Suffice it to say, it is not all that accurate or reliable.
Instagram regards itself as a “supportive environment”, says Mosseri: a platform for raising awareness among people facing difficult issues. With that, some aspects of the new move have been criticised by those who see the platform as a place to freely express their struggles and find like-minded comradeship. In response, Mosseri said that Instagram will still allow users to “share that they are struggling,” so photos and videos that don’t promote self-harm but talk about it and offer support will remain on the site, though they won’t appear in search, hashtags, or the Explore tab.
The company also employs trained content reviewers who work to make self-harm images more difficult to find, and is working on new ways to support people in need. “This is a difficult but important balance to get right,” Mosseri wrote. “These issues will take time, but it’s critical we take big steps forward now.”
It seems that Mosseri is on the right track when it comes to the difficult task of making the platform safer for vulnerable users, yet a side-effect of Sensitivity Screens will undoubtedly be that accounts offering positive support to those suffering from mental ill-health become harder to find. With that, it’s crucial that in its chase to make the platform ‘safe’, Instagram (and other social media platforms) tread carefully around these accounts and nourish the channels, users and communities that do support those looking for a place to share, talk and help one another through virtual aid.