Instagram is expanding its policies banning images of self-harm and suicide. The Facebook-owned photo-sharing app will no longer allow fictional depictions of self-harm or suicide, including drawings, memes, and graphic images from films or comics.
In a blog post on Sunday, Instagram chief executive Adam Mosseri said the app aims to “achieve the difficult balance between allowing people to share their mental health experiences and at the same time protect others from exposure to potentially harmful content.”
Earlier this year, Instagram banned all graphic images of self-harm and said it would prevent even non-graphic content, such as images of healed scars, from appearing in searches, hashtags, and the Explore tab. After the change, Instagram “removed, reduced visibility or added sensitivity screens” to more than 834,000 pieces of content, the company said.
The new policies expand the measures Mosseri outlined in an op-ed in the Daily Telegraph in February, in which he wrote that Instagram would do more to protect vulnerable users from viewing content that promotes suicide and self-harm. The article mentioned the death of British teenager Molly Russell, who took her own life in 2017. Russell had used Instagram to engage with and post content about depression and suicide, which led her family to blame the social network for her death.
Instagram can now also remove images that “include associated materials or methods” related to self-harm and suicide, and accounts that share this type of content will not be recommended in search or other parts of the app, the company said.