By Molly Patrick
Adam Mosseri, the head of Instagram, has announced a ban on explicit self-harm imagery as part of a series of policy changes to tackle content deemed harmful to users’ mental health. A long overdue acknowledgment of the company’s insufficient regulations, the ban follows public concern about how easily the site permits access to graphic content, and how it even promotes such images through account recommendations and the Explore page. Recently, the father of Molly Russell, a 14-year-old who died in 2017, told The Daily Telegraph that Instagram “helped kill my daughter” after the family found material relating to self-harm and depression on her account. Unregulated social media sites pose a serious threat to users’ mental health, and Instagram’s announcement confirms that these companies bear responsibility for monitoring and censoring content.
One Cardiff University student told me that Instagram had a profoundly harmful effect on her mental health. She recalled that Instagram’s algorithm, which recommends content based on a user’s previous searches, damaged her self-esteem: any self-harm images she viewed would resurface alongside other suggestions on the Explore page. She described how hard it then became to disengage from harmful content. Instagram’s announcement that it will remove self-harm imagery from Explore pages and suggested posts is therefore a welcome change.
However, while Instagram’s policy changes address graphic self-harm imagery, they do not acknowledge that other, more suggestive content also harms users. Last year, Kim Kardashian posted a seductive photograph of herself eating an ‘appetite suppressant’ lollipop. To Kardashian’s 127 million followers, many of them young girls, advertisements such as these undoubtedly encourage disordered eating. Celebrity influencers should be held accountable for their activity on social media, and Instagram should do more to limit harmful posts.
It is easy to be sceptical about Instagram’s ban on explicit self-harm imagery. In 2012, Tumblr banned self-harm blogs, yet a quick search still turns up endless graphic images. This is because Tumblr’s policy only restricts content that promotes self-harm: pages dedicated to spreading awareness and encouraging recovery are permitted. On visual content sites such as Instagram and Tumblr, the image takes precedence over text. This means that self-harm is aestheticised and implicitly encouraged, even if the account declares itself to be advocating recovery. Many people struggling with their mental health find recovery accounts to be supportive resources through which they can communicate with others experiencing similar issues. However, the content of these pages needs to be subject to the same censorship as other accounts if Instagram’s policy changes are to be effective.
Instagram’s recent policy changes, then, seem timid in light of the reforms that will be needed as the seismic shift in how people consume media continues.