Instagram on Sunday announced it was expanding its ban on suicide-related content to include fictional depictions of suicide and self-harm.
"We will no longer allow fictional depictions of self-harm or suicide on Instagram, such as drawings or memes or content from films or comics that use graphic imagery," Instagram head Adam Mosseri said in a blog post.
"We will also remove other imagery that may not show self-harm or suicide, but does include associated materials or methods."
Accounts sharing graphic content will also no longer be recommended in search or the Explore tab. In addition, the platform will boost the visibility of helplines like the National Suicide Prevention Lifeline and The Trevor Project.
Instagram launched its policy on graphic images of self-harm in February, following public outcry in the United Kingdom over the death of British teenager Molly Russell, who killed herself after viewing graphic content on the platform.
Her father, Ian Russell, has called out Instagram repeatedly for not doing enough to keep teens from viewing self-harm-related content.
Instagram has removed, reduced the visibility of, or added sensitivity screens to more than 834,000 pieces of graphic content since February, Mosseri said.
People experiencing suicidal thoughts or urges can call the National Suicide Prevention Lifeline at 1-800-273-8255 or visit SpeakingOfSuicide.com/resources for additional resources.