Facebook detailed its efforts to fight self-harm on Tuesday, marking World Suicide Prevention Day with announcements of expanded suicide prevention measures.
Earlier this year, Facebook started limiting graphic content that could trigger users, following regular consultations with experts on suicide and self-injury.
Building on that, Facebook said Tuesday it’s hiring a health and well-being expert to join the company's safety policy team. The new expert will focus on the impacts of Facebook’s apps and policies, as well as explore ways to improve support in the community, Facebook said.
Facebook is also exploring ways to share with researchers public data from its platform on how people talk about suicide.
To limit exposure to potentially triggering content, Facebook no longer allows graphic images of cutting or other self-harm, even when posted by someone seeking support. After such content is removed, Facebook said, it sends support resources to the user who posted it.
The company said it has also made such content harder to search for on Instagram and now displays a “sensitivity screen” over images of healed self-harm cuts to avoid unintentionally promoting self-harm.
Similarly, Facebook said it's “taken steps to address the complex issue of eating disorder content on our apps” by tightening its policy to prohibit content that “may promote eating disorders.”