Facebook will begin banning white nationalist or white separatist content on its platform starting next week, the social media giant announced on Wednesday.
Facebook officials formally decided to ban "white nationalism" and "white separatism" on the platform at a content moderation meeting on Tuesday, according to Motherboard.
The significant policy shift comes a year after Motherboard, a tech news outlet, reported that Facebook's content moderation had allowed "white nationalism and separatism" on the platform, though it barred explicit "white supremacy."
Facebook, in training documents for moderators last year, wrote that white nationalism "doesn’t seem to be always associated with racism (at least not explicitly)," incurring immediate backlash from civil rights groups and experts.
Brian Fishman, Facebook's policy director of counterterrorism, told Motherboard this week that after speaking with a range of experts, Facebook concluded "the overlap between white nationalism, [white] separatism, and white supremacy is so extensive we really can’t make a meaningful distinction between them."
"Our own review of hate figures and organizations – as defined by our Dangerous Individuals & Organizations policy – further revealed the overlap between white nationalism and separatism and white supremacy," Facebook said in the blog post announcing the change. "Going forward, while people will still be able to demonstrate pride in their ethnic heritage, we will not tolerate praise or support for white nationalism and separatism."
Experts told the social media giant that white nationalism and white separatism are linked to violence, Facebook said.
Facebook will now ban content that praises or supports white nationalism and separatism. And when users try to post or search explicit white nationalist or separatist content, they will be redirected to the website for Life After Hate, an advocacy and support organization that strives to help people transition away from hate groups. Life After Hate was founded by former white supremacists.
“If people are exploring this movement, we want to connect them with folks that will be able to provide support offline,” Fishman told Motherboard. “This is the kind of work that we think is part of a comprehensive program to take this sort of movement on.”
Facebook did not immediately respond to The Hill's request for further comment.
Fishman said the ban will not extend to implicit or coded white nationalism and white separatism, noting those messages are harder to identify and take down, according to Motherboard.
Color of Change, an advocacy organization that has pressed Facebook for years to improve its record on civil rights, called the policy change a "critical step forward."
"Color of Change alerted Facebook years ago to the growing dangers of white nationalists on its platform, and today, we are glad to see the company’s leadership take this critical step forward in updating its policy on white nationalism," Color of Change President Rashad Robinson said in a statement.
"Facebook’s update should move Twitter, YouTube, and Amazon to act urgently to stem the growth of white nationalist ideologies, which find space on platforms to spread the violent ideas and rhetoric that inspired the tragic attacks witnessed in Charlottesville, Pittsburgh, and now Christchurch," Robinson said, referring to recent shootings and attacks by white supremacists in the U.S. and New Zealand.
Facebook's decision comes as the major tech platforms face heightened scrutiny over their efforts to remove white supremacist content. Lawmakers and experts have been putting pressure on the world's largest social media companies to take aggressive action against white supremacists in the wake of the New Zealand massacre.
The suspected shooter in New Zealand posted a white supremacist manifesto on Twitter and other social media outlets, laying out his bigoted views on Muslims and immigrants, according to New Zealand police. He then uploaded a live video of the attack to Facebook, filming himself shooting dozens of worshippers at one of the mosques he targeted.
Twitter, Facebook, YouTube and other platforms scrambled to take down the video, which had gone viral within minutes.
Facebook is the first of the tech companies to announce a specific policy shift since the New Zealand shooting.
"Unfortunately, there will always be people who try to game our systems to spread hate," Facebook wrote. "Our challenge is to stay ahead by continuing to improve our technologies, evolve our policies and work with experts who can bolster our own efforts."