Facebook's hate speech enforcement strategy still misses offensive content: report


Facebook's strategy for removing hate speech on its platform suffers from inconsistencies that often allow offensive content to remain on the social media site, according to a new ProPublica report.

The report shows an array of posts containing hate speech and other inappropriate content, and details which posts were taken down and which remained after being reported as offensive.

In one instance, the company declined to take down a picture of a bloody corpse with the text “the only good Muslim is a f---ing dead one,” written across it.

The company told Holly West, who reported the image, that it did not find it to be in violation of its terms of service.

“We looked over the photo, and though it doesn’t go against one of our specific Community Standards, we understand that it may still be offensive to you and others,” Facebook wrote to her in an email.

However, a similar post, which read "Death to the Muslims," was taken down, even though it did not have an accompanying photo like the first.

While both posts were in violation of Facebook’s terms of service, the former was only removed after ProPublica contacted Facebook about it.

According to ProPublica's report, such inconsistencies are not isolated. The outlet reviewed 900 posts and came to similar conclusions.

“Based on this small fraction of Facebook posts, its content reviewers often make different calls on items with similar content, and don’t always abide by the company’s complex guidelines. Even when they do follow the rules, racist or sexist language may survive scrutiny because it is not sufficiently derogatory or violent to meet Facebook’s definition of hate speech,” ProPublica noted.

The report also revealed some of Facebook’s odd community standards.

The company, for example, declined to remove a photo of a black man with a missing tooth wearing a Kentucky Fried Chicken bucket as a hat accompanied by the caption: “Yeah, we needs to be spending dat money on food stamps wheres we can gets mo water melen an fried chicken.”

After the image was reported, Facebook said that it did not remove the image because it did not include an attack on a specific group.

When a user posted “White people are the f---ing most,” however, it was quickly removed by Facebook.

Facebook has found itself in similar situations with posts demeaning women. ProPublica noted an instance in which Facebook declined to remove a post showing a woman in a lewd position in a shopping cart, emblazoned with the text “went to Wal-Mart, picked up a brand new dishwasher.”

The company removed another post, though, which read “Men really are trash.”

“We’re sorry for the mistakes we have made — they do not reflect the community we want to help build,” Facebook Vice President Justin Osofsky said in a statement. “We must do better.”

He emphasized that Facebook is doubling the size of its safety and security team to 20,000 people in 2018.