Facebook uses a human editorial team to determine and present the content in its trending topics section, according to a new report.
Leaked documents show that Facebook’s trending topics are not solely derived from the social media network’s algorithms, according to The Guardian.
“The editorial team is responsible for accepting all topics that reflect real-world events,” Facebook’s internal editorial guide says. "We provide context to help people understand the trends and metadata to inform the algorithms that target trends."
The Guardian on Thursday reported that Facebook’s team of news editors works around the clock and was once as small as 12 people.
Facebook’s guide states that besides an editorial team, the company employs a “topic detection team” tasked with “surfacing pending topics and ranking them after they’re accepted.” A “content ranking team” is tasked with “delivering high-quality, relevant topic feeds once the topic is accepted.”
“A real-world event is something that happened recently, is happening now or will happen in the future,” Facebook’s editorial rules say of its acceptance criteria. "It’s intentionally broad so that we can be inclusive of a wide range of interests.”
The editors also determine which trends constitute “a national story,” often measured by whether a topic appears on at least five of 10 prominent news websites: BBC News, CNN, Fox News, The Guardian, NBC News, The New York Times, USA Today, The Wall Street Journal, The Washington Post, Yahoo News or Yahoo.
The editors can "inject" newsworthy topics under certain guidelines and can also "blacklist" duplicate topics or those not related to a real-world event. The team can also rename topics for consistency — its guidebook gives the example of replacing "Baja Peninsula" with "Odile," a hurricane that hit the peninsula in 2014.
The leak will likely intensify recent criticism that Facebook has been weeding out conservative news topics from its users' feeds.
Facebook said that the guidelines show it has a “series of checks and balances in place to help surface the most important popular stories, regardless of where they fall on the ideological spectrum.”
“Facebook does not allow or advise our reviewers to systematically discriminate against sources of any political origin, period,” Justin Osofsky, vice president of global operations said in a statement Thursday.
Osofsky said Facebook uses more than 1,000 news sources to “help verify and characterize world events and what people are talking about.”
“We have at no time sought to weight any one viewpoint over another, and in fact our guidelines are designed with the intent to make sure we do not do so,” he said.
Reports emerged on Monday that contractors working as “curators” in Facebook’s trending topics section were omitting conservative news topics.
Sen. John Thune (R-S.D.) on Tuesday then sent a letter demanding answers about Facebook’s news curation from CEO Mark Zuckerberg.
“Facebook must answer these serious allegations and hold those responsible if there has been any political bias in the dissemination of trending news,” said Thune, the Senate Commerce Committee chairman. "Any attempt by a neutral and inclusive social media platform to censor or manipulate political discussion is an abuse of trust and inconsistent with the values of an open Internet.”
Facebook on Thursday provided The Guardian with a list of 1,000 trusted sources for its trending topics section. It contains conservative outlets including The Daily Caller, The Weekly Standard, Red State, the Drudge Report and Breitbart.