Critics fear Facebook fact-checkers losing misinformation fight
Facebook’s program to hire third-party fact-checkers to crack down on misinformation on the platform has been ramping up, with partners adding staff and expanding their work.
But the program still faces skepticism from activists and tech industry critics who say the company and its partners are still not providing the resources needed to address the scope of the problem on a platform with more than 2 billion users.
Launched in December 2016 after intense criticism of how Facebook handled false or misleading content in the last presidential election, the third-party fact-checker program shifts responsibility for verifying the accuracy or truthfulness of content from Facebook to independent, verified outside organizations.
Facebook does not fact-check content itself. However, it does send posts that its algorithm flags as potentially false to its partners. The partners can review the content flagged by Facebook or search for and identify false content themselves.
There are six partners in the program evaluating U.S. content. All of those The Hill reached said they had recently added staff to their efforts or were currently doing so.
Lead Stories, which became a Facebook partner early last year, has seven people working full time on fact-checks. Together they review roughly 60 to 70 pieces of content per month, the site’s editor-in-chief Alan Duke told The Hill.
Heading into the election year Lead Stories plans to hire at least four more employees and ramp up to 200 pieces of content fact-checked a month.
“2020 is going to be intense,” Duke said in an interview last month. “We’re staffing up, [Facebook] helped us staff up, and they’ve given us the resources to do many times what we were able to do, to scale up.”
The editor of PolitiFact, a fact-checking website that looks at political posts on Facebook, told The Hill that it was fact-checking nearly 50 pieces of content per month in 2019 but plans to double that figure in 2020.
To get there, the nonprofit, which is operated by the Poynter Institute, will add to its five-person Facebook fact-checking team.
Editor Angie Holan, when asked how much content flagged by Facebook PolitiFact reviewed in a day, described the flow of posts as “almost like a never-ending list.”
“For our intents and purposes it’s more stuff than we can get to in a day,” she explained.
Factcheck.org, a project of the Annenberg Public Policy Center at the University of Pennsylvania, has two people dedicated to Facebook content and is planning on hiring an editor to oversee that work, director Eugene Kiely said. It currently fact-checks about 20 pieces of content a month and plans to increase that number to deal with an expected rise in misinformation tied to the 2020 census and presidential election.
Check Your Fact, an affiliate of the conservative news outlet the Daily Caller, declined to answer inquiries about its staff and output. According to information on its website, the program has three full-time staff and fact-checked 50 pieces of Facebook content in November and 48 in December.
The Associated Press declined to provide detailed answers, but its website mentions three full-time employees working on Facebook posts who fact-checked eight pieces of content in each of the last two months.
Science Feedback, the last of the six partners, could not be reached for comment. The international fact-checker, which focuses on science and health content, has six full-time staff and fact-checked six pieces of Facebook content in November and nine in December.
Together, Facebook’s six partners have 26 full-time staff and fact-checked roughly 200 pieces of content per month.
Experts who spoke to The Hill said those changes were insufficient to make a serious dent in the fake accounts and disinformation they say are rampant on Facebook.
“The volume seems inadequate given the scale of the challenge that Facebook faces,” Paul Barrett, deputy director of the New York University Stern Center for Business and Human Rights, said.
“If you’re going to operate a site like Facebook, you would want to have resources available commensurate with the degree of false and misleading content. On a site with 2.4 billion monthly active users you’re almost inevitably going to have a very high volume of false material.”
An official for Facebook told The Hill the platform is still adding partners and improving its flagging system. They acknowledged there is no “silver bullet” for misinformation, but stressed the program shows significant potential in the company’s view.
Experts say it is hard to know exactly how much misleading content is on Facebook.
Fadi Quran, campaigns director at the nonprofit activist group Avaaz, said Facebook does not disclose that data.
“The truth is no one can quantify exactly how much disinformation there is out there apart from the platforms who, if they were transparent, could give the most accurate approximation,” he said.
Quran said Facebook gave false information a wide reach.
“When we compare the top five real news stories [on impeachment] from America’s most read newspapers, like The New York Times or The Washington Post, to the top five fake news stories, or disinformation stories, the top five disinformation stories had reached up to 80 percent as much as the real ones,” he said.
Quran said that while fact-checking “needs to be significantly scaled up,” Facebook should also make changes such as notifying people who have viewed content that is fact-checked.
Some of Facebook’s toughest critics, though, question the fact-checking operation itself.
“Just the scale of the company itself makes responsible fact-checking pretty difficult, even if they were invested in doing it,” said Sarah Miller, co-chairwoman of Freedom from Facebook, a coalition of progressive groups calling for breaking up the company.
Facebook has also been criticized for not subjecting posts from political figures to fact-checks.
Tristan Harris, the co-founder and director of the Center for Humane Technology, said that focusing on fact-checking ignores the bigger issue: “They have trillions of items of content.”
“They’ve created a digital Frankenstein that no matter how many more fact-checkers they hire, it is so far beyond the capacity of them to deal with it,” he said. Harris argued that the platform’s design itself incentivizes the rapid spread of disinformation.
Miller told The Hill that fact-checking is a distraction from the problem of microtargeting ads, which allow “any bad actor” to “target users with propaganda or scam content.”
Facebook has taken heat, including from Federal Election Commission Chairwoman Ellen Weintraub, for letting advertisers target incredibly small, specific audiences.
The social media giant announced this month that it would continue to allow the method, even after Google announced it would stop microtargeting.
Facebook has been under intense pressure and scrutiny to deal with misinformation on its platform since the 2016 election. The effect of the third-party fact-checking program on addressing the problem remains to be seen, but critics say the problem will only get bigger.
“This is a tsunami,” Quran said of the amount of misinformation.
“It’s a tsunami that keeps on getting bigger and bigger because platforms are not taking the necessary actions to disincentivize this disinformation and to fight it.”