Far-right misinformation received highest engagement on Facebook: study


Content posted from news outlets rated as far-right received the highest levels of engagement on Facebook in the months surrounding the 2020 elections, according to a new study.

Moreover, researchers found that among far-right outlets, sources identified as spreading misinformation had on average 65 percent more engagement per follower than other far-right pages, according to the study released by New York University’s Cybersecurity for Democracy on Wednesday.

The study evaluated a total of 8.6 million Facebook and Instagram posts published between Aug. 10 and Jan. 11, downloaded via the analytics tool CrowdTangle. Researchers identified 2,973 news and information sources using lists of U.S. news outlets and their Facebook pages from two independent data providers that rate the political leaning and quality of media.

Throughout the study period, content from far-right sources consistently garnered the highest levels of engagement compared with sources across the rest of the political spectrum, according to the study.

Facebook spokesperson Joe Osborne noted that the report only looked at engagement.

“This report looks mostly at how people engage with content, which should not be confused with how many people actually see it on Facebook. When you look at the content that gets the most reach across Facebook, it’s not at all as partisan as this study suggests,” Osborne said in a statement.

The researchers acknowledged that their findings were limited by the data Facebook makes available: they could measure engagement but not how many users actually saw a given piece of content.

The study comes as Facebook and other tech giants are under increased scrutiny over their handling of misinformation.

Facebook CEO Mark Zuckerberg is scheduled to appear alongside other top tech CEOs before the House Energy and Commerce Committee later this month.

Democrats have criticized the platform’s content moderation policies, accusing the social media giant of not taking a strong enough approach to tackling misinformation and hate speech.

Republicans, however, have leveled unsubstantiated accusations that the tech giant censors content with an anti-conservative bias. NYU’s study further undercuts those claims with its finding that far-right pages received the highest levels of engagement. A report released last month by NYU’s Stern Center for Business and Human Rights likewise concluded that the anti-conservative bias claims are not supported by evidence.

Far-left sources were a distant second in engagement, according to the study, even on days when engagement peaked for the more politically “extreme” outlets, such as Election Day or Jan. 6.

For example, the study found both far-right and far-left outlets saw a spike in interactions on those dates, but far-right pages drew more than 450 interactions per thousand followers while far-left pages drew just under 250.

The increase in engagement on the two key dates was “much less intense” for other news sources, compared to the more politically extreme outlets, based on the data.

The study also found that far-right sources did not suffer what researchers deemed a “misinformation penalty,” meaning sources of misinformation from far-right outlets outperformed far-right pages that were not identified as sources of misinformation. Researchers define a misinformation penalty as “a measurable decline in engagement for news sources that are unreliable.”

Far-right sources of misinformation had 426 interactions per thousand followers per week, compared to the 259 weekly interactions of the far-right pages that were not identified as misinformation sources, based on the study. 

But among all other partisan categories, the “misinformation penalty” was at play: sources spreading misinformation received at least slightly fewer interactions than those that did not.

For example, far-left sources not identified as spreading misinformation had more than 140 weekly interactions per thousand followers, while far-left sources identified as spreading misinformation earned only 60 weekly interactions.

The so-called penalty was smallest among “slightly right” sources but still present: misinformation sources drew nearly 120 weekly interactions per thousand followers, compared with roughly 130 for non-misinformation sources.

Last month, Facebook said it would be piloting ways to reduce the amount of political content users see. 

Updated: 4:23 p.m.
