Health misinformation has been viewed nearly 4 billion times on Facebook over past year: study


Networks spreading health misinformation have received 3.8 billion views on Facebook in the past year, a new report finds.

The peak of the misinformation came as the coronavirus pandemic was reaching its worst in April, with those networks receiving roughly 460 million views in just a month, according to the report from nonprofit advocacy group Avaaz released Wednesday.

The reach of the top spreaders of health misinformation far eclipsed that of leading health organizations on the platform, according to the report, which also raises concerns about Facebook’s efforts to rein in misleading content.


Researchers found that only 16 percent of the health misinformation they uncovered had a warning label placed on it.

“Facebook’s algorithm is a major threat to public health,” said Fadi Quran, campaign director at Avaaz. “Mark Zuckerberg promised to provide reliable information during the pandemic, but his algorithm is sabotaging those efforts by driving many of Facebook’s 2.7 billion users to health misinformation spreading networks.”

The findings are especially concerning given the material effects of conspiracy theories and unfounded claims about the coronavirus.

More than 100 doctors and nurses working on the front lines of the pandemic sent a letter to America’s largest social media platforms earlier this year warning that misinformation was making it harder to treat patients.

A study released this month in the American Journal of Tropical Medicine and Hygiene also found that 800 people had died globally in the first three months of the year as a result of being exposed to coronavirus misinformation.

“Misinformation about people’s health, especially during a pandemic, can lead to direct, physical harm,” Rep. Anna Eshoo (D-Calif.) said in a statement to The Hill. “The fact that this misinformation has been viewed nearly 4 billion times on Facebook in the last year is utterly inexcusable and dangerous.”


The report released Wednesday faults Facebook’s algorithm for boosting the dangerous health content.

“[H]ealth misinformation is often sensationalist and provocative and will therefore receive significant engagement,” the report reads. “This engagement will, in turn, be interpreted by the algorithm as a reason to further boost this content in the News Feed, creating a vicious cycle where the algorithm is consistently and artificially giving health misinformation, for example, an upper hand over authoritative health content within the information ecosystem it presents to Facebook users.”

Facebook’s attempts to stem the spread of misinformation have already come under heavy criticism from Democratic lawmakers.

“What has become more clear with every new revelation is that Facebook’s algorithms continue to expose Americans to harmful misinformation,” Sen. Mark Warner (D-Va.) told The Hill. “Particularly in the context of an ongoing public health emergency, we cannot allow the public sphere – increasingly conducted online – to be dominated by harmful health misinformation and scams. We must acknowledge that the scale of these platforms calls for increased scrutiny.”

A Facebook spokesperson said in a statement to The Hill, "We share Avaaz’s goal of limiting misinformation, but their findings don’t reflect the steps we’ve taken to keep it from spreading on our services."

The platform has promised action to combat the spread of health misinformation during the pandemic, committing to elevating information from trusted sources and limiting the spread of potentially harmful posts. The spokesperson noted that "from April to June, we applied warning labels to 98 million pieces of COVID-19 misinformation and removed 7 million pieces of content that could lead to imminent harm."

Some of those efforts have been successful. Facebook appears to have stunted the spread of a sequel to the viral “Plandemic” video released Tuesday by blocking links to it from being posted or shared via Messenger immediately after it went live. However, not all sources of misinformation announce their intention to spread such material ahead of time.

"Given the previous Plandemic video violated our COVID misinformation policies, we blocked access to that domain from our services,” the Facebook spokesperson said. “This latest video contains COVID-19 claims that our fact-checking partners have repeatedly rated false so we have reduced its distribution and added a warning label showing their findings to anyone who sees it."

The social media giant also started notifying users who have interacted with misinformation about the coronavirus and has connected them with articles debunking common conspiracies.

Avaaz’s report says those steps have not been sufficient given the scale of the problem.

It recommends providing all users who have seen misinformation with independently fact-checked corrections, claiming that doing so lowers belief in false information by half.

The report also encourages Facebook to downgrade the visibility of misinformation in the news feed.


“During a global pandemic, Facebook is looking the other way while disinformation about the coronavirus goes viral on its platform — a direct threat to the health and safety of millions of people," Sen. Elizabeth Warren (D-Mass.) said in a statement to The Hill. "No company should be too big to be held accountable for distorting facts and spreading falsehoods, especially during a public health crisis.”

To come up with the figures in its report, Avaaz first identified websites that had previously reached large audiences with health misinformation, relying on third-party groups to initially locate the sites. It then found the top Facebook pages that drove traffic to those websites.

Then, because Facebook does not make public how many times a link is viewed, the group created an estimate of views calculated from available data on views of videos posted by the top pages.