Facebook to take down posts with false claims about vaccines


Facebook said Monday it will remove posts that spread false information about “COVID-19 vaccines and vaccines in general.”

The policy expands on one that the social media platform put in place in December to remove misinformation about coronavirus vaccines. 

The company already prohibited false vaccine claims in paid ads, but the policy expansion means unpaid posts that Facebook believes are spreading vaccine misinformation are now subject to removal. 

“We will begin enforcing this policy immediately, with a particular focus on Pages, groups and accounts that violate these rules, and we’ll continue to expand our enforcement over the coming weeks,” Facebook said in an updated blog post.

Groups, pages and accounts that “repeatedly share these debunked claims” may be removed from the site entirely, Facebook said. 

The update comes after the Oversight Board, an independent body that issues rulings on the company’s content removal decisions, recently recommended that Facebook update its guidance on health-related misinformation. The Oversight Board said Facebook’s misinformation and imminent harm rule was “inappropriately vague” and recommended the platform create a new community standard on health misinformation. 

“As the situation evolves, we’ll continue to review content on our platforms, assess trends in language and engage with experts to provide additional policy guidance to keep people safe during this crisis,” Facebook said in the post Monday.

Misinformation spreading online can carry real-world consequences. For example, the lead organizer of a recent demonstration at Dodger Stadium in Los Angeles told The New York Times that the catalyst for the protest was the death of baseball legend Hank Aaron, after social media posts falsely linked Aaron’s death to the coronavirus vaccine he had received.

Robert F. Kennedy Jr. was among the people boosting coronavirus vaccination misinformation about Aaron’s death online, including in a Jan. 22 tweet that remains online. The unsubstantiated claim has been debunked by the Fulton County medical examiner, who, the Times noted, said there was no evidence that Aaron had an allergic or anaphylactic reaction to the vaccine.

Facebook’s updated policy on vaccine misinformation goes further than that of fellow social media platform Twitter.

Twitter announced in December, after Facebook’s policy update, that it would also begin labeling and removing posts with false claims about the coronavirus vaccine. 

A spokesperson for Twitter said the company does not have “additional details to share at this time” regarding plans to update its policy to apply to vaccine misinformation more broadly.

Updated at 4:33 p.m.
