Instagram unveiled a new tool this week that will allow users to report misinformation spreading on its platform.
The new reporting feature is the latest installment in Instagram's ongoing battle against misinformation — including medical hoaxes, scams and Russian-inspired disinformation campaigns — which has flourished for years on the image-sharing platform owned by Facebook.
By the end of the month, users in the U.S. will be able to report images that they believe spread misinformation. Those flagged posts will be reviewed by Instagram's newly established pilot fact-checking program.
“Starting today, people can let us know if they see posts on Instagram they believe may be false," Facebook spokeswoman Stephanie Otway said in a statement. "We’re investing heavily in limiting the spread of misinformation across our apps, and we plan to share more updates in the coming months.”
If the fact-checkers determine that a post is false, Instagram will stop promoting it on its Explore and hashtag pages, but the post will be allowed to remain up.
Otway said Instagram aims to use the reports, along with other red flags such as an account's past behavior, to determine whether to send a post to the fact-checkers.
Users will now be able to flag posts as "false information" by tapping the button in the top right-hand corner of a post.
Instagram has been ramping up its work on misinformation amid a record-breaking measles outbreak in the U.S. this year, which has been attributed in part to the rise of anti-vaccine misinformation online.
Between January and Aug. 8 this year, the Centers for Disease Control and Prevention (CDC) confirmed 1,182 individual cases of measles in 30 states.
Instagram is also considering a "pop-up" that would appear on content that contains vaccine-related misinformation.