Story at a glance
- Facebook employees reportedly proposed an update to the site’s algorithm to create a “nicer news feed.”
- A Facebook executive says these changes were not meant to be permanent.
Social media companies have spent the bulk of the 2020 presidential election combating misinformation surrounding the outcome of the vote, attaching warnings and labels to posts deemed potentially inaccurate or outright false.
In response to the ongoing struggle, which has yet to abate even as the Trump administration says it is ready to begin the transition process, employees at social media behemoth Facebook have proposed altering the news feed algorithm the site uses to display articles.
The New York Times reports that this will involve developing more robust “news ecosystem quality” ratings, known as NEQ, which will function as an internal score that evaluates the quality of a news outlet’s stories and journalistic practices.
The NEQ score reportedly plays a small role in determining which articles are most prominently featured on user feeds.
Under the change proposed by company staff, which Facebook CEO and founder Mark Zuckerberg reportedly approved, the algorithm gives NEQ scores greater weight when ranking articles, in a bid to promote more factual news.
Facebook had reportedly been working for years to ensure news feed integrity throughout the presidential election and guard against voter tampering. Under the change, news from mainstream outlets like The New York Times, NPR and CNN was featured more prominently, while more partisan sites, like Breitbart, were featured less.
Workers reportedly called this a “nicer news feed” and asked if it would remain in place following the 2020 election.
Guy Rosen, Facebook’s vice president of integrity and project management, told reporters that the changes to the news feed algorithm were meant to be temporary.
“There has never been a plan to make these permanent,” Rosen said.
The issue of content moderation on social media sites like Facebook has left the companies in a bind: halt false, misleading or even hateful news and content, or risk slower growth, lost advertising revenue and political backlash.
With the help of user feedback identifying articles as “good for the world” or “bad for the world,” the new algorithm structure reduced the visibility of content users deemed “bad.”
The move also reduced user engagement and the number of online sessions, so Facebook loosened the restrictions on “bad for the world” content.