Democrats urge tech giants to change algorithms that facilitate spread of extremist content


House Democrats sent a letter to top social media platforms on Thursday urging them to make permanent changes to algorithms that facilitate the spread of extremist and "conspiratorial" content.

The letter from the lawmakers comes roughly two weeks after a mob stormed the Capitol in a deadly riot.

Reps. Anna Eshoo (D-Calif.) and Tom Malinowski (D-N.J.) led dozens of their colleagues in letters addressed to the CEOs of Twitter, Facebook, Google and YouTube urging various changes across the platforms to mitigate the spread of extremist and conspiratorial content. 


“Online disinformation is not just about removing bad content. I see it as largely a product design issue. The algorithmic amplification and recommendation systems that platforms employ spread content that’s evocative over what’s true,” Eshoo said in a statement. 

Many of the people involved in the pro-Trump mob that stormed the Capitol asserted that there was widespread election fraud that tainted the 2020 election results. Former President Trump and his allies repeatedly echoed these claims on social media, stating that the election had been "stolen" from the former president. 

The unsupported claim, along with QAnon conspiracy theories and other disinformation narratives, spread online ahead of the Jan. 6 riot at the Capitol. 

“Social media platforms’ algorithms are designed to feed each of us increasingly hateful versions of what we already hate, and fearful versions of what we already fear, so that we stay glued to our screens for as long as possible,” Malinowski said in a statement. “In this way, they regularly promote and recommend white supremacist, anti-Semitic, anti-government, and other conspiracy-oriented material to the very people who are most susceptible to it — some of whom just attacked our Capitol.” 

After the attack at the Capitol, mainstream social media platforms cracked down on some of the disinformation. But the Democrats are calling for further action from the social media giants. 


Google-owned YouTube has faced harsh criticism over its role in spreading extremist content. In the letter to Google CEO Sundar Pichai and YouTube CEO Susan Wojcicki, Democrats said the platform should disable its autoplay feature by default and cease “all recommendations of conspiratorial material on users' homepages.” 

A spokesperson for YouTube declined to comment on the letter. 

According to YouTube, the platform has removed thousands of videos pushing claims of widespread voter fraud in the election, and has removed thousands of QAnon-related videos and channels. 

The lawmakers, however, wrote that YouTube needs to take further action to address the spread of extremist content. 

“If those are too difficult to identify using automated processes, the company should cease all recommendations until an effective, technical solution is developed. More broadly, it is our hope that YouTube will begin a fundamental reexamination of maximizing user engagement as the basis for algorithmic sorting and recommendation,” they wrote. 


In a similar letter to Facebook, the Democrats called for the platform to make permanent changes to its recommendation system, as it has done on a trial basis in the past. 

A spokesperson for Facebook said the company had no comment on the letter, but noted actions the company has taken since the riot at the Capitol. 

Facebook said last week that it has had emergency measures in place since before the U.S. election to stop recommending civic groups for people to join. The company is keeping those measures in place and adding new requirements, including requiring administrators of groups that “start to have a high rate of hate speech or content that incites violence” to review and approve posts before they go up.

The letter to Twitter similarly calls for the platform to make permanent changes that limit the spread of misinformation and “other forms of harmful content.” 

A Twitter spokesperson said the company has received the letter and intends to respond. 

Even as mainstream platforms take action to crack down on disinformation, fringe social media platforms with minimal content moderation have seen a surge. 

One platform identified as having posts inciting violence ahead of the riot, Parler, was removed from the Apple and Google app stores and was dropped by Amazon’s web hosting service. As the platform looks to relaunch, House Oversight and Reform Committee Chairwoman Carolyn Maloney (D-N.Y.) requested the FBI investigate its role in the insurrection.