Report: 'Haphazard' social media policy updates 'falling short' against disinformation


Top social media platforms' hundreds of policy changes in the past year and a half have failed to adequately address issues of disinformation, according to a report released Wednesday. 

The analysis released by Decode Democracy, a campaign against political deception launched by the nonprofit research organization Maplight, deems the tech companies’ self-regulation practices insufficient and urges lawmakers to take action. 

“This haphazard and reactionary pattern of behavior from large technology and social media companies makes it clear self-regulation is falling short and that we need stronger laws and regulations to limit disinformation and hold social media platforms accountable,” the report states. 


The Decode Democracy report examined updates made by nine popular platforms — Google, Facebook, Twitter, YouTube, WhatsApp, Instagram, Reddit, Snapchat and TikTok — between Aug. 1, 2019, and Jan. 22, 2021. 

Combined, the platforms made at least 321 policy changes during that time frame. But the report argues that despite those changes, “social media platforms have largely failed to alleviate the growing problem of online disinformation.”

The report also hits platforms for reacting to crises and public pressures rather than “proactively addressing digital deception.” 

“It's closing the barn door after the animals have left,” Maplight president and co-founder Daniel Newman told The Hill. 

The report charts out a timeline of policy changes by category made by the platforms, which shows a large block of policies around “public health” were put in place in March 2020 as coronavirus cases surged. 

There was another increase in policy updates in June, amid nationwide protests after the police killing of George Floyd in Minneapolis. And in January, after the deadly riot at the Capitol, there was a surge in “civic integrity” and “violence and extremism” policy changes, the report noted. 


The greatest number of policy changes came in October, a month before the presidential election. 

Issues around disinformation predate the report's tracking period, but many have been “heightened” by unprecedented challenges in the 18 months the report covers, Newman said. 

“With all of us sheltering in place, spending more time online, with these massive crises of the coronavirus — and into the presidential election — all those things have exacerbated the pace of disinformation and the problems it causes for democracy,” he said. 

The report calls for Congress and the Biden administration to put in place a coordinated national response to combat disinformation, including establishing an interagency task force to study the harms of digital disinformation, appointing a disinformation expert to the COVID-19 task force and creating a website to debunk viral misinformation as it occurs. 

The report also urges Congress to update campaign finance laws to help hold digital platforms more accountable and create transparency for voters about digital political ads. 

“I think that there is unprecedented interest and awareness in Congress, and among Americans as a whole, about the threat that online disinformation poses to our democracy,” Newman said. 

“Sadly, members of both the House and Senate lived through a violent attack that was fueled by misinformation that was amplified on social media,” he added. “So I do think that this is an opportunity.” 

The CEOs of Facebook, Twitter and Google will face questions from lawmakers about efforts to combat misinformation on their platforms next week during a House Energy and Commerce Committee hearing.