A new report recommends that the federal government prioritize plans to enact stronger transparency requirements and other measures for tech platforms in an effort to combat polarization on social media.
The report released Monday by the NYU Stern Center for Business and Human Rights argues that tech platforms have failed to "self-regulate sufficiently" and calls for government intervention.
The authors cite the Jan. 6 riot at the U.S. Capitol as an example of social media contributing to political polarization in a way that manifests in real-world harm.
"We’re not just talking about political polarization just in the abstract, but it has these very specific consequences which we are seeing basically eroding aspects of democracy and civil relationships among people and trust in institutions and so forth,” said Paul Barrett, the deputy director of the NYU Stern Center for Business and Human Rights and one of the authors of the report.
The report recommends the House Select Committee investigating the insurrection devote “ample resources” to determining how technology was used to incite the violence on Jan. 6.
It also recommends Congress empower the Federal Trade Commission to draft and enforce new standards for industry conduct, and pass legislation mandating more disclosure about the inner workings of platforms.
The report also urges President Biden to persuade lawmakers and the public to confront online polarization to avoid “future versions” of the Capitol insurrection.
Social media platforms have pushed back on accusations that they intensify political polarization. At the same time, the platforms have at specific moments strengthened policies around removing certain content.
For example, Facebook in April said it would take extra steps to limit misinformation in preparation for the verdict in the trial of Derek Chauvin for the killing of George Floyd.
“The fact that they actually acknowledge that they have the capacity, in their lingo, to ‘turn the dial’ at certain points and they acknowledge that they've done this in certain emergency situations I think proves a very strong implication that they know there is a connection to what they're doing and this social, political problem,” Barrett said.
“The question we raised is, if you could do it temporarily, explain to us why you wouldn't want to do that generally?” he added.
Facebook’s vice president of content policy, Monika Bickert, faced the same question during a Senate hearing in April. Bickert said at the time that there is a “cost” to assessing content that violates standards through Facebook’s technology screening system.
But Barrett said if Facebook finds its screening technology has “too many false positives” when Facebook ramps up enforcement, the company should refine its system.
“Don't use that as an excuse to not move forward and be more enterprising and figuring out how to comb out material you're acknowledging in times of social unrest could be dangerous. That type of content is potentially problematic all the time,” he said.
Spokespeople for Facebook, YouTube and Twitter, the major platforms identified in the report, did not respond to requests for comment.
The report recommends that social media companies adjust their algorithms to depolarize their platforms more systemically, and it urges the companies to improve their “dial-turning” measures.
The report further calls for the platforms to be more transparent in disclosing what they're doing and how they’re making their decisions to “counter suspicions” that decisions are made for political purposes.
The recommendations come as Washington braces for a rally on Sept. 18 with participants demanding "justice" for those facing federal charges for breaching the Capitol on Jan. 6. Far-right extremist groups like the Proud Boys and Oath Keepers are planning to attend, The Associated Press reported.
“The prospect of that event rally, dedicated to supporting people who are being described as 'political prisoners' and so forth, is exactly the kind of event that social media is helping to further inflame,” Barrett said.
“The embers are sort of burning red, there aren’t quite flames flickering from it yet, but with the ability to spread lies and organize via social media — it’s like pouring gasoline on that and that’s the danger,” he added.