Misinformation fueled the Capitol riots — a Biden commission could chart a path forward
Last week’s attack on the U.S. Capitol was based on lies.
The mob that stormed the building was acting on a tidal wave of misinformation about the election that was spread by the president, his fellow Republicans and their supporters using a web of partisan media outlets, social media and the dark corners of the internet.
The lies flourished despite an extraordinary amount of debunking by fact-checkers and Washington journalists. But that fact-checking didn’t persuade the mob that stormed the Capitol — nor did it dissuade millions of other supporters of the president. Fed a steady diet of repetitive falsehoods by elected officials and partisan outlets, they believed the lies so much that they were driven to violence.
Many people tried to prevent this. For the past four years, academics and journalists and philanthropists and foundations and tech leaders have thrown a lot at the problem of misinformation — hundreds of millions of dollars and millions of words, plus countless conferences and reports — and yet the problem seems worse than ever. It’s time for leadership.
In his first week in office, President-elect Biden should announce a bipartisan commission to investigate the problem of misinformation and make recommendations about how to address it. The commission should take a broad approach and consider all possible solutions: incentives, voluntary industry reforms, education, regulations and new laws. Although presidential commissions often accomplish little, there are promising signs for this one, as we explain below.
Biden can find a good precedent from the 1960s. In response to racial tensions and civil unrest, President Lyndon Johnson convened the National Advisory Commission on Civil Disorders. The Kerner Commission, as it was commonly known, sought to explain the root causes of civil unrest in American cities and make recommendations about how to defuse the tensions. The commission paid particular attention to the role of the media. In a statement that resonates with the current moment, the commission concluded that there was “a significant imbalance” between what actually happened during events and what was reported in newspaper, radio and television coverage.
The new commission is needed because efforts so far haven’t worked. The tech platforms have tried a variety of strategies with limited success. Google and Facebook have funded research to combat misinformation and highlight fact-checking, including some by us at Duke. But it’s clear the tech companies’ efforts have not come close to putting a dent in the problem.
Journalism isn’t working, either, because many of President Trump’s supporters seek out only the messages they want to hear. They got the reality they preferred from conservative media commentators, at least until those commentators declared that Trump had lost the election. At that point, many sought refuge from the truth in other outlets even less tethered to reality.
We make this recommendation recognizing that previous presidential commissions often had little impact. They became mired in the self-interest of the stakeholders, and even their substantive recommendations were largely ignored. How, then, can a commission on misinformation produce something more than political theater and symbolic action?
First, we may be in the midst of a rare moment of opportunity — what policy scholars call a “policy window.” The actions of President Trump and some of his most extreme followers have, for the moment, weakened the typically unified front of the Republican Party, with some Republican members of Congress calling for the president’s removal from office, and others finally acknowledging the increasingly pernicious effects of misinformation.
That, combined with the bipartisan support we have seen for revising Section 230 of the Communications Decency Act, which protects digital platforms from civil liability for the content that they host, is an important indicator that Congress could be ready to act.
The political landscape is more fertile now, too. Democrats, who have been targets in much of the misinformation, particularly the bogus claims about election fraud, now control the White House, Senate and House. So any policy recommendations that might emerge from this commission would now have a much greater likelihood of coming to fruition.
Google, Apple and Amazon Web Services have shown a new willingness to block apps such as Parler because of rampant misinformation and calls to violence. Other platforms, ranging from YouTube to TikTok to Snapchat to Twitch, have shown they’re also willing to take more aggressive action than they have in the past.
Still, the commission needs to take a bipartisan approach and look well beyond the election at the broad problem of misinformation that is eroding trust in government, science and our institutions. We need to go beyond yet another congressional inquiry. The group needs to be truly diverse. In addition to policymakers and representatives from the media and tech sectors, participants should include advocates for underrepresented groups, which are often disproportionately affected by misinformation. It should include a wide array of experts and stakeholders, such as academic researchers, technologists, legal scholars, advertisers, and community and foundation leaders.
The attack on the Capitol was rooted in misinformation. By moving swiftly to establish a commission with a broad mission and a diverse membership, President-elect Biden can seize the moment to address one of the greatest problems facing the nation.
Bill Adair, the founding editor of PolitiFact, is the Knight Professor of the Practice of Journalism and Public Policy in the Sanford School of Public Policy at Duke University.
Philip M. Napoli is the James R. Shepley Professor of Public Policy at the Sanford School of Public Policy at Duke University and the author of “Social Media and the Public Interest: Media Regulation in the Disinformation Age.”