It's time for Congress to regulate political advertising on social media

Facebook recently announced that, because of concerns about its inability to counter misinformation, it was placing a moratorium on new political advertisements in the week before Election Day. The move was reminiscent of Twitter's decision last year to ban all political advertising for similar reasons.

Social media political advertisements that spread misinformation and disinformation increasingly threaten democracy and election integrity, in the U.S. and in other countries.

In 2016, Russian operatives ran 3,000 ads targeting up to 10 million American social media users to distort political opinion and influence the outcome of our election. Recent U.S. intelligence analysis found that Russia, China, and Iran are attempting to sow discord and sway the outcome of the 2020 election. But the problem is not just foreign interference. Google — which owns YouTube — is allowing domestic political actors to run several ads containing false or misleading information about mail-in voting in key swing states.


Political advertising bans do not deal with the real problem. A Carter Center initiative to identify and mitigate online threats to democracy and elections worldwide (an effort funded partly by Facebook) leads us to believe that the solution requires a range of actions by tech platforms, citizens, independent watchdogs and governments.

While it’s fair to criticize platforms for not assuming gatekeeping responsibilities, it’s also time for Congress to take steps to regulate political advertising on social media.

In a March 2019 Washington Post op-ed, Facebook CEO Mark Zuckerberg acknowledged the need for “a more active role for governments and regulators.” Zuckerberg argued that “legislation should be updated to reflect the reality of the threats and set standards for the whole industry.”

Immediate reforms should do three things: require platforms to mitigate manipulative interference by foreign actors, prohibit deliberately false information that seeks to suppress voter participation, and restrict the use of microtargeting filters that facilitate the spread of false or misleading political claims to discrete audiences.

First, Congress should pass the Honest Ads Act, a bipartisan bill introduced in 2019 that would modernize existing regulations enshrined in 1971’s Federal Election Campaign Act and 2002’s Bipartisan Campaign Reform Act. The Honest Ads Act would make social media advertising subject to the same transparency requirements regarding disclaimers and sponsorship disclosures that govern traditional advertising, order platforms to make reasonable efforts to prevent foreign actors from buying advertisements in advance of elections, and require social media platforms to maintain public databases of all political advertisements with information on their target audiences and costs.


Passage of this act would establish a much-needed bulwark against foreign actors that interfere in our elections through targeted political advertising. Federal transparency requirements also would help reduce anonymous political advertising and disincentivize the spread of false or misleading information.

Congress also should pass the Deceptive Practices and Voter Intimidation Prevention Act, originally introduced in 2007 and revised and reintroduced last year. The bill would make it unlawful to disseminate false information about when and how to vote, helping stanch the spread of disinformation aimed at suppressing voter participation.

The biggest challenge, though, is how to deal with false and misleading claims spread by politicians in advertisements. Political advertising isn’t bound by the “truth in advertising” requirements that govern commercial advertising; it is largely shielded from regulation in accordance with the constitutional right of free speech. Dozens of states have tried to legislate false claims out of political advertisements, but courts have struck down such provisions. Any similar attempt at federal legislation would face stiff opposition and court challenges.

In recent years, Republicans and Democrats alike, including both major party presidential candidates, have suggested revoking Section 230 of the Communications Decency Act, which gives internet providers immunity from liability for most content that users publish on their platforms. While this would compel online platforms to act, it would also prompt them to practice excessive censorship for fear of legal action.

A more modest approach would be to enact legislation that limits the use of microtargeting filters for political advertising. The tools made available by online platforms themselves should be restricted to age, gender and location. Personal data collected from individual users through platform engagement should be off-limits for political advertising targeting purposes. While this would not fully address the problem, it would mitigate the ability of political actors to direct false and misleading advertisements to discrete audiences.

None of these solutions is a silver bullet, but each would tackle a part of the problem that platforms either can’t — or won’t — address on their own.

Michael Baldassaro is senior advisor to the Carter Center’s Digital Threats Initiative, and David Carroll is the director of the Center’s Democracy Program.