Congress must confront online extremism


With each new terror attack or mass murder linked to extremist content online, major tech companies like YouTube, Facebook and Twitter promise to better police their platforms. But after the outrage subsides, very little of consequence changes. It is clear now that tech firms will not effectively and consistently enforce their terms of service, since doing so negatively impacts their bottom line. Patience for their bait-and-switch tactics is wearing thin, and it is time for Congress to hold this unregulated industry accountable by modifying the blanket legal protection afforded through Section 230 of the Communications Decency Act.

Tech companies have put profits over the physical safety of people for too long. How else can Facebook explain its inability to stop ISIS followers from exploiting its platform to host meetings, link to terrorist propaganda and organize? How else can YouTube explain its failure to permanently remove ISIS videos or take action against the responsible accounts that repeatedly upload them?

To bridge the chasm between claims of enforcement made by tech companies and the reality of internet and social media misuse by extremists and terrorists, the Counter Extremism Project (CEP), an organization I proudly co-founded with former United Nations Ambassador Mark D. Wallace and former Presidential Homeland Security and Counterterrorism Advisor Frances Townsend, regularly documents extremist content and the real-world implications of inconsistent enforcement.


Unfortunately, those examples are becoming more frequent and more deadly. From the El Paso mass shooting, to the Christchurch video, to the Pulse nightclub attack in Orlando, the path from online radicalization to horrific violence is unmistakable. Shootings, bombings and vehicle attacks by Islamists and white supremacists have all been linked directly to extremist content online. And prior to launching an attack, many would-be killers post their manifestos to tech company platforms in the hopes of inspiring others.

The internet has transformed the way we communicate and given us exciting new ways to access information and entertainment. Yet, without rules, freedom usually devolves into chaos. While Congress has expressed concern and held hearings on the tragic consequences of extremist content online, it has been slow to force transparency and accountability on an industry that is protected from liability by statute. To help save lives, the ideologies and methodologies of hate that are easily accessible in a few clicks must receive our urgent attention.

Since 1996, Section 230 of the Communications Decency Act has provided blanket legal protection to internet and social media companies for content posted online by third parties. There are, however, exceptions for child pornography and copyrighted material. There ought to be another exception for extremist, hateful material.

On the basis of Section 230, the 2nd U.S. Circuit Court of Appeals ruled recently that Facebook could not be held liable for allowing Hamas to promote, encourage, and celebrate terrorist attacks in Israel on its platform. The lawsuit was brought by the families of five murdered Americans, including Stuart and Robbi Force, whose 28-year-old son, Taylor Force, was stabbed to death in Israel by a Palestinian terrorist in 2016.

However, Chief Judge Robert Katzmann, in a dissenting opinion, observed: “Whether, and to what extent, Congress should allow liability for tech companies that encourage terrorism, propaganda, and extremism is a question for legislators, not judges. Over the past two decades ‘the internet has outgrown its swaddling clothes,’ and it is fair to ask whether the rules that governed its infancy should still oversee its adulthood.”

Judge Katzmann is correct. The rules governing the internet must be changed by Congress to meet the clear and present dangers that we now face. This will not be a simple task, and of course it would best be accomplished with the active cooperation of tech companies. However, one way or another, it must be done.

Joseph I. Lieberman, the former U.S. senator from Connecticut and the 2000 Democratic nominee for vice president of the United States, is on the advisory board of the Counter Extremism Project.