EARN IT could offer framework for better platform moderation
The EARN IT Act, recently cleared for floor consideration by the Senate Judiciary Committee, remains a contentious bill, primarily over concerns that it might dissuade tech providers from using encryption. But amid ongoing debate about Section 230 and the role of tech platforms in our public discourse, legislation like EARN IT could, if paired with carefully crafted procedural protections, offer a model for how Congress can address bipartisan concerns about child sexual abuse material (CSAM) and other illegal content online.
Debates about Section 230 and the liability shield it grants to digital platforms typically center either on how to make platforms remove hate speech, misinformation and other disfavored content, or how to prevent them from censoring certain political speech, particularly that of conservatives. But such discussions fundamentally misunderstand what Section 230 was meant to do: define how best to assign liability for content-moderation decisions in order to achieve the ideal balance of expression and potentially harmful content.
No moderation system will ever be perfect. Some harmful content will always exist. But there is no reason to presume that the status quo, rooted in assumptions about the online environment from more than two decades ago, necessarily strikes that balance in a way that makes sense today.
To the extent that the law currently allows harms that exceed the benefits of expression, it should be adjusted to deter those harms if doing so can be achieved at sufficiently low cost. Nearly everyone would agree that harmful content should be removed if it can be done without any effect on lawful expression. Thus, the question is finding the right tradeoff: one that would deter harms but not impose such massive legal liability as to drive online platforms out of business. This can be done, but it requires thoughtful consideration.
The EARN IT Act traces the edges of the problem but, without a truly holistic approach, it could do more harm than good. While Section 230 is largely beneficial, its grant of near-total immunity prevents the legal system from adapting to new developments. To be sure, as platforms discover new forms of harm, there are pressures that guide their behavior, such as concerns about reputation and the ability to grow and maintain a user base. But without legal consequences for making unreasonably bad decisions, such pressures may not provide enough incentive to find optimal solutions.
Rather than a blanket grant of legal immunity, Section 230’s protections should be conditioned on platforms demonstrating reasonable behavior. That is to say, an online service provider should have a duty of care to reasonably moderate illegal content. Implicit in the idea of “reasonable moderation” is the understanding that platforms will not be able to deal with all bad content.
It could be the case that platforms already operate as reasonably as would be possible, within the bounds of economic efficiency. But determining that should involve at least some oversight from a neutral court.
Analyzing whether a platform has behaved reasonably could include examining its use of encryption, as the EARN IT Act contemplates. Given that many malicious actors seek to steal user data, it may be completely reasonable to encrypt communications. But there may also be marginal cases where a platform unreasonably allowed encryption to be used to hide what it had good reason to believe was criminal behavior. Flexible standards of reasonableness, informed by well-developed industry best practices, can grapple with either of these situations.
Because courts largely have not had the opportunity to weigh these issues through a gradual and iterative process over the quarter-century that Section 230 has been in effect, it would be ill-advised simply to throw all the questions surrounding online moderation to the judicial process in one fell swoop. This would invite a torrent of litigation that threatens to do more harm than good.
To make the transition less chaotic, there should be procedural limitations, such as heightened pleading standards and an explicit safe harbor to cut litigation short at the pleading stages. These reforms also should incorporate industry standards and best practices, and a judicial review mechanism that can provide feedback to the process.
There are legitimate concerns when it comes to federal legislators tinkering with Section 230. Many lawmakers’ public statements suggest they want regulations that are totally inconsistent with the First Amendment. But there is more that can be done, within the bounds of the Constitution, to address the very real problem of harmful and illegal content online. The EARN IT Act is not perfect, but it sketches a framework that could be developed into a more balanced reform of Section 230.
Kristian Stout is director of Innovation Policy with the International Center for Law & Economics and co-author of the working paper “Who Moderates the Moderators?: A Law and Economics Approach to Holding Online Platforms Accountable Without Destroying the Internet.”
Copyright 2023 Nexstar Media Inc. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.