The CEOs of Apple, Amazon, Google and Facebook are expected to appear this week before the House Judiciary Antitrust Subcommittee, which “has been investigating the dominance of a small number of digital platforms and the adequacy of existing antitrust laws and enforcement” for the last year. The appearance would come just days before the end of the “Stop Hate for Profit” campaign, which “ask[ed] all businesses to stand in solidarity with our most deeply held American values of freedom, equality and justice and not advertise on Facebook's services in July.”
Facebook’s response to the campaign was anemic, demonstrating that a Facebook-only boycott is unlikely to achieve lasting change — despite its noble goal. Hundreds of millions of people and small businesses rely on Facebook for commerce in this pandemic-challenged economy. In our omnichannel world, consumers and businesses speak on whichever platform is most convenient to them. To advance change on all platforms, “Stop Hate for Profit” should persuade its corporate supporters to push for reform of Section 230 of the Communications Decency Act, which shields social media companies from being treated as publishers of the content their users post.
The House Judiciary Antitrust Subcommittee should be asking questions about Section 230, as it is one of the reasons for the dominance of a small number of digital platforms.
Social media, through Section 230 and algorithms, has given trolls and slanderers vast audiences for their falsehoods. COVID-19 has exacerbated this problem. People sheltering in place are more dependent than ever on digital interfaces for necessities, news, research, education, commerce, and social interaction. Many, like me, want to know the source and veracity of their information, including whether a human in the U.S. — not a malicious foreign actor — is opining on domestic politics and social justice issues. Constant pandemic-induced cable news has sensitized many to the injustice of race-related violence and hate speech. Polls show a majority of Americans support the related protests.
A straight repeal of Section 230, however, is not practical. Whole business models are built on it. Digital businesses employ millions — and the largest are among the world’s most valuable companies. Nonetheless, social media suffers from a deficit of public trust — as Facebook has conceded and as “Stop Hate for Profit” evidences.
Section 230 reform should regulate social media as we do other industries like banking, utilities, pharmaceuticals and insurance.
Today’s unique circumstances — with social media central to safe and productive living — demand the public trust that regulation brings. Over a 27-year career as an insurance industry lawyer, I can confirm an early mentor’s maxim: “Regulation adds legitimacy.”
Here are three questions members of the House Judiciary Antitrust Subcommittee should ask this week about antitrust and Section 230:
- Why shouldn’t digital platforms be subject to “know your customer” and security credentialing requirements? Our anti-money laundering laws require banks to know their customers to prevent criminal activity. Similarly, online banking transactions require security credentialing to confirm the identity of the customer. Freedom of speech and safeguarding our elections from foreign interference should outrank anti-money laundering in legal priorities. Requiring social media companies to know their customers and credential access would confirm that engagement comes from actual humans — even as users could still post anonymously. It also likely would deter online criminal activity.
- Why shouldn’t algorithms that drive information be regulated? Since Section 230 became law nearly 25 years ago, data scientists have developed increasingly complex algorithms. Algorithms profile users and transmit user-aligned information — regardless of request and without verification of accuracy. Developing algorithms requires many complex decisions, some involving ethical and moral issues. A 2017 book by two Booz Allen Hamilton experts — “The Mathematical Corporation: Where Machine Intelligence and Human Ingenuity Achieve the Impossible” — describes the dilemma: “You need an ethical brain trust to untangle the knotty issues of what’s best for the organization, what complies with legal constraints, and what society calls right.”
Knotty ethical issues involving public trust call for regulation. If we can regulate insurance rating, which is complex and involves legal and moral issues, why can’t we regulate complex social media algorithms too?
- Why shouldn’t digital platforms have complaint reporting and internal appeal processes related to the Good Samaritan liability exemption? Social media companies have taken different approaches to Section 230’s Good Samaritan exemption, which allows them to remove objectionable material, whether constitutionally protected or not. Facebook has long been a minimalist restrictor, which is why Stop Hate for Profit based its campaign on “Facebook’s long history of allowing racist, violent and verifiably false content to run rampant on its platform.” We should require social media companies to adhere to a prescribed process for reviewing user complaints of allegedly objectionable material. If, after review, the social media company does not remove the content, the user should have the right to an internal appeal decided by three disinterested parties — similar to the process required for health insurance claims denials.
With social injustice rightly on the minds of America and the world in these unparalleled times, we should use all our tools to achieve a more just society, including protest and boycott. Most importantly, we should seize this opportunity to address antitrust concerns — and reform Section 230 to regulate social media with processes and controls that ensure trust and accuracy.
Michael H. Lanza is Executive Vice President and General Counsel of Selective Insurance, with oversight for the company's legal department, as well as regulatory, ethics and compliance, and legislative and government affairs functions.