Supreme Court to hear challenge to Big Tech’s Section 230 liability protections
The Supreme Court on Monday agreed to hear two cases this term on whether social media companies can be held financially responsible for hosting terrorist content.
The family of Nohemi Gonzalez, a 23-year-old U.S. citizen killed during a 2015 series of Islamic State terror attacks in Paris, sued YouTube parent company Google, arguing the video-sharing site not only provided a platform for videos containing terrorist content, but also recommended those videos to users.
The family alleges that YouTube algorithms allowed “hundreds of radicalizing videos inciting violence and recruiting potential supporters” to be targeted to users of the platform.
Section 230(c)(1) of the Communications Decency Act says companies like YouTube, Google and Twitter are generally shielded from liability for information uploaded by their users, but the case, Gonzalez v. Google, asks whether it should apply when tech companies make “targeted recommendations.”
A judge dismissed the case, and the family appealed to the Supreme Court.
The second case the Supreme Court agreed to hear, Twitter v. Taamneh, involves the 2017 death of Jordanian citizen Nawras Alassaf during an ISIS-affiliated attack in Istanbul.
Alassaf’s family sued social media giants Twitter, Google and Facebook, arguing that the companies did not take enough action to control terrorist content on their sites.
A lower court allowed the case to move forward, but Twitter argued that the decision improperly expanded the scope of the Anti-Terrorism Act and warranted review by the Supreme Court.
Both cases could have significant implications for online speech and the role of tech companies in controlling what users share through their platforms.