Section 230 and the Question of Internet Liability
In February, the Supreme Court heard oral arguments in Gonzalez v. Google LLC and Twitter, Inc. v. Taamneh, two cases that may redefine the scope of liability that tech companies face for user content published on their platforms. The cases mark the first time Section 230(c)(1) of the Communications Decency Act has come before the Supreme Court. Section 230 states that “no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider,” effectively granting websites immunity from legal liability for third-party content posted on their platforms. [1] The holdings have the potential to reshape the functions of web content providers, as the cases call into question the recommendation algorithms that have become commonplace.
In 2015, Nohemi Gonzalez, a U.S. citizen, was killed in a terrorist attack in Paris, France. ISIS claimed responsibility the following day through a published YouTube video and a written statement. Under the Antiterrorism Act, which allows U.S. citizens to recover damages for injuries from acts of international terrorism, Reynaldo Gonzalez, the victim’s father, filed a claim against Google, Twitter, and Facebook. While the companies did not play an active role in the specific attack that killed Nohemi Gonzalez, Gonzalez alleged that they aided and abetted the act of terrorism by providing platforms for ISIS “to recruit members, plan terrorist attacks, issue terrorist threats, instill fear, and intimidate civilian populations.” [2] Gonzalez specifically claimed that Google was liable as YouTube’s owner; according to Gonzalez, YouTube facilitated communication between ISIS and other users through algorithms that recommended ISIS videos to potential members, placed ads that generated revenue from ISIS videos, and failed to fully remove ISIS-related content from the platform despite having the authority to do so. Google moved to dismiss, claiming immunity under Section 230. The district court granted Google’s motion because ISIS, not Google, had produced the videos, and the Ninth Circuit affirmed, reasoning that Section 230 immunity holds as long as the recommendation algorithm applies the same neutral criteria to harmful and non-harmful content. [3] In an amicus brief filed on behalf of the respondent, the ACLU further asserted that depriving platforms of immunity for generic recommendation algorithms would incentivize them to censor “lawful content or to feature only speech of little interest or value to users,” rendering the protections of Section 230 meaningless. [4]
Gonzalez asserts that YouTube, as a subsidiary of Google, does not act as a “publisher or speaker” and is thus not protected under Section 230, because the claim hinges on YouTube’s recommendation of harmful content through URLs, not on its publication or dissemination of the underlying files. In addition, Gonzalez argues that YouTube’s video recommendations are not “information provided by another information content provider,” since YouTube generates its own URLs, which Gonzalez claims are information actively provided by YouTube itself. In opposition, Google claims that YouTube acts as a publisher in the everyday sense of the word, selecting and arranging content for users through neutral criteria, and argues that it should not be held liable for content it did not create. [5]
The case of Twitter v. Taamneh further examines the Antiterrorism Act and its amendment, the Justice Against Sponsors of Terrorism Act (JASTA), which states that “any person who aids and abets, by knowingly providing substantial assistance, or who conspires with the person who committed such an act of international terrorism” can be found secondarily liable. [6] Taamneh, a relative of Nawras Alassaf, who was killed in an ISIS terrorist attack in Istanbul, Turkey, claims that Twitter was aware of ISIS’s use of the platform to spread propaganda and thus knowingly provided substantial assistance to terrorism. The district court dismissed the claim for failure to state a claim on which relief could be granted under the Federal Rules of Civil Procedure, and the Ninth Circuit reversed and remanded the case.
Twitter claims that JASTA imposes secondary liability only when there is assistance to a specific act of terrorism, not to a group’s general campaign or course of conduct; Taamneh counters that, as the statute is phrased, Twitter is liable under JASTA for aiding the individuals who committed terrorist acts, even if Twitter was not actively involved in the specific act itself.
While the Supreme Court has yet to rule on either case, its holdings will likely reshape the internet and its social platforms as they function today, since tech companies may face expanded liability for user content.