Gonzalez v. Google LLC

Gonzalez v. Google LLC, 598 U.S. 617 (2023), was a United States Supreme Court case addressing whether recommender systems are covered by the liability protections that Section 230 of the Communications Act of 1934 (added by section 509 of the Telecommunications Act of 1996) grants to Internet service providers (ISPs) for terrorism-related content posted by users and hosted on their servers. The case was granted certiorari alongside another case involving Section 230 and terrorism-related content, Twitter, Inc. v. Taamneh.

In May 2023, the Court ruled unanimously in Twitter that the claims against the social media companies could not proceed under U.S. antiterrorism law. In a per curiam decision, Gonzalez was remanded to the lower courts with instructions to consider the Court's holding in Twitter.

Background
In November 2015, a series of coordinated terrorist attacks occurred in Paris. At least 130 people were killed, and the Islamic State claimed responsibility.

Among those killed was the only American victim, 23-year-old Nohemi Gonzalez, a student on an exchange program with California State University, Long Beach. Her family sought legal remedies against Google, the parent company of YouTube. Their suit argued that, through a recommendation system that tailors content to user profiles, YouTube steered users toward recruitment videos for the Islamic State, making Google partially responsible for Nohemi's death. Google defended itself by relying on Section 230, passed as part of the Telecommunications Act of 1996, which provides immunity from liability for content published on an Internet service provider's platform by third-party users. A lower court ruled in favor of Google, and the Ninth Circuit Court of Appeals upheld the decision.

In their appeal to the Supreme Court, the family focused on YouTube's algorithm, which is tailored to deliver content believed to be of interest to each user, arguing that although the recommendations were generated automatically, they constituted a form of moderation that Section 230 does not fully cover. They wrote in their petition to the Supreme Court, "Whether Section 230 applies to these algorithm-generated recommendations is of enormous practical importance. Interactive computer services constantly direct such recommendations, in one form or another, at virtually every adult and child in the United States who uses social media."

Some leading figures in major U.S. political parties want the law changed, but for different reasons. Many Democrats point to concerns about the proliferation of content that is harmful to children, while many Republicans are worried about conservative viewpoints being blocked on certain sites.

Supreme Court
The Supreme Court granted certiorari in October 2022, along with the related case Twitter, Inc. v. Taamneh, which also dealt with Section 230 and terrorism-related content. They were the first cases the Court heard concerning Section 230, which since around 2015 had drawn increasing partisan criticism amid broader scrutiny of Big Tech. Justice Clarence Thomas had spoken of the need to review Section 230 in previous statements accompanying court orders, arguing that social media companies should be regulated like "common carriers", which would prohibit content-based discrimination.

Many Big Tech companies filed amicus curiae briefs supporting Google's recommender system, as did smaller sites such as Reddit and the Wikimedia Foundation, whose moderation systems partially rely on user moderation. While the briefs expressed general support for updating Section 230 to reflect modern concerns, they broadly stressed that Congress, rather than the Supreme Court, should make any changes. This position was also taken by Ron Wyden and Christopher Cox, the lawmakers behind Section 230; law professor Eric Goldman, who has written extensively about Section 230; online services such as Yelp and Craigslist; and free speech advocacy groups such as the ACLU and the Electronic Frontier Foundation.

Briefs in support of Gonzalez's position were filed by several Republican members of Congress, including Ted Cruz, Mike Johnson, and Josh Hawley. Some advocacy groups, such as the Anti-Defamation League, argued that Google and other Big Tech companies have used Section 230 to remain immune for their own conduct, while also defending robust Section 230 protection for moderation decisions. Groups supporting child protections on the Internet also filed briefs for Gonzalez.

Oral arguments in Gonzalez were held on February 21, 2023. Observers found Justices from both the liberal and conservative wings questioning the issues surrounding algorithms, noting that most Internet services rely on them. The Justices also questioned whether YouTube's algorithm was specifically tailored to promote terrorism-related content. They expressed doubt that content could be delineated further and raised concerns about a potential flood of lawsuits and economic disruption should Section 230 be narrowed. Justice Amy Coney Barrett suggested that the outcome of the related Twitter case might help resolve the case against Google.

The Court issued decisions in both Gonzalez and Twitter on May 18, 2023. In Twitter, the Court unanimously held that the families' claims against the social media companies could not proceed under the Antiterrorism Act, and it made no ruling related to Section 230. In the per curiam order issued for Gonzalez, the Court then vacated the Ninth Circuit's decision and remanded the case for reconsideration in light of the Twitter decision.