The Spokesman-Review Newspaper
Spokane, Washington  Est. May 19, 1883

Supreme Court rules for Google, Twitter on terror-related content

The Supreme Court building in Washington, D.C.  (Jonathan Newton/Washington Post)
By Robert Barnes and Cat Zakrzewski Washington Post

The Supreme Court ruled for Google and Twitter in a pair of closely watched liability cases Thursday, saying families of terrorism victims had not shown the companies helped foster attacks on their loved ones.

“Plaintiffs’ allegations are insufficient to establish that these defendants aided and abetted ISIS in carrying out the relevant attack,” Justice Clarence Thomas wrote in a unanimous decision in the Twitter case. The court adopted similar reasoning in the claim against Google.

The court’s narrowly focused rulings sidestepped requests to limit a law that protects social media platforms from lawsuits over content posted by their users, even if the platform’s algorithms promote videos that laud terrorist groups.

“Countless companies, scholars, content creators and civil society organizations who joined with us in this case will be reassured by this result,” Google general counsel Halimah DeLaine Prado said in a statement. “We’ll continue our work to safeguard free expression online, combat harmful content, and support businesses and creators who benefit from the internet.”

The court’s decision to sidestep Section 230 in the decisions is a victory for Google and other social media companies, which argued that any change to the provision could upend the internet, leaving companies exposed to lawsuits over their efforts to police offensive posts, photos and videos on their services.

It had become clear at oral arguments that the justices were reluctant to make significant changes to the law, which has drawn criticism from leaders in both political parties. “We’re a court,” Justice Elena Kagan said at the time, adding that she and her colleagues “are not like the nine greatest experts on the internet.”

In the Twitter case, American relatives of Nawras Alassaf said the company failed to properly police its platform for Islamic State-related accounts in advance of a Jan. 1, 2017, attack at the Reina nightclub in Turkey that killed Alassaf and 38 others. In the Google case, the family of an exchange student killed in an Islamic State attack in Paris said Google’s YouTube should be liable for promoting content from the group.

The relatives in both cases based their lawsuits on the Anti-Terrorism Act, which imposes civil liability for assisting a terrorist attack. At issue was whether the company provided substantial assistance to the terrorist group.

But Thomas, writing in the Twitter case, said the link was too attenuated.

“As alleged by plaintiffs, defendants designed virtual platforms and knowingly failed to do ‘enough’ to remove ISIS-affiliated users and ISIS related content – out of hundreds of millions of users worldwide and an immense ocean of content – from their platforms,” he wrote. “Yet, plaintiffs have failed to allege that defendants intentionally provided any substantial aid to the Reina attack or otherwise consciously participated in the Reina attack – much less that defendants so pervasively and systemically assisted ISIS as to render them liable for every ISIS attack.”

The Google case specifically raised the issue of Section 230, a decades-old legal provision that courts have found shields social media giants from liability over the posts, photos and videos that people share on their services.

But the short, unsigned decision said the justices “decline to address the application of Section 230 to a complaint that appears to state little, if any, plausible claim for relief.”

Section 230 was enacted in 1996, years before YouTube, Facebook and other social networks existed. The law has proved to be a potent legal shield for the companies, who regularly use it to seek the dismissal of lawsuits over their decisions to host or remove content from their platforms.

The provision has become a lightning rod in the politically polarized debate over what responsibility social media companies have to moderate harmful or offensive posts on their sites. Both President Biden and former president Donald Trump have criticized Section 230, at times calling for it to be revoked.

But despite bipartisan concerns over the law, and a flurry of congressional hearings, there’s been little consensus among lawmakers about how to change it.

Tech industry-funded groups celebrated the court’s decision on Thursday. Chamber of Progress, which receives funding from Meta, Google and other companies and filed a brief supporting Google in the case, called the ruling an “unambiguous victory for online speech and content moderation.”

“While the Court might once have had an appetite for reinterpreting decades of internet law, it was clear from oral arguments that changing Section 230’s interpretation would create more issues than it would solve,” said Jess Miers, a lawyer for the group.

The cases are Twitter v. Taamneh and Gonzalez v. Google.