Supreme Court sides with social media giants on liability laws
In two decisions, justices unanimously decline to allow lawsuits that claimed the platforms contributed to terror attacks
The Supreme Court on Thursday left untouched a sweeping immunity law for internet companies known as Section 230 and sided with social media companies in cases that grappled with liability for content on their platforms.
Legal experts had said the two cases, Gonzalez v. Google and Twitter v. Taamneh, could result in the justices effectively rewriting one of the central laws underpinning the modern internet before Congress does.
But the Supreme Court took a more restrained approach, and the American Civil Liberties Union and other groups that weighed in on the cases said the decisions avoided curtailing free speech online.
“The Court will eventually have to answer some important questions that it avoided in today’s opinions,” Anna Diakun, staff attorney at the Knight First Amendment Institute at Columbia University, said in a news release. “Questions about the scope of platforms’ immunity under Section 230 are consequential and will certainly come up soon in other cases.”
The cases began with family members of terrorist attack victims who sought to hold the tech companies responsible for content on their platforms.
The justices, in an unsigned opinion in the Google case, declined to weigh in on Section 230, which in general prevents providers from being liable for information originating from a third party — a provision that some members of Congress are seeking action on.
Family members in that case had argued that YouTube's algorithm-generated recommendations helped spur the growth of the terrorist group ISIS, contributing to the deadly 2015 Paris attacks in which Nohemi Gonzalez was killed.
The court found that the lawsuit against the social media giant appears to state “little, if any, plausible claim for relief.”
And in a separate opinion in the Twitter case, the nation’s highest court ruled in favor of the social media company in a case over whether it could be held liable because a terrorist group used the website to raise funds and recruit members.
The family of Nawras Alassaf, a victim of a 2017 nightclub attack in Istanbul, argued that Twitter should face litigation under the Antiterrorism Act because it knew the Islamic State terrorist group used its platform and did not do enough to stop it.
Justice Clarence Thomas, writing the unanimous opinion, said the plaintiffs’ allegations were not enough to establish that the defendants “aided and abetted ISIS in carrying out the relevant attack.”
Thomas acknowledged that bad actors such as ISIS can use platforms for terrible purposes, but wrote that the same could be said of cell phones, email or the internet in general.
“Yet, we generally do not think that internet or cell service providers incur culpability merely for providing their services to the public writ large,” Thomas wrote. “Nor do we think that such providers would normally be described as aiding and abetting, for example, illegal drug deals brokered over cell phones — even if the provider’s conference call or video call features made the sale easier.”
And in this case, Thomas wrote, there is no allegation that the platforms “do more than transmit information by billions of people, most of whom use the platforms for interactions that once took place via mail, on the phone or in public areas.”
“The fact that some bad actors took advantage of these platforms is insufficient to state a claim that defendants knowingly gave substantial assistance and thereby aided and abetted those wrongdoers’ acts,” Thomas wrote.
A contrary holding “would effectively hold any sort of communication provider liable for any sort of wrongdoing merely for knowing that the wrongdoers were using its services and failing to stop them,” Thomas wrote.
Patrick Toomey, deputy director of the ACLU National Security Project, said in a news release that the rulings should be commended.
“Twitter and other apps are home to an immense amount of protected speech,” Toomey said. “And it would be devastating if those platforms resorted to censorship to avoid a deluge of lawsuits over their users’ posts.”