Publications

Technology and Intellectual Property / May 2023

Are Social Media Companies Liable For Aiding and Abetting ISIS in its Terror Attacks? US Supreme Court Says “No”

In two closely watched, high-profile cases decided on May 18, both concerning whether social media companies could be held liable for international terrorist attacks committed by ISIS, the United States Supreme Court held that they could not.

Twitter v. Taamneh

An operative of the Islamic State of Iraq and Syria (ISIS) carried out a terrorist attack on a nightclub in Istanbul in 2017, killing Nawras Alassaf and 38 others. Alassaf’s family brought a claim against Facebook, Twitter and Google (as the owner of YouTube) for aiding and abetting ISIS in committing an act of international terrorism. The plaintiffs alleged that the social media companies were liable because they allowed ISIS to use their platforms and “recommendation” algorithms (which match content, advertisements and users based on information about the user and the content being viewed) as tools for recruiting, fundraising and spreading propaganda. The plaintiffs further alleged that the platforms profited from the advertisements placed on ISIS’ tweets, posts and videos.

In a unanimous 9-0 decision, the Court held that the social media companies were not liable for aiding and abetting the terrorist act for the following reasons:

1. They did not give ISIS such “knowing and substantial” assistance that they could be considered to have participated in the nightclub attack.
2. The platforms’ connection with ISIS was the same as their connection with any other user; they did not give ISIS any special treatment or words of encouragement; and they neither participated in the attack nor sought for the attack to succeed.
3. Rather, the recommendation algorithms are merely part of the infrastructure through which all the content on their platforms is filtered.
4. Accordingly, at bottom, the allegations rest less on affirmative misconduct of the platforms and more on passive nonfeasance.
5. As a matter of policy, the Court held that if the social media platforms were held liable for this nightclub attack, the same logic would mean that these platforms would be liable for every ISIS terrorist act committed anywhere in the world. Moreover, it would mean that every communication platform would be liable simply because wrongdoers were using its services, and the platform failed to stop them.

Gonzalez v. Google

ISIS terrorists unleashed a series of coordinated attacks across Paris in 2015, killing 130 people, including Nohemi Gonzalez. Gonzalez’s family sued Google, alleging that it was liable for the terrorist attack because it permitted ISIS and its supporters to upload videos to YouTube, approved the videos for advertisements, and shared proceeds with ISIS through YouTube’s revenue-sharing program.

Section 230 of the Communications Decency Act provides that, in general, platforms are not liable for defamatory content uploaded by their users. (There are, however, exceptions in which platforms will not enjoy the protection of Section 230. Examples have involved online services found to have induced or encouraged the development of illegal content; to have failed to warn users about illegal activity they knew about; to have breached an agreement to remove defamatory information; or to have failed to act in good faith in filtering offensive material.) The question in this case was whether the broad protections of Section 230, which have come under criticism from both Democratic and Republican politicians for different reasons, would be narrowed by the Court.

In a brief unsigned opinion, the Court remanded the case to the Ninth Circuit Court of Appeals for reconsideration in light of its decision in Twitter v. Taamneh.

Analysis and Impact on Israeli Jurisprudence

While the Court in Gonzalez did not decide the extent of the protection that Section 230 affords platforms against defamation claims, its statements clearly indicate that it was not, at least at this point, inclined to further limit that protection. The Court stated, “we think it is sufficient to acknowledge that much (if not all) of plaintiffs’ complaint seems to fail under either our decision in Twitter or the Ninth Circuit’s unchallenged holdings [that YouTube was not liable for aiding and abetting ISIS simply because there was revenue-sharing on ads].” The Court declined to specifically address the Section 230 claim for “a complaint that appears to state little, if any, plausible claim for relief.”

These decisions are clearly helpful to social media platforms in fending off claims in American courts that they are liable for the wrongdoing of their users, whether that is aiding terrorist acts, other illegal activity, or defamation. The theme connecting these cases is the Court’s finding that it is not appropriate to hold sites liable for the acts of their users.

We will be following closely whether these cases have any impact on Israeli jurisprudence in these areas. While we would not expect them to have any direct impact on cases in Israel, which has a different structure for its defamation law and different protections for sites against defamatory content created by others, we think they could have an indirect impact on rulings in Israeli cases by supporting the view that it is not appropriate to hold sites, which largely serve as passive actors and apply uniform algorithms across their platforms, liable for the wrongful defamatory statements or illegal acts (including even terrorist bombings) of their users.