Earlier this year, the U.S. Supreme Court handed two victories to internet and social media companies, allowing them to remain under the protection of Section 230 of the Communications Decency Act, which shields internet companies from lawsuits over content posted by users of their platforms.
Section 230 was designed to ensure that internet companies can’t be sued over published content—no matter how harmful—so long as it’s posted by third parties to their platforms.
In the U.S. Supreme Court case involving YouTube, which is owned by Google, the justices declined to amend Section 230. A number of attorneys, including Sternberg, as well as many in political office, including President Joe Biden and former President Donald Trump, have called for reform of the law so that internet companies can be held liable for at least some of the content on their platforms.
In a separate case, the justices also shielded Twitter from litigation that sought to apply the Anti-Terrorism Act—a law that allows Americans to recover damages related to “an act of international terrorism.”
In both the Twitter and YouTube/Google cases, family members of people killed by Islamist gunmen overseas sued in hopes of holding the internet companies liable, citing the presence of militant groups on the platforms and the companies' recommendation of those groups' content to users.
In a 9-0 decision, however, the justices reversed a lower court’s ruling on the lawsuit against Twitter brought by American relatives of Nawras Alassaf, a Jordanian man killed in a 2017 attack on an Istanbul nightclub claimed by the Islamic State militant group. Alassaf’s family accused Twitter of aiding and abetting the Islamic State in the massacre, in which Alassaf and 38 others were killed. The family argued that Twitter failed to police the militant group’s accounts and posts on its platform and, in doing so, violated the Anti-Terrorism Act.
The justices also returned the Google/YouTube case to a lower court. In that lawsuit, the family of Nohemi Gonzalez, a California college student killed in a 2015 Islamic State attack in Paris, alleged that YouTube provided illegal assistance to the Islamic State by recommending the group’s content to users.
WHAT THE SUPREME COURT’S RULING MEANS FOR INTERNET DEFAMATION CASES
Though less extreme than a violation of the Anti-Terrorism Act, online defamation of character is common these days, and some may be wondering how the Supreme Court’s recent rulings will affect it. If someone makes defamatory claims online about an individual or even a business, can platforms such as Twitter or Facebook be held liable?
The answer, unfortunately, is no—not in most cases.
However, that doesn’t mean that individuals or businesses are completely defenseless against online defamation. You may be able to submit a complaint to the platform in question, and it’s possible the platform will remove the defamatory statements. This typically involves an arbitration process in which you must first prove your case, however, and it may not succeed in every situation.
If the platform doesn’t prove helpful in removing the defamatory statements, another option that often proves more successful is filing a lawsuit against the author of the statements. Though internet companies themselves cannot be held liable for defamatory content, the author of such statements can. If you are the victim of online defamation, seek the help of an attorney who specializes in this area of law.