As the US Supreme Court weighs YouTube’s algorithms, a “litigation minefield” looms

By Andrew Chung

WASHINGTON (Reuters) – In 2021, a California state court dismissed a feminist blogger’s lawsuit accusing Twitter Inc of unlawfully banning her over posts criticizing transgender people that the company deemed “hateful conduct.” In 2022, a federal court in California dismissed a lawsuit brought by LGBT plaintiffs accusing YouTube, part of Alphabet Inc, of restricting content posted by gay and transgender people.

These lawsuits were among the many defeated by a powerful form of immunity for internet companies enshrined in US law. Section 230 of the Communications Decency Act of 1996 shields platforms from legal liability for content posted online by their users.

The nine justices will consider the scope of Section 230 for the first time in a major case set for argument before the US Supreme Court on Tuesday. A ruling that weakens the provision could expose internet companies to litigation from all directions, legal experts said.

“There will be more lawsuits than there are atoms in the universe,” said law professor Eric Goldman of Santa Clara University School of Law’s High Tech Law Institute.

The justices will hear arguments in an appeal by the family of Nohemi Gonzalez, a 23-year-old California woman shot dead during a 2015 rampage by Islamist militants in Paris, of a lower court’s decision to dismiss, on Section 230 grounds, their lawsuit seeking monetary damages from YouTube owner Google LLC. Google and YouTube are part of Alphabet.

The family alleged that YouTube, through its computer algorithms, unlawfully recommended videos made by the militant group Islamic State, which claimed responsibility for the attacks, to certain users.

A ruling against the company could create a “litigation minefield,” Google told the justices in a brief. Such a decision could change how the internet works, making it less useful, undermining free speech and hurting the economy, the company and its supporters say.

It could threaten services as diverse as search engines, job listings, product reviews, and displaying relevant news, songs or entertainment, they added.

Section 230 protects “interactive computer services” by ensuring they cannot be treated as the “publisher or speaker” of information provided by users. Legal experts note that companies could still raise other legal defenses if Section 230’s protections are curtailed.

Calls have come from across the ideological and political spectrum – including from Democratic President Joe Biden and his Republican predecessor Donald Trump – for Section 230 to be reconsidered so that companies can be held accountable. Biden’s administration asked the justices to reinstate the Gonzalez family’s lawsuit.


Civil rights, gun control and other groups have told the justices that platforms amplify extremism and hate speech. Republican lawmakers have said platforms stifle conservative viewpoints. A coalition of 26 states said social media companies are “no longer just posting” user content, they are “actively exploiting” it.

“It’s a huge get-out-of-jail-free card,” Michigan State University law professor Adam Candeub said of Section 230.

Complaints against companies vary. Some target how platforms monetize content, serve advertising, or moderate content by removing or not removing certain material.

Legal claims are often based on breach of contract, fraudulent business practices or violations of state anti-discrimination laws, including those barring discrimination based on political views.

“There could be a situation where two sides of a very controversial issue could sue a platform,” said Scott Wilkens, an attorney at Columbia University’s Knight First Amendment Institute.

Candeub represented Meghan Murphy, the feminist blogger and writer who sued after Twitter banned her over posts criticizing transgender women. A California appeals court held that Section 230 barred the lawsuit, which sought to hold Twitter liable for its handling of content created by Murphy.

In a separate lawsuit, transgender YouTube channel creator Chase Ross and other plaintiffs accused the video platform of unlawfully restricting their content based on their identities while anti-LGBT slurs went unchecked. A judge dismissed their claims, citing Section 230.


Gonzalez, who had been studying in Paris, died when militants fired on a crowd at a bistro during the attacks, which killed 130 people.

In the 2016 lawsuit, her mother Beatriz Gonzalez, stepfather Jose Hernandez and other relatives accused YouTube of providing “material support” to Islamic State by recommending the group’s videos to certain users based on algorithmic predictions about their interests. The recommendations helped spread Islamic State’s message and recruit jihadist fighters, the lawsuit said.

The lawsuit was filed under the US Anti-Terrorism Act, which allows Americans to seek damages related to “an act of international terrorism.” The San Francisco-based 9th US Circuit Court of Appeals dismissed it in 2021.

The company has garnered support from various tech companies, academics, lawmakers, libertarians and rights groups who fear that exposing platforms to liability would force them to take down content, even content that is merely controversial, which would hurt free expression.

The company has defended its practices. Without algorithmic sorting, it said, “YouTube would play every video ever posted in an infinite sequence – the worst TV channel in the world.”

(Reporting by Andrew Chung; Editing by Will Dunham)
