Supreme Court Agrees To Hear Cases That Could Rein In Social Media Companies' Immunity

Former President Donald Trump has called for Section 230 to be revoked.
Matt Cardy via Getty Images

The Supreme Court on Monday announced that it would hear two cases this term that could significantly change the nature of content moderation on the internet.

The court has agreed to hear Gonzalez v. Google and Twitter v. Taamneh. Both cases concern whether tech companies could be held legally liable for what users post on their platforms, as well as for content that users see because of the platform’s algorithm.

Websites generally can’t be held liable in either instance because of Section 230 of the Communications Decency Act of 1996, which states: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

Nohemi Gonzalez was one of 129 people killed during coordinated attacks carried out by the self-described Islamic State in Paris in November 2015.

Gonzalez’s father, Reynaldo Gonzalez, argues in his lawsuit against Google that YouTube’s recommendation algorithm violated the Anti-Terrorism Act by promoting the terrorist group’s videos to users, thereby aiding its recruitment efforts.

In Twitter v. Taamneh, the family of Nawras Alassaf, who was killed in a 2017 nightclub attack carried out by the self-described Islamic State, alleges that social media companies provided material support for terrorism and didn’t do enough to curb the group’s presence on their platforms.

As Slate’s Mark Joseph Stern observed, there’s “cross-ideological consensus” among lower court judges that the time has come for the boundaries of Section 230 to be revisited.

Last year, Judge Marsha Lee Siegel Berzon of the Ninth Circuit Court of Appeals, a Bill Clinton appointee, urged her colleagues to reconsider legal precedent surrounding Section 230 “to the extent that it holds that section 230 extends to the use of machine-learning algorithms to recommend content and connections to users.”

In 2020, Supreme Court Justice Clarence Thomas signaled that he was open to hearing arguments over Section 230, writing, “in an appropriate case, we should consider whether the text of this increasingly important statute aligns with the current state of immunity enjoyed by Internet platforms.”

Section 230 has come under attack from both Democrats and Republicans, albeit for different reasons. Former President Donald Trump tweeted “REVOKE 230!” after Twitter started putting fact-checking labels on his missives. And as a candidate in 2020, President Joe Biden told The New York Times editorial board that Meta CEO Mark Zuckerberg “should be submitted to civil liability and his company to civil liability, just like you would be here at The New York Times.”

Others have cautioned that limiting Section 230 could chill freedom of expression on the web. Its supporters argue it provides legal protections to small bloggers as well as websites like Wikipedia and Reddit, which might otherwise be held liable for the content of their comment sections or crowd-sourced material.

The Electronic Frontier Foundation, a nonprofit dedicated to civil liberties on the web, has referred to Section 230 as “one of the most valuable tools for protecting freedom of expression and innovation on the Internet” and says it “creates a broad protection that has allowed innovation and free speech online to flourish.”

Right-wingers have cited Section 230 while arguing that social media companies discriminate against conservative viewpoints (even though on Facebook, for example, conservative media dominates) and have said that these companies should therefore be subject to the same legal constraints as traditional publishers.

Ironically, as some observers have noted, the restriction or elimination of Section 230 would likely lead to more limits on internet speech, not fewer.

“It could create a prescreening of every piece of material every person posts and lead to an exceptional amount of moderation and prevention,” Aaron Mackey, staff attorney at EFF, told NPR in 2020. “What every platform would be concerned about is: ‘Do I risk anything to have this content posted to my site?’”
