Section 230

Supreme Court Justices Express Caution About Entering Section 230 Debate

During oral arguments for Gonzalez v. Google, justices repeatedly voiced concerns about potential unintended consequences.

Photo of Justice Elena Kagan from April 2008 by Doc Searls used with permission

WASHINGTON, February 22, 2023 — Supreme Court justices expressed broad skepticism about removing liability protections for websites that automatically recommend user-generated content, marking a cautious start to a pair of long-awaited cases involving platform liability for terrorist content.

Gonzalez v. Google, argued on Tuesday, hinges on whether YouTube’s use of recommendation algorithms puts it outside the scope of Section 230, which generally provides platforms with immunity for third-party content.

A separate case involving terrorism and social media, Twitter v. Taamneh, was argued on Wednesday. Although the basic circumstances of the cases are similar — both brought against tech companies by the families of terrorist attack victims — the latter focuses on what constitutes “aiding and abetting” under the Anti-Terrorism Act.

Section 230 arguments central to Gonzalez

Section 230 protections are at the heart of Gonzalez. The provision, one of the few surviving components of the 1996 Communications Decency Act, is credited by many experts with facilitating the internet’s development and enabling its daily workings.

But the plaintiffs in Gonzalez argued that online platforms such as YouTube should be held accountable for actively promoting harmful content.

As oral arguments commenced, Justice Elena Kagan repeatedly raised concerns that weakening Section 230 protections could have a wider impact than intended. “Every time anybody looks at anything on the internet, there is an algorithm involved… everything involves ways of organizing and prioritizing material,” she said.

These organization methods are essential for making platforms user-friendly, argued Lisa Blatt, the attorney representing Google. “There are a billion hours of videos watched each day on YouTube, and 500 hours uploaded every minute,” she said.

Justice Brett Kavanaugh pointed to the inclusion of platforms that “pick, choose, analyze or digest content” in the statutory definition of covered entities. Claiming that YouTube forfeited Section 230 protections by using recommendation algorithms, Kavanaugh said, “would mean that the very thing that makes the website an interactive computer service also means that it loses the protection of 230.”

Eric Schnapper, the attorney representing the plaintiffs, argued that the provision in question was only applicable to software providers and YouTube did not qualify.

Justices concerned about unintended impacts of weakening Section 230

Despite Schnapper’s interpretation of the statute’s intent, Kavanaugh maintained his concerns about altering it. “It seems that you continually want to focus on the precise issue that was going on in 1996, but… to pull back now from the interpretation that’s been in place would create a lot of economic dislocation, would really crash the digital economy,” he said.

Weakening Section 230 could also open the door to “a world of lawsuits,” Kagan predicted. “Really, anytime you have content, you also have these presentational and prioritization choices that can be subject to suit,” she said, pointing to search engines and social media platforms as other services that could be impacted.

Deputy Solicitor General Malcolm Stewart, who primarily sided with the plaintiff, argued that even if such lawsuits were attempted, “they would not be suits that have much likelihood of prevailing.”

Justice Amy Coney Barrett noted that the text of Section 230 explicitly covers users of online platforms in addition to the platforms themselves. If the statute were read as the plaintiffs urged, Barrett asked, could individual users be held liable for any content that they liked, reposted or otherwise engaged with?

“That’s content you’ve created,” Schnapper replied.

‘Confusion’ about the case and the court’s proper role

Throughout the hearing, several justices expressed confusion at the complexities of the case.

During an extended discussion of YouTube “thumbnails” — which Schnapper described as a “joint creation” because of the platform-provided URLs accompanying user-generated media — Justice Samuel Alito told Schnapper that he was “completely confused by whatever argument you’re making at the present time.”

At another point, Justice Ketanji Brown Jackson said she was “thoroughly confused” by the way that two different questions — whether Google could claim immunity under Section 230 and whether the company aided terrorism — were seemingly being conflated.

Just minutes later, after Stewart presented his argument on behalf of the Justice Department, Justice Clarence Thomas began his line of questioning with, “Well, I’m still confused.”

In addition to frequent references to confusion, multiple justices suggested that some aspects of the case might be better left to Congress.

“I don’t have to accept all of [Google’s] ‘the sky is falling’ stuff to accept… there is a lot of uncertainty about going the way you would have us go, in part just because of the difficulty of drawing lines in this area,” Kagan said. “Isn’t that something for Congress to do, not the court?”

Kavanaugh echoed those concerns, saying that the case would require “a very precise predictive judgment” and expressing uncertainty about whether the court could adequately consider the implications.

But Chief Justice John Roberts seemed equally hesitant to hand off the decision. “The amici suggest that if we wait for Congress to make that choice, the internet will be sunk,” he said.

Reporter Em McPhie studied communication design and writing at Washington University in St. Louis, where she was a managing editor for the student newspaper. In addition to agency and freelance marketing experience, she has reported extensively on Section 230, big tech, and rural broadband access. She is a founding board member of Code Open Sesame, an organization that teaches computer programming skills to underprivileged children.

Section 230 Shuts Down Conversation on First Amendment, Panel Hears

The law prevents discussion of how the First Amendment should be applied in a new age of technology, says expert.

Photo of Ron Yokubaitis of Texas.net, Ashley Johnson of Information Technology and Innovation Foundation, Emma Llanso of Center for Democracy and Technology, Matthew Bergman of Social Media Victims Law Center, and Chris Marchese of Netchoice (left to right)

WASHINGTON, March 9, 2023 – Section 230 as it is written shuts down the conversation about the First Amendment, experts claimed in a debate at Broadband Breakfast’s Big Tech & Speech Summit Thursday.

Matthew Bergman, founder of the Social Media Victims Law Center, suggested that Section 230 forecloses discussion of how to appropriately weigh the costs and benefits of granting big tech companies immunity from litigation over moderation decisions on their platforms.

We need to talk about what level of First Amendment protection is appropriate in a new world of technology, said Bergman. That discussion happens primarily through an open litigation process, he said, which is not currently available to those harmed by these products.

All companies must exercise reasonable care, Bergman argued. Opening the door to litigation doesn’t mean that all claims are viable, only that the process should play out in the courts of law, he said.

Eliminating Section 230 could lead online services to overcorrect in moderating speech, which could suffocate social reform movements organized on those platforms, argued Ashley Johnson of the Information Technology and Innovation Foundation, a research institution.

Furthermore, the burden of litigation would fall disproportionately on smaller companies with fewer resources to defend themselves, she continued.

Bergman responded: “If a social media platform is facing a lot of lawsuits because there are a lot of kids who have been hurt through the negligent design of that platform, why is that a bad thing?” People who are injured have the right by law to seek redress against the entity that caused that injury, he said.

Emma Llanso of the Center for Democracy and Technology suggested that if Section 230 were reformed or abolished, platforms would fundamentally change the way they operate to avoid the threat of litigation, which could threaten freedom of speech for their users.

It is necessary for the protection of the First Amendment that the internet consist of many platforms with different content moderation policies, so that all people have a voice, she said.

To this, Bergman argued that there is a distinction between ensuring speech is not censored and algorithms that push content users did not ask to see – even content they did not know existed.

It is a question of balancing the faulty design of a product against the protection of speech, and courts are where this balancing act should take place, said Bergman.

This comes days after legal experts urged Congress to amend the statute to specify that it applies only to free speech, rather than to the negligent design of product features that promote harmful speech. That discussion followed Supreme Court oral arguments over whether Google can claim immunity for recommending terrorist videos on its video platform YouTube.

Congress Should Amend Section 230, Senate Subcommittee Hears

Experts urged Congress to amend the tech protection law to limit immunity for the promotion of harmful information.

Photo of Hany Farid, professor at the University of California, Berkeley

WASHINGTON, March 8, 2023 – Law professionals at a Senate Subcommittee on Privacy, Technology and the Law hearing on Wednesday urged Congress to amend Section 230 to specify that it applies only to free speech, rather than the promotion of misinformation.

Section 230 protects platforms from being treated as the publisher or speaker of information originating from a third party, shielding them from liability for those posts. Mary Anne Franks, professor of law at the University of Miami School of Law, argued that there is a difference between protecting free speech and protecting the harmful dissemination of information.

Hany Farid, professor at the University of California, Berkeley, argued that there should be a distinction between a negligently designed product feature and a core component of the platform’s business. For example, YouTube’s video recommendation system is a product feature rather than an essential function, as it is designed solely to maximize advertising revenue by keeping users on the platform, he said.

YouTube claims that its recommendation algorithm is unable to distinguish between two different videos. This, argued Farid, should be considered a negligently designed feature, as YouTube knew or reasonably should have known that it could lead to harm.

Section 230, said Farid, was written to immunize tech companies from defamation litigation, not to immunize them from any wrongdoing, including the negligent design of their features.

Returning the statute to its original intention would, “at a minimum,” said Franks, “require amending the statute to make clear that the law’s protections only apply to speech and to make clear that platforms that knowingly promote harmful content are ineligible for immunity.”

At a State of the Net conference earlier this month, Franks emphasized the “Good Samaritan” aspect of the law, claiming that it is supposed to “provide incentives” for platforms “to actually do the right thing.” Instead, the law does not incentivize platforms to moderate their content, she argued.

Jennifer Bennett of national litigation boutique Gupta Wessler suggested that Congress uphold what is known as the Henderson framework, which would hold a company liable if it materially contributes to what makes content unlawful, including the recommendation and dissemination of the content.

Unfortunately, lamented Eric Schnapper, professor of law at the University of Washington School of Law, Section 230 has barred Americans from obtaining redress when they have been harmed by big tech. “Absolute immunity breeds absolute irresponsibility,” he said.

Senator Richard Blumenthal, D-Connecticut, warned tech companies at the outset of the hearing that “reform is coming.”

This comes weeks after the Supreme Court heard oral arguments over whether Google can claim immunity for recommending terrorist videos on its video platform YouTube. The case has divided industry over whether Section 230 protects algorithmic recommendations; during arguments, Justice Brett Kavanaugh expressed skepticism that YouTube forfeits its protection by using recommendation algorithms.

Content Moderation, Section 230 and the Future of Online Speech

Our comprehensive report examines the extremely timely issue of content moderation and Section 230 from multiple angles.

In the 27 years since the so-called “26 words that created the internet” became law, rapid technological developments and sharp partisan divides have fueled increasingly complex content moderation dilemmas.

Earlier this year, the Supreme Court tackled Section 230 for the first time through a pair of cases regarding platform liability for hosting and promoting terrorist content. In addition to the court’s ongoing deliberations, Section 230—which protects online intermediaries from liability for third-party content—has recently come under attack from Congress, the White House and multiple state legislatures.
