

Attorney General Bill Barr Calls for ‘Recalibrated’ Section 230 as Justice Department Hosts Tech Immunity Workshop


Photo of Attorney General Bill Barr in May 2019 by Shane McCoy used with permission

WASHINGTON, February 19, 2020 – Attorney General William Barr laid out the case for “recalibrating” Section 230 of the Communications Decency Act, responding to what he called the concentrated power over information now residing in the hands of Silicon Valley tech companies.

Because “the big tech platforms of today often monetize” their power through advertising, “their financial incentives in content distribution may not always align with what is best for the user,” Barr said in remarks kicking off a Wednesday workshop at the U.S. Justice Department.

Originally a non-controversial law meant to incentivize online free speech, Section 230 has come to be seen as amplifying the ills wrought by information technology. Populists on the right and progressives on the left are now calling for changes to it.

The Justice Department’s workshop may be an effort to put Section 230 protections for tech companies on the chopping block.

At the same time, Barr’s agency is leading a major antitrust inquiry into the tech sector, potentially up to and including efforts to break up Google or Facebook.

“While the department’s antitrust review is looking at these developments from a competition perspective, we must also recognize what this concentration means for Section 230 immunity,” Barr said.

Background about the origins of Section 230

Section 230 became law as part of the 1996 Telecommunications Act. In those early days of the internet, Section 230 arose against a backdrop of online service providers such as America Online, CompuServe, and Prodigy. CompuServe did not engage in any form of content moderation, whereas Prodigy positioned itself as a family-friendly alternative by enforcing content guidelines and screening offensive language.

It didn’t take long for both platforms to be sued for defamation. In the 1991 case Cubby v. CompuServe, a federal district court in New York ruled that CompuServe could not be held liable for third-party content of which it had no knowledge, similar to a newsstand or library.

But in 1995, the New York Supreme Court ruled in Stratton Oakmont v. Prodigy that the latter platform had taken on liability for all posts simply by attempting to moderate some, since that constituted editorial control.

The decision prompted pro-technology Reps. Ron Wyden, D-Ore., and Chris Cox, R-Calif., to introduce an amendment to the Communications Decency Act ensuring that providers of an interactive computer service would not be held liable for third-party content, thus allowing them to moderate with impunity.

See Broadband Breakfast’s four-part series on the CDA:

Section I: The Communications Decency Act is Born

Section II: How Section 230 Builds on and Supplements the First Amendment

Section III: What Does the Fairness Doctrine Have to Do With the Internet?

Section IV: As Hate Speech Proliferates Online, Critics Want to See and Control Social Media’s Algorithms

Barr blasts the ‘many’ problems with a ‘broad Section 230 immunity’

At the Justice Department on Wednesday, Barr made clear his case for changing the law. He said that “the Department of Justice is concerned about the expansive reach of Section 230.” He complained that Section 230 blunts the impact of civil tort lawsuits that should have a greater bite in complementing the criminal law enforcement efforts of the Justice Department.

In particular, he said, “the Anti-Terrorism Act provides civil redress for victims of terrorist attacks on top of the criminal terrorism laws, yet judicial construction of Section 230 has severely diminished the reach of this civil tool.”

Second, he said that “broad Section 230 civil immunity” can actually be used against the federal government. That was something, he said, that was not intended by the framers of Section 230.

Third, Barr said that Section 230 makes it harder to police “lawless spaces” online. “We are concerned that internet services, under the guise of Section 230, can not only block access to law enforcement — even when officials have secured a court-authorized warrant — but also prevent victims from civil recovery.”

“The concerns regarding Section 230 are many and not all the same,” Barr concluded. And yet he added: “We must also recognize the benefits that Section 230 and technology have brought to our society, and ensure that the proposed cure is not worse than the disease.”

First panel at the Justice Department workshop addressed free speech in light of Section 230

The first panel focused on the issue of liability for speech that takes place on tech platforms.

Section 230’s use of the word “publisher” makes it clear that this statute refers to defamation law, said Annie McAdams, a lead counsel in lawsuits against Backpage.com and Facebook over human trafficking.

The 26 words in Section 230(c)(1) read: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

But as is often the case with debates about Section 230, McAdams was met with pushback by her fellow panelists.

While legal experts usually cite the Stratton Oakmont v. Prodigy decision in discussing Section 230, Fordham University School of Law Professor Benjamin Zipursky said he saw state tort law as a crucial element to understanding the law.

Zipursky referred to the entirety of Section 230 subsection (c). Subsection (c)(1) is about “treatment of publisher or speaker.” But subsection (c)(2) concerns the broader issue of civil liability for actions taken by tech companies to restrict indecent material on their platforms.

Indeed, Section 230(c) in its entirety is subtitled “Protection for ‘Good Samaritan’ blocking and screening of offensive material.”

According to Zipursky, after Prodigy was sued for attempting to filter information to some degree, companies wanted to avoid filtering at all costs so that they wouldn’t be considered a “publisher.”

Zipursky compared companies’ wariness about filtering content with the legal obligations of those who provide emergency medical care.

Before statewide “Good Samaritan” laws were passed, people who performed CPR or other emergency medical care could be held liable for any resulting damages or injuries. Now, “good Samaritans” are protected for their efforts to help and save, just as “interactive computer services” are under Section 230.

Zipursky agreed that Section 230 contains defamation-related language, but said that McAdams’ interpretation isn’t a “realistic way” to view it.

WilmerHale Partner Patrick Carome said that Section 230’s liability protections were intended to be vast, and not just limited to defamation. That is because of the enormous volume of content these platforms have to manage, he said.

Carome defended Section 230, arguing that it fostered an environment in which small companies can also succeed. Allowing companies to self-moderate makes room for new entrants to keep developing, he said, and Section 230 does what good laws do: It “puts the focus on the actual wrongdoers.”

He also took vigorous exception to Attorney General Barr’s statement about Section 230 cutting into the ability of victims of terrorism to get compensation from platforms. “It’s just flat out wrong,” said Carome. “Those cases are for the most part being decided not on Section 230 grounds.”

Was Section 230 designed to limit liability for more than just publishing?

United States Naval Academy Professor Jeff Kosseff agreed that Section 230 cast a broader net. The authors of Section 230 did not intend for the law to be narrow and solely about defamation, he said. While he said that he anticipated political forces leading eventually to changes in Section 230, those changes need to be carefully constructed so as not to stifle competition.

But Carrie Goldberg, a victims’ rights attorney who specializes in revenge porn cases, agreed with McAdams. One of Goldberg’s clients attempted to sue Grindr after an abusive ex impersonated him and sent his geolocation to several people through the app.

Section 230 is being used as an excuse to not intervene, she said. Section 230 puts users in danger and denies them “access to justice,” she said.

In response to Carome’s remark that Goldberg’s client should have been aided by the criminal justice system, Goldberg said Section 230 needs to be reformed because “it’s gone too far.”

Carome pushed back, reminding panelists that Section 230 does not only pertain to big tech. It protects the thousands of sites that would not survive “10,000 bites of litigation,” said Carome.

Zipursky advocated for “crafting a middle path” compared to the current law, instead of moving ahead with a “kneejerk reaction.”

Section 230’s impact upon criminal conduct

The second panel of the day focused on whether Section 230’s liability protections had facilitated criminal activity.

University of Miami Professor Mary Anne Franks noted the irony of Section 230’s Good Samaritan clause: Rather than modeling helpful behavior, it amplifies harm and profits from it, said Franks. Indeed, she said that subsection (c)(1) actually disincentivizes tech platforms like Google and Facebook from acting as “good Samaritans.”

But Kate Klonick, a professor at St. John’s University, disagreed. Large tech players like Facebook have economic incentives to avoid bad press, and they know that advertisers do not want ads near or associated with harmful content, she said.

Moreover, tech develops at such rapid speeds it is difficult to foresee the consequences of rushed responses to the current law, said Klonick.

Computer and Communications Industry Association President Matt Schruers said that many of the larger companies have already taken the initiative under Section 230 in reporting harmful activity.

Schruers said that Section 230 does generate positive incentives because it allows companies to build platforms without fear of litigation. Removing subsection (c)(1) would result in a “heckler’s veto”: Important content on tech platforms would be deleted out of fear of liability.

When asked about the future possibility of artificial intelligence regulating bad content, Franks said the tech industry is always promising to fix tech issues with more tech. She called this an “illusion.”

As an example of big tech facilitating crime, Franks brought up Facebook Live. When Facebook Live was created, people were livestreaming crimes like rape and murder, said Franks. “This is the world that Section 230 built.”

Mark Zuckerberg did not kill anyone, countered Klonick, to applause from the audience. She said Franks was taking issue with humanity, and that Facebook was just the tool used to exhibit these actions.


Repealing Section 230 Would Be Harmful to the Internet As We Know It, Experts Agree

While some advocate for a tightening of language, other experts believe Section 230 should not be touched.


Rep. Ken Buck, R-Colo., speaking on the floor of the House

WASHINGTON, September 17, 2021—Rep. Ken Buck, R-Colo., advocated for legislators to “tighten up” the language of Section 230 while preserving the “spirit of the internet” and enhancing competition.

There is common ground in supporting efforts to minimize speech advocating imminent harm, said Buck, even though he noted that Republican and Democratic critics tend to approach changing Section 230 from vastly different directions.

“Nobody wants a terrorist organization recruiting on the internet or an organization that is calling for violent actions to have access to Facebook,” Buck said. He followed up that statement, however, by stating that the most effective way to combat “bad speech is with good speech” and not by censoring “what one person considers bad speech.”

Antitrust not necessarily the best means to improve competition policy

For companies that are not technically in violation of antitrust policies, improving competition through other means would have to be the answer, said Buck. He pointed to Parler as a social media platform that is an appropriate alternative to Twitter.

Though some Twitter users did flock to Parler, particularly during and around the 2020 election, the newer social media company has a reputation for allowing objectionable content that would otherwise be unable to thrive on social media.

Buck also set himself apart from some of his fellow Republicans—including Donald Trump—by clarifying that he does not want to repeal Section 230.

“I think that repealing Section 230 is a mistake,” he said. “If you repeal Section 230, there will be a slew of lawsuits.” Buck explained that without the protections afforded by Section 230, big companies would likely find a way to sufficiently address these lawsuits, and the only entities harmed would be the alternative platforms that were meant to serve as competition.

More content moderation needed

Daphne Keller of the Stanford Cyber Policy Center argued that it is in the best interest of social media platforms to enact various forms of content moderation, and address speech that may be legal but objectionable.

“If platforms just hosted everything that users wanted to say online, or even everything that’s legal to say—everything that the First Amendment permits—you would get this sort of cesspool or mosh pit of online speech that most people don’t actually want to see,” she said. “Users would run away and advertisers would run away and we wouldn’t have functioning platforms for civic discourse.”

Even companies like Parler and Gab—which pride themselves on being unyielding bastions of free speech—have begun to engage in content moderation.

“There’s not really a left-right divide on whether that’s a good idea, because nobody actually wants nothing but porn and bullying and pro-anorexia content and other dangerous or garbage content all the time on the internet,” she said.

She explained that this is a double-edged sword, because while consumers seem to value some level of moderation, companies moderating their platforms have a huge amount of influence over what their consumers see and say.

What problems do critics of Section 230 want addressed?

Internet Association President and CEO Dane Snowden stated that most of the disputes surrounding Section 230 boil down to a fundamental disagreement over which problems legislators are trying to solve.

Changing the language of Section 230 would impact not just the tech industry. “[Section 230] impacts ISPs, libraries, and universities,” he said. “Things like self-publishing, crowdsourcing, Wikipedia, how-to videos—all those things are impacted by any kind of significant neutering of Section 230.”

Section 230 was created to give users the ability and security to create content online without fear of legal reprisals, he said.

Another significant supporter of the status quo was Chamber of Progress CEO Adam Kovacevich.

“I don’t think Section 230 needs to be fixed. I think it needs [a better] publicist,” Kovacevich said, arguing that policymakers need to gain a better appreciation for Section 230. “If you took away 230, you’d give companies two bad options: either turn into Disneyland or turn into a wasteland.”

“Either turn into a very highly curated experience where only certain people have the ability to post content, or turn into a wasteland where essentially anything goes because a company fears legal liability,” Kovacevich said.



Judge Rules Exemption Exists in Section 230 for Twitter FOSTA Case

Latest lawsuit illustrates the increasing fragility of Section 230 legal protections.


Twitter CEO Jack Dorsey.

August 24, 2021—A California court has allowed a lawsuit against Twitter to proceed, brought by two victims of sex trafficking who allege that the social media company initially refused to remove content that exploited the underage plaintiffs, and that the content then went viral.

The anonymous plaintiffs allege that they were manipulated into making pornographic videos of themselves through another social media app, Snapchat, after which the videos were posted on Twitter. When the plaintiffs asked Twitter to take down the posts, it refused, and it was only after the Department of Homeland Security got involved that the social media company complied.

At issue in the case is whether Twitter had any obligation to remove the content immediately under Section 230 of the Communications Decency Act, which provides platforms legal liability protections for the content their users post.

Court’s finding

The court ruled Thursday that the case should proceed after finding that Twitter knew such content was on the site, had to have known it was sex trafficking, and refused to do anything about it immediately.

“The Court finds that these allegations are sufficient to allege an ongoing pattern of conduct amounting to a tacit agreement with the perpetrators in this case to allow them to post videos and photographs it knew or should have known were related to sex trafficking without blocking their accounts or the Videos,” the decision read.

“In sum, the Court finds that Plaintiffs have stated a claim for civil liability under the [Trafficking Victims Protection Reauthorization Act] on the basis of beneficiary liability and that the claim falls within the exemption to Section 230 immunity created by FOSTA.”

The Stop Enabling Sex Traffickers Act and the Allow States and Victims to Fight Online Sex Trafficking Act, passed together in 2018 as the package law SESTA-FOSTA, amended Section 230 to exclude the enforcement of federal and state sex trafficking laws from its intermediary protections.

The court dismissed other claims the plaintiffs made against the company, but found that the trafficking claim met the relatively low bar to move the case forward.

The arguments

The plaintiffs allege that Twitter violated the TVPRA because it knew about the videos, benefitted from them, and did nothing to address the problem before the videos went viral.

Twitter argued that FOSTA, as applied to the CDA, only narrowly applies to websites that are “knowingly assisting and profiting from reprehensible crimes;” the plaintiffs allegedly fail to show that the company “affirmatively participated” in such crimes; and the company cannot be held liable “simply because it did not take the videos down immediately.”

Experts asserted companies may hesitate to bring Section 230 defense in court

The case is yet another instance of U.S. courts poking holes in technology companies’ arguments that, per Section 230, they cannot be held liable for content on their platforms. The provision is currently the subject of hot debate in Washington over whether to reform it or abolish it completely.

A number of state judges have ruled against Amazon and its Section 230 defense, for example, in several cases in Texas and California. Experts on a panel in May said that if courts keep ruling against the defense, a deluge of lawsuits against companies may follow.

And last month, citing some of these cases, lawyers argued that big tech companies may begin to shy away from bringing the Section 230 defense to court, for fear of alerting lawmakers to courts’ changing views on the provision and igniting its reform.



Facebook, Google, Twitter Register to Lobby Congress on Section 230

Companies also want to discuss cybersecurity, net neutrality, taxes and privacy.


Facebook CEO Mark Zuckerberg

August 3, 2021 — The largest social media companies have registered to lobby Congress on Section 230, according to lobby records.

Facebook, Google, and Twitter filed new paperwork late last month to discuss the internet liability provision under the Communications Decency Act, which protects these companies from legal trouble for content their users post.

Facebook’s registration specifically mentions the Safe Tech Act, an amendment to the provision proposed earlier this year by Sens. Amy Klobuchar, D-Minn., Mark Warner, D-Va., and Mazie Hirono, D-Hawaii, which would largely keep the provision’s protections except for content the platforms are paid for.

A separate Facebook registration included discussion on the “repeal” of the provision.

Other issues included in the Menlo Park-based company’s registration are privacy, data security, online advertising, and general regulations on the social media industry.

Google also wants to discuss taxes and cybersecurity, as security issues take center stage following high-profile attacks and as international proposals for a new tax regime on tech companies emerge.

Notable additional subjects in Twitter’s registration are content moderation practices, data security, misinformation, and net neutrality. The Federal Communications Commission is being urged to bring back Obama-era net neutrality policies, under which networks cannot give certain content preferential treatment.

Section 230 has gripped Congress

Social media critics have been foaming at the mouth over possible retaliatory measures against the technology companies, which have taken increasingly strong measures against users who violate their policies.

Those discussions picked up steam when, at the beginning of the year, former President Donald Trump was banned from Twitter, and then from Facebook and other platforms, for allegedly stoking the Capitol Hill riot on January 6. (Trump has since filed a lawsuit as a private citizen against the social media giants for his removal.)

Since the Capitol riot, a number of proposals have been put forward to amend — in some cases completely repeal — the provision to address what some Republicans are calling outright censorship by social media companies. Florida even tried to take matters into its own hands by enacting a law penalizing social media companies that ban politicians. That law has since been put on hold by the courts.

The social media giants and their allies in the industry have pressed the importance of the provision, which they say has allowed once-fledgling companies like Facebook to become what they are today. Some representatives think reform of the law could lean more toward amendment than outright repeal. But lawyers have warned about a shift in attitude toward those liability protections, as more judges across the country hold big technology companies accountable for harm caused by their platforms.

