Senators Discuss Section 230 Shortcomings and Potential Reforms
Screenshot of Sen. John Thune from the webcast

July 28, 2020 — Senators on Tuesday remained broadly divided on the extent and direction that changes to Section 230 should take.

The tenor of the discussion at a Senate Commerce Communications Subcommittee hearing suggested that the law was overdue for an overhaul, as senator after senator criticized what the internet had become.

But proposals for concrete change were fewer. Subcommittee Chairman John Thune, R-S.D., and Ranking Member Brian Schatz, D-Hawaii, for example, introduced the Platform Accountability and Consumer Transparency (PACT) Act, which calls for procedural transparency.

Some on the right, including Sen. Ted Cruz, R-Texas, and full committee Chairman Roger Wicker, R-Miss., offered both broad and narrow critiques of Section 230. On the left, Sen. Richard Blumenthal, D-Conn., said the PACT Act didn’t go far enough.

And still others, including Sens. Amy Klobuchar, D-Minn., and Jacky Rosen, D-Nev., weighed in on concerns about the intersection of artificial intelligence and the law.

Screenshot of Sen. Amy Klobuchar participating in the hearing remotely

A voice of caution against changes to Section 230

Witnesses warned against making hasty changes to the statute, with former Rep. Christopher Cox, a co-author of Section 230, pointing out the foundational role it had played in the development of the digital world since its inclusion as part of the 1996 Telecom Act.

“It’s important to remember just how much human activity is encompassed within this vast category we so casually refer to as the internet,” Cox said. “To the extent that any new legislation imposes too much compliance burden or too much liability exposure that’s connected to a website’s hosting of user created content, the risk is that too many websites will be forced to respond by getting rid of user generated content altogether.”

Also sounding a voice of caution was Jeff Kosseff, assistant professor of cyber science at the U.S. Naval Academy, who said that it was important to gather more facts before adjusting the law.

Screenshot of Jeff Kosseff, assistant professor at the U.S. Naval Academy, participating in the hearing remotely

“I don’t think we’re at the point of being able to reform, because we have so many competing viewpoints about what platforms should be doing on top of what we could require them to do because of the First Amendment, and other requirements,” he said.

Cox agreed, adding that another immediate challenge was to figure out what was actually doable. Reforming Section 230 seemed like a more daunting task than initially writing it had been, he said.

PACT Act would aim to increase platform accountability

The varied approaches that tech platforms take to objectionable content have “led to a limited ability for consumers to address and correct harms that occur online,” Thune said. “And as Americans conduct more and more of their activities online, the net outcome is an increasingly less protected and more vulnerable consumer.”

Thune and Schatz introduced the PACT Act in June. Thune said the bill would increase transparency without damaging the economic, innovative and entrepreneurial benefits stimulated by Section 230.

Screenshot of Sen. Brian Schatz participating in the hearing remotely

It would require platforms to post their content moderation procedures, submit quarterly reports to the Federal Trade Commission explaining content moderation decisions, establish a prompt complaint-and-response system and implement a toll-free customer service line.

“Section 230 proponents say that Congress can’t possibly change this law without disrupting all of the great innovation that it has enabled, and I just disagree with that,” Schatz said. “The legislative process is about making sure that our laws are in the public interest.”

Blumenthal agreed with Thune and Schatz about the importance of increasing platform accountability.

“If there’s a message to the industry here, it is [that] the need for reform is now,” he said. “There’s a broad consensus that Section 230, as it presently exists, no longer affords sufficient protection to the public, to consumers, to victims and survivors of abuse.”

However, Blumenthal warned that the PACT Act did not go far enough. He emphasized the traumatic and lengthy process individuals currently face to get abusive imagery, such as child pornography, removed from online platforms, a process that involves obtaining a court order and locating every instance of the content.

Screenshot of Sen. Richard Blumenthal from the webcast

“I’m very concerned about the burden that’s placed on the victims and survivors,” he said. “The PACT Act does not provide any incentive for Facebook to police its own platform.”

Hate speech and algorithmic discrimination

“Most powerful online intermediaries today are anything but publishers and distributors of user generated content,” said Fordham Law Professor Olivier Sylvain. “They harvest, sort and repurpose user posts and personal data to attract and hold consumer attention, and more importantly, to market these valuable data to advertisers…The result is too often lived harm.”

Sylvain pointed to Facebook’s practice of collecting data on users to categorize them across hundreds of dimensions using automated processes.

“Under civil rights law, Congress forbids discrimination in ads on the basis of race, ethnicity, age and gender in the markets for housing, education and consumer credit,” he said. “But that is exactly what Facebook allowed building managers and employers to do.”

Screenshot of Olivier Sylvain, professor at Fordham University, participating in the hearing remotely

Klobuchar took a similar angle, highlighting ads targeted at African American-focused webpages during the 2016 election that told viewers they could vote by texting a falsified number rather than waiting at the polls.

“One of the issues commonly raised regarding content moderation across multiple platforms is the presence of bias in artificial intelligence systems that are used to analyze the content,” Rosen said. “Decisions made through AI systems, including for content moderation, run the risk of further marginalizing and censoring groups that already face disproportionate prejudice and discrimination, both online and offline.”

In addition, content moderation often misses dangerous hate speech, Rosen continued, pointing to the antisemitic posts the Tree of Life synagogue shooter made on a right-wing media platform prior to his deadly attack.

“There’s so much work to be done in this area, because despite the best efforts of even the most well-motivated social media platforms, we see examples where the algorithms don’t work…I think the most troubling challenge for writing law in this area is, what about the great middle ground, where the platforms are not bad actors, they’re trying to do the right thing, but it just doesn’t amount to enough?” Cox said.

Complexities of content moderation practices

“Is there an approach by which we can incentivize active, clear and consistent content moderation without the negative consequences of less open platforms and fewer new entrants into the internet ecosystem?” Sen. Tammy Baldwin, D-Wis., asked.

“I think you really hit the nail on the head in terms of what the challenge is here,” Kosseff said.

Rather than an overly prescriptive approach, Kosseff recommended moving toward transparency, adding that some platforms have already begun to take steps in that direction.

Witnesses emphasized the difficulty of large-scale content moderation for social media platforms.

“The scale of these efforts is staggering,” said Elizabeth Banker, deputy general counsel of the Internet Association. “Facebook took action against 1.9 billion pieces of spam in a three-month period. In multiple cases, Section 230 has shielded providers from lawsuits from spammers who sued over removing their spam material.”

However, some senators were less willing to extend tech platforms the benefit of the doubt.

“The reality is that platforms have a strong incentive to exercise control over the content each of us sees, because if they can present us with content that will keep us engaged on the platform, we will stay on the platform longer,” Thune said.

Screenshot of Sen. Ted Cruz from the webcast

Cruz repeated his oft-made claims of anti-conservative bias and censorship on social media platforms.

“Given the monopoly power they have over free speech, I view that as the single greatest threat to our democratic process we have today,” he said.

‘Otherwise objectionable’ is not overly vague, according to author of Section 230

The hearing also featured discussion of the Commerce Department’s petition, filed Monday, asking the Federal Communications Commission to issue proposed rules narrowing Section 230’s protections, as directed by an executive order from President Donald Trump.

Cox pointed out that the original iteration of the bill that evolved into Section 230 contained a provision explicitly denying the FCC authority to regulate the content of speech.

“I would like to see the FTC be more active in this area — I’d like to see the FTC holding platforms to their promises,” Cox added.

Screenshot of former Rep. Christopher Cox participating in the hearing remotely

One of the potential ambiguities raised by the petition was the phrase “otherwise objectionable.”

“I question whether this term is too broad and improperly shields online platforms from liability when they remove content that they simply disagree with, dislike or find distasteful,” Wicker said. “The term may require further defining to reduce ambiguity, increase accountability and prevent misapplication of the law.”

Cox explained that “otherwise objectionable” should be understood with reference to the list of specific offenses preceding it, adding that it was “not an open-ended grant of immunity for editing content for any unrelated reason a website can think of.”