

Broadband Breakfast Hosts Section 230 Debate

Two sets of experts debated whether Section 230 should be reformed, repealed, or maintained.

Published on June 1, 2021

Broadband Breakfast’s Live Online event hosted a debate about Section 230, with some arguing for a revision or a repeal and others suggesting it is integral to the healthy flow of information.

The debate, held on May 26 and moderated by Communications Daily’s Karl Herchenroeder, pitted two proponents of Section 230 reform, DigitalFrontiers Advocacy founder Neil Fried and Precursor president Scott Cleland, against attorney Cathy Gellis and TechFreedom president Berin Szóka, who defended the safeguards shielding intermediary platforms from liability for what their users post online.

Fried said Section 230 allows platforms to moderate harmful remarks without the courts getting involved. His solution to blunt unlawful behavior is adjusting Section 230 to create more accountability. Reform could include distinguishing between small and large platforms, since the two should not be treated the same.

Proponents of Section 230 have said that platforms like Facebook could never operate without legal protections against liability for what their users post.

Cleland shared similar views with Fried, favoring removal or adjustment of the provision. He explained that “repeal is comprehensive and constitutional,” and went so far as to say “repeal is inevitable.”

For maintaining Section 230

On the other side, Gellis stated that the provision “needs help, not destruction.” She explained that Section 230’s immunity creates a healthy ecosystem for the sharing of ideas. In her rebuttal, she noted that the value the country places on free speech should prevent rules from being put in place to moderate information.

“We need to keep our eye on the ball of the ecosystem, to make sure the ecosystem is equipped without artificial barriers… It is not about big tech…it is about every platform of every size.”

Szóka was quick on his feet to both reiterate Gellis’ beliefs and to counter Cleland’s claims. He said he agrees there is too much hate speech, but that does not mean the internet is lawless.

“There is very little the government can do about such speech because of the First Amendment…we cannot directly ban hate speech,” Szóka said. “Section 230 aims to do the next best thing.”

Our Broadband Breakfast Live Online events take place every Wednesday at 12 Noon ET. You can watch the May 26, 2021, event on this page.

Wednesday, May 26, 2021, 12 Noon ET — “Unpacking the Controversies Around Section 230”

When Congress approved the Communications Decency Act as part of the Telecommunications Act in 1996, few saw Section 230 as the central issue surrounding online speech and debate. Long considered a foundational law for the internet in the United States, Section 230 has — slowly at first, but now in a torrent — come under reexamination. Join us for a debate between proponents and critics of Section 230.

Featuring panelists:

  • Neil Fried, Founder, DigitalFrontiers Advocacy
  • Cathy Gellis, Attorney
  • Berin Szoka, President, TechFreedom
  • Scott Cleland, President, Precursor
  • Moderated by Karl Herchenroeder, Assistant Editor, Communications Daily

In an Oxford-style debate, the audience will be polled at both the beginning and end of the event about the following resolution: “Section 230 is harmful and should be abolished or significantly changed.” Each panelist will give an opening statement and a rebuttal, after which the moderator and members of the live audience will be able to ask questions.

  • First affirmative opening statement (6 minutes): Neil Fried
  • First negative opening statement (6 minutes): Cathy Gellis
  • Second affirmative opening statement (6 minutes): Scott Cleland
  • Second negative opening statement (6 minutes): Berin Szoka
  • First affirmative rebuttal (4 minutes): Scott Cleland
  • First negative rebuttal (4 minutes): Berin Szoka
  • Second affirmative rebuttal (4 minutes): Neil Fried
  • Second negative rebuttal (4 minutes): Cathy Gellis


Neil Fried was formerly chief communications and technology counsel to the House Energy and Commerce Committee and SVP for congressional and regulatory affairs at the Motion Picture Association. He also helped implement the 1996 Telecommunications Act while at the FCC and advised journalists while at the Reporters Committee for Freedom of the Press. In 2020 he launched DigitalFrontiers Advocacy, which advises clients on Communications Act and Copyright Act issues.

Frustrated that people were making the law without asking for her opinion, Cathy Gellis gave up a career as a web developer to become a lawyer so that she could help them not make it badly, especially when it came to technology. A former aspiring journalist and longtime fan of civil liberties, her legal work includes defending the rights of Internet users and advocating for policy that protects online speech and innovation. When not advising clients on the current state of the law on topics such as platform liability, copyright, trademark, privacy, or cybersecurity, she frequently writes about these subjects for outlets such as the Daily Beast, Law.com, and Techdirt.com, where she is a regular contributor.

Berin Szoka serves as President of TechFreedom. Previously, he was a Senior Fellow and the Director of the Center for Internet Freedom at The Progress & Freedom Foundation. Before joining PFF, he was an Associate in the Communications Practice Group at Latham & Watkins LLP, where he advised clients on regulations affecting the Internet and telecommunications industries. Before joining Latham’s Communications Practice Group, Szoka practiced at Lawler Metzger Milkman & Keeney, LLC, a boutique telecommunications law firm in Washington, and clerked for the Hon. H. Dale Cook, Senior U.S. District Judge for the Northern District of Oklahoma.

Scott Cleland is a Christian, conservative, Republican and President of Precursor®, a responsible Internet consultancy. He is not a lawyer. He served as Deputy U.S. Coordinator for International Communications & Information Policy in the George H. W. Bush Administration, and Institutional Investor twice ranked him the #1 independent analyst in communications when he was an investment analyst. He has testified before eight congressional subcommittees a total of sixteen times.

Karl Herchenroeder is a technology policy journalist for publications including Communications Daily. Born in Rockville, Maryland, he joined the Warren Communications News staff in 2018. He began his journalism career in 2012 at the Aspen Times in Aspen, Colorado, where he covered city government. After that, he covered the nuclear industry for ExchangeMonitor in Washington.

Watch our 2:27 preview video on Section 230

WATCH HERE, or on YouTube, Twitter and Facebook

As with all Broadband Breakfast Live Online events, the FREE webcasts will take place at 12 Noon ET on Wednesday.

SUBSCRIBE to the Broadband Breakfast YouTube channel. That way, you will be notified when events go live. Watch on YouTube, Twitter and Facebook

See a complete list of upcoming and past Broadband Breakfast Live Online events.

Reporter Sophie Draayer, a native Las Vegan, studied strategic communication and political science at the University of Utah. In her free time, she plays mahjong, learns new songs on the guitar, and binge-watches true-crime docuseries on Netflix.


Section 230 Shuts Down Conversation on First Amendment, Panel Hears

The law prevents discussion of how the First Amendment should be applied in a new age of technology, says expert.

Photo of Ron Yokubaitis of Texas.net, Ashley Johnson of Information Technology and Innovation Foundation, Emma Llanso of Center for Democracy and Technology, Matthew Bergman of Social Media Victims Law Center, and Chris Marchese of Netchoice (left to right)

WASHINGTON, March 9, 2023 – Section 230 as it is written shuts down the conversation about the First Amendment, experts claimed in a debate at Broadband Breakfast’s Big Tech & Speech Summit on Thursday.

Matthew Bergman, founder of the Social Media Victims Law Center, suggested that Section 230 forecloses discussion of the appropriate weighing of costs and benefits of granting big tech companies litigation immunity for moderation decisions on their platforms.

We need to talk about what level of First Amendment protection is necessary in a new world of technology, said Bergman. That discussion happens primarily in an open litigation process, he said, which is not now available to those who are harmed by these products.


All companies must exercise reasonable care, Bergman argued. Opening up litigation doesn’t mean that all claims are necessarily viable, only that the process should work itself out in the courts, he said.

Eliminating Section 230 could lead online services to overcorrect in moderating speech, which could suffocate social reform movements organized on those platforms, argued Ashley Johnson of the Information Technology and Innovation Foundation, a research institution.

Furthermore, the burden of litigation would fall disproportionately on the companies with fewer resources to defend themselves, she continued.

Bergman responded: “If a social media platform is facing a lot of lawsuits because there are a lot of kids who have been hurt through the negligent design of that platform, why is that a bad thing?” People who are injured have the right by law to seek redress against the entity that caused the injury, Bergman said.

Emma Llanso of the Center for Democracy and Technology suggested that platforms would fundamentally change how they operate to avoid the threat of litigation if Section 230 were reformed or abolished, which could threaten freedom of speech for their users.

It is necessary for the protection of the First Amendment that the internet consist of many platforms with different content moderation policies, to ensure that all people have a voice, she said.

To this, Bergman argued that there is a distinction between curbing algorithms that push content users do not want to see, sometimes without their knowledge, and ensuring speech is not censored.

It is a question of balancing the faulty design of a product against the protection of speech, and the courts are where this balancing act should take place, said Bergman.

This comes days after legal experts urged Congress to amend the statute to specify that it applies only to free speech, rather than to the negligent design of product features that promote harmful speech. The discussion followed a Supreme Court case over whether Google has immunity for recommending terrorist videos on its video platform YouTube.

To watch the full videos join the Broadband Breakfast Club below. We are currently offering a Free 30-Day Trial: No credit card required!



Congress Should Amend Section 230, Senate Subcommittee Hears

Experts urged Congress to amend the tech protection law to limit immunity for the promotion of harmful information.

Photo of Hany Farid, professor at the University of California, Berkeley

WASHINGTON, March 8, 2023 – Law professionals at a Senate Subcommittee on Privacy, Technology and the Law hearing on Wednesday urged Congress to amend Section 230 to specify that it applies only to free speech, rather than the promotion of misinformation.

Section 230 protects platforms from being treated as the publisher or speaker of information originating from a third party, shielding them from liability for their users’ posts. Mary Anne Franks, professor of law at the University of Miami School of Law, argued that there is a difference between protecting free speech and protecting the harmful dissemination of information.

Hany Farid, professor at the University of California, Berkeley, argued that there should be a distinction between a negligently designed product feature and a core component of the platform’s business. For example, YouTube’s video recommendation system is a product feature rather than an essential function, as it is designed solely to maximize advertising revenue by keeping users on the platform, he said.

YouTube claims that its recommendation algorithm is unable to distinguish between two different videos. This, argued Farid, should be considered a negligently designed feature, as YouTube knew or reasonably should have known that the feature could lead to harm.

Section 230, said Farid, was written to immunize tech companies from defamation litigation, not to immunize them from any wrongdoing, including the negligent design of their features.

“At a minimum,” said Franks, returning the statute to its original intention “would require amending the statute to make clear that the law’s protections only apply to speech and to make clear that platforms that knowingly promote harmful content are ineligible for immunity.”

At the State of the Net conference earlier this month, Franks emphasized the “Good Samaritan” aspect of the law, claiming that it is supposed to “provide incentives at platforms to actually do the right thing.” Instead, the law does not incentivize platforms to moderate their content, she argued.

Jennifer Bennett of national litigation boutique Gupta Wessler suggested that Congress uphold what is known as the Henderson framework, which would hold a company liable if it materially contributes to what makes content unlawful, including the recommendation and dissemination of the content.

Unfortunately, lamented Eric Schnapper, professor of law at University of Washington School of Law, Section 230 has barred the right of Americans to get redress if they’ve been harmed by big tech. “Absolute immunity breeds absolute irresponsibility,” he said.

Senator Richard Blumenthal, D-Connecticut, warned tech companies at the onset of the hearing that “reform is coming.”

This comes weeks after the Supreme Court heard arguments on whether to provide immunity to Google for recommending terrorist videos on its video platform YouTube. The case saw industry dissension on whether Section 230 protects algorithmic recommendations. Justice Brett Kavanaugh claimed that YouTube forfeited its protection by using recommendation algorithms.



Content Moderation, Section 230 and the Future of Online Speech

Our comprehensive report examines the extremely timely issue of content moderation and Section 230 from multiple angles.

In the 27 years since the so-called “26 words that created the internet” became law, rapid technological developments and sharp partisan divides have fueled increasingly complex content moderation dilemmas.

Earlier this year, the Supreme Court tackled Section 230 for the first time through a pair of cases regarding platform liability for hosting and promoting terrorist content. In addition to the court’s ongoing deliberations, Section 230—which protects online intermediaries from liability for third-party content—has recently come under attack from Congress, the White House and multiple state legislatures.

Member download, or join with Free 30-Day Trial!

