
Section 230

Broadband Breakfast Hosts Section 230 Debate

Two pairs of experts debated whether Section 230 should be reformed or repealed, or instead maintained.

June 1, 2021 – Broadband Breakfast’s Live Online event hosted a debate about Section 230, with some arguing for a revision or a repeal and others calling the provision integral to the healthy flow of information.

The debate, held on May 26 and moderated by Communications Daily’s Karl Herchenroeder, pitted two proponents of Section 230 reform, DigitalFrontiers Advocacy founder Neil Fried and Precursor president Scott Cleland, against attorney Cathy Gellis and TechFreedom president Berin Szóka, who argued for maintaining the safeguards that protect intermediary platforms from liability for what their users post online.

Fried said Section 230 allows platforms to moderate harmful remarks without the courts getting involved. His solution to blunt unlawful behavior is to adjust Section 230 to create more accountability. Reform could include distinguishing between small and large platforms, which he argued should not be treated the same.

Proponents of Section 230 have said that platforms like Facebook could never have grown without legal protections against liability for what their users post.

Cleland shared similar views with Fried, favoring removal or adjustment of the provision. He argued that “repeal is comprehensive and constitutional,” even going so far as to say “repeal is inevitable.”

For maintaining Section 230

On the other side, Gellis stated her position that the provision “needs help, not destruction.” She explained that Section 230’s immunity helps create a healthy ecosystem for the sharing of ideas. In her rebuttal, she noted that the value the country puts on free speech should prevent rules from being put in place to moderate information.

“We need to keep our eye on the ball of the ecosystem, to make sure the ecosystem is equipped without artificial barriers… It is not about big tech…it is about every platform of every size.”

Szóka was quick on his feet, both reinforcing Gellis’ points and countering Cleland’s claims. He said he agrees there is too much hate speech, but that does not mean the internet is lawless.

“There is very little the government can do about such speech because of the First Amendment…we cannot directly ban hate speech,” Szóka said. “Section 230 aims to do the next best thing.”

Our Broadband Breakfast Live Online events take place every Wednesday at 12 Noon ET. You can watch the May 26, 2021, event on this page. You can also PARTICIPATE in the current Broadband Breakfast Live Online event. REGISTER HERE.

Wednesday, May 26, 2021, 12 Noon ET — “Unpacking the Controversies Around Section 230”

When Congress approved the Communications Decency Act as part of the Telecommunications Act in 1996, few saw Section 230 as the central issue surrounding online speech and debate. Long considered a foundational law for the internet in the United States, Section 230 has — slowly at first, but now in a torrent — come under reexamination. Join us for a debate between proponents and critics of Section 230.

Featuring panelists:

  • Neil Fried, Founder, DigitalFrontiers Advocacy
  • Cathy Gellis, Attorney
  • Berin Szoka, President, TechFreedom
  • Scott Cleland, President, Precursor
  • Moderated by Karl Herchenroeder, Assistant Editor, Communications Daily

In an Oxford-style debate, the audience will be polled at both the beginning and end of the event on the following resolution: “Section 230 is harmful and should be abolished or significantly changed.” Each panelist will give an opening statement and a rebuttal, after which the moderator and members of the live audience will be able to ask questions.

  • First affirmative opening statement (6 minutes): Neil Fried
  • First negative opening statement (6 minutes): Cathy Gellis
  • Second affirmative opening statement (6 minutes): Scott Cleland
  • Second negative opening statement (6 minutes): Berin Szoka
  • First affirmative rebuttal (4 minutes): Scott Cleland
  • First negative rebuttal (4 minutes): Berin Szoka
  • Second affirmative rebuttal (4 minutes): Neil Fried
  • Second negative rebuttal (4 minutes): Cathy Gellis


Neil Fried was formerly chief communications and technology counsel to the House Energy and Commerce Committee and SVP for congressional and regulatory affairs at the Motion Picture Association. He also helped implement the 1996 Telecommunications Act while at the FCC and advised journalists while at the Reporters Committee for Freedom of the Press. In 2020 he launched DigitalFrontiers Advocacy, which advises clients on Communications Act and Copyright Act issues.

Frustrated that people were making the law without asking for her opinion, Cathy Gellis gave up a career as a web developer to become a lawyer so that she could help them not make it badly, especially when it came to technology. A former aspiring journalist and longtime fan of civil liberties, her legal work includes defending the rights of Internet users and advocating for policy that protects online speech and innovation. When not advising clients on the current state of the law with respect to such topics as platform liability, copyright, trademark, privacy, or cybersecurity she frequently writes about these subjects and more for outlets such as the Daily Beast, Law.com, and Techdirt.com, where she is a regular contributor.

Berin Szoka serves as President of TechFreedom. Previously, he was a Senior Fellow and the Director of the Center for Internet Freedom at The Progress & Freedom Foundation. Before joining PFF, he was an Associate in the Communications Practice Group at Latham & Watkins LLP, where he advised clients on regulations affecting the Internet and telecommunications industries. Before joining Latham’s Communications Practice Group, Szoka practiced at Lawler Metzger Milkman & Keeney, LLC, a boutique telecommunications law firm in Washington, and clerked for the Hon. H. Dale Cook, Senior U.S. District Judge for the Northern District of Oklahoma.

Scott Cleland is a Christian, conservative, Republican and President of Precursor®, a responsible Internet consultancy. He is not a lawyer. He served as Deputy U.S. Coordinator for International Communications & Information Policy in the George H. W. Bush Administration, and Institutional Investor twice ranked him the #1 independent analyst in communications when he was an investment analyst. He has testified before eight congressional subcommittees a total of sixteen times.

Karl Herchenroeder is a technology policy journalist for publications including Communications Daily. Born in Rockville, Maryland, he joined the Warren Communications News staff in 2018. He began his journalism career in 2012 at the Aspen Times in Aspen, Colorado, where he covered city government. After that, he covered the nuclear industry for ExchangeMonitor in Washington.

Watch our 2:27-minute preview video on Section 230

WATCH HERE, or on YouTube, Twitter and Facebook

As with all Broadband Breakfast Live Online events, the FREE webcasts will take place at 12 Noon ET on Wednesday.

SUBSCRIBE to the Broadband Breakfast YouTube channel. That way, you will be notified when events go live. Watch on YouTube, Twitter and Facebook.

See a complete list of upcoming and past Broadband Breakfast Live Online events.

Reporter Sophie Draayer, a native Las Vegan, studied strategic communication and political science at the University of Utah. In her free time, she plays mahjong, learns new songs on the guitar, and binge-watches true-crime docuseries on Netflix.


Parler Policy Exec Hopes ‘Sustainable’ Free Speech Change on Twitter if Musk Buys Platform

Parler’s Amy Peikoff said she hopes Twitter will follow in her social media company’s footsteps.


Screenshot of Amy Peikoff

WASHINGTON, May 16, 2022 – A representative from a growing conservative social media platform said last week that she hopes Twitter, under new leadership, will emerge as a “sustainable” platform for free speech.

Amy Peikoff, chief policy officer of social media platform Parler, said as much during a Broadband Breakfast Live Online event Wednesday, in which she wondered about the implications of platforms banning accounts for views deemed controversial.

The social media world has been captivated by the lingering possibility that SpaceX and Tesla CEO Elon Musk could buy Twitter, which the billionaire has criticized for making decisions he said infringe on free speech.

Before Musk’s move to acquire the company, Parler saw a surge in member sign-ups after former President Donald Trump was banned from Twitter for comments the platform saw as encouraging the Capitol riot of January 6, 2021. Peikoff criticized the ban, as did Trump himself.

Peikoff said she believes Twitter should be a free speech platform just like Parler and hopes for “sustainable” change with Musk’s promise.

“At Parler, we expect you to think for yourself and curate your own feed,” Peikoff told Broadband Breakfast Editor and Publisher Drew Clark. “The difference between Twitter and Parler is that on Parler the content is controlled by individuals; Twitter takes it upon itself to moderate by itself.”

She recommended “tools in the hands of the individual users to reward productive discourse and exercise freedom of association.”

Peikoff criticized Twitter for permanently banning Donald Trump following the insurrection at the U.S. Capitol on January 6, and recounted the struggle Parler had in obtaining access to hosting services on AWS, Amazon’s web services platform.


While she defended the role of Section 230 of the Communications Decency Act for Parler and others, Peikoff criticized what she described as Twitter’s collusion with the government. Section 230 provides immunity from civil suits over comments posted by others on a social media network.

For example, Peikoff cited a July 2021 statement by former White House Press Secretary Jen Psaki raising concerns about “misinformation” on social media. When Twitter stifles anti-vaccination speech at the behest of the White House, Peikoff argued, that crosses the line into censorship by social media giants that is, in effect, a form of “state action.”

Conservatives censored by Twitter or other social media networks that are undertaking such “state action” are wrongfully being deprived of their First Amendment rights, she said.

“I would not like to see more of this entanglement of government and platforms going forward,” Peikoff said, urging instead that we “leave human beings free to information and speech.”

Screenshot of Drew Clark and Amy Peikoff during Wednesday’s Broadband Breakfast’s Online Event

Our Broadband Breakfast Live Online events take place on Wednesday at 12 Noon ET. Watch the event on Broadband Breakfast, or REGISTER HERE to join the conversation.

Wednesday, May 11, 2022, 12 Noon ET – Mr. Musk Goes to Washington: Will Twitter’s New Owner Change the Debate About Social Media?

The acquisition of social media powerhouse Twitter by Elon Musk, the world’s richest man, raises a host of issues about social media, free speech, and the power of persuasion in our digital age. Twitter already serves as the world’s de facto public square. But it hasn’t been without controversy, including the platform’s decision to ban former President Donald Trump in the wake of his tweets during the January 6 attack on the U.S. Capitol. Under new management, will Twitter become more hospitable to Trump and his allies? Does Twitter have a free speech problem? How will Mr. Musk’s acquisition change the debate about social media and Section 230 of the Telecommunications Act?

Guests for this Broadband Breakfast for Lunch session:

  • Amy Peikoff, Chief Policy Officer, Parler
  • Drew Clark (host), Editor and Publisher, Broadband Breakfast

Amy Peikoff is the Chief Policy Officer of Parler. After completing her Ph.D., she taught at universities (University of Texas, Austin, University of North Carolina, Chapel Hill, United States Air Force Academy) and law schools (Chapman, Southwestern), publishing frequently cited academic articles on privacy law, as well as op-eds in leading newspapers across the country on a range of issues. Just prior to joining Parler, she founded and was President of the Center for the Legalization of Privacy, which submitted an amicus brief in United States v. Facebook in 2019.

Drew Clark is the Editor and Publisher of BroadbandBreakfast.com and a nationally-respected telecommunications attorney. Drew brings experts and practitioners together to advance the benefits provided by broadband. Under the American Recovery and Reinvestment Act of 2009, he served as head of a State Broadband Initiative, the Partnership for a Connected Illinois. He is also the President of the Rural Telecommunications Congress.

Illustration by Mohamed Hassan used with permission



Leave Section 230 Alone, Panelists Urge Government

The debate on what government should — or shouldn’t — do with respect to liability protections for platforms continues.


Photo of Josh Hammer, Paul Larken and Niam Yaraghi by Douglas Blair via Twitter

WASHINGTON, May 10, 2022 – A panelist at a Heritage Foundation event on Thursday said that the government should not make changes to Section 230, which protects online platforms from being liable for the content their users post.

However, the other panelist, Newsweek Opinion Editor Josh Hammer, said technology companies have been colluding with the government to stifle speech. Hammer said that Section 230 should be interpreted and applied more vigorously against tech platforms.

Countering this view was Niam Yaraghi, senior fellow at the Brookings Institution’s Center for Technology Innovation.

“While I do agree with the notion that what these platforms are doing is not right, I am much more optimistic” than Hammer, Yaraghi said. “I do not really like the government to come in and do anything about it, because I believe that a capitalist market, an open market, would solve the issue in the long run.”

Addressing a question from the moderator about whether antitrust legislation or stricter interpretation of Section 230 should be the tool to require more free speech on big tech platforms, Hammer said that “Section 230 is the better way to go here.”

Yaraghi, by contrast, said that it was incumbent on big technology platforms to address content moderation, not the government.

In March, Vint Cerf, a vice president and chief internet evangelist at Google, and the president of tech lobbyist TechFreedom warned against government moderation of content on the internet as Washington focuses on addressing the power of big tech platforms.

While some say Section 230 only protects “neutral platforms,” others claim it allows powerful companies to ignore user harm. Legislation from lawmakers such as Sen. Amy Klobuchar, D-Minn., would strip Section 230 protections from platforms that fail to address Covid mis- and disinformation.

Correction: A previous version of this story said Sen. Ron Wyden, D-Ore., agreed that Section 230 only protected “neutral platforms,” or that it allowed tech companies to ignore user harm. Wyden, one of the authors of the provision in the 1996 Telecom Act, instead believes that the law is a “sword and shield” to protect small companies, organizations and movements against legal liability for what users post on their websites.

Additional correction: A previous version of this story misattributed a statement by Niam Yaraghi to Josh Hammer. The story has been corrected, and additional context added.


Reforming Section 230 Won’t Help With Content Moderation, Event Hears

Government is ‘worst person’ to manage content moderation.


Photo of Chris Cox at Monday's AEI event

WASHINGTON, April 11, 2022 — Reforming Section 230 won’t help with content moderation on online platforms, observers said Monday.

“If we’re going to have some content moderation standards, the government is going to be, usually, the worst person to do it,” said Chris Cox, a member of the board of directors at tech lobbyist Net Choice and a former Congressman.

These comments came during a panel discussion during an online event hosted by the American Enterprise Institute that focused on speech regulation and Section 230, a provision in the Communications Decency Act that protects technology platforms from being liable for posts by their users.

“Content moderation needs to be handled platform by platform and rules need to be established by online communities according to their community standards,” Cox said. “The government is not very competent at figuring out the answers to political questions.”

There was also discussion about the role of the First Amendment in content moderation on platforms. Jeffrey Rosen, a nonresident fellow at AEI, questioned whether the First Amendment protects a platform’s content moderation.

“The concept is that the platform is not a publisher,” he said. “If it’s not [a publisher], then there’s a whole set of questions as to what First Amendment interests are at stake…I don’t think that it’s a given that the platform is the decider of those content decisions. I think that it’s a much harder question that needs to be addressed.”

During a Broadband Breakfast Live Online event late last year, experts said that it is not possible for platforms to remove from their sites all content that people may believe to be dangerous. However, some, like Alex Feerst, co-founder of the Digital Trust and Safety Partnership, believe that platforms should bear some degree of liability for the content on their sites, since mitigating the harm of dangerous speech is necessary where possible.
