Section 230

Federal Communications Commissioner Geoffrey Starks Attacks Trump Administration Section 230 Petition

Screenshot of Geoffrey Starks from the webinar

September 14, 2020 — Federal Communications Commissioner Geoffrey Starks on Wednesday blasted the Trump administration for its efforts to impose President Donald Trump’s political agenda upon an independent agency.

Speaking during a webinar hosted by the Center for Democracy and Technology, Starks called the administration’s May 28 executive order on social media an unprecedented effort “from the get-go.” He maintained that the FCC has no jurisdiction over social media services.

Starks also referenced the recent petition made by the Commerce Department’s National Telecommunications and Information Administration that officially urged the FCC to interpret Section 230 of the Communications Decency Act in a way detrimental to big tech platforms like Twitter and Facebook.

“The president is not the only one who wants the FCC to rewrite Section 230 statute,” said Starks.

Starks said it would be up to agency Chairman Ajit Pai to decide whether, when and how to address the NTIA petition, and that any movement on the matter will likely come after the November election.

“There is a clear motive behind the NTIA petition,” said Starks, calling the petition a product of the president’s impatience, obsession with his online persona, and a political distraction.

Starks said the timing of the social media executive order and the NTIA petition made the motives behind them suspect.

“The executive order came immediately after President Trump had a tweet flagged for saying mail-in ballots would lead to rigged elections,” said Starks. “It clearly shows the president’s intention to influence how social media sites operate to benefit himself.”

The NTIA’s request for the FCC rulemaking would have significant impacts on the use of the internet for free speech, he said.

The NTIA petition has already drawn thousands of comments filed with the Federal Communications Commission by hundreds of netizens and industry groups.

“Social media plays a crucial role in our elections,” Starks continued, noting that Americans report getting their news from Facebook more often than from any other site.

Starks further called it deeply ironic that the same Republican commissioners who moved to repeal net neutrality in 2017, relinquishing FCC authority over the internet, are now calling for the agency to interpret Section 230. “They are in an irreconcilable position,” said Starks. “They want it both ways.”

Starks argued the executive order was not an example of sound, reasoned policymaking, and he urged his Republican colleagues to reject the NTIA petition as quickly as possible so the agency can return to its top priorities.

“We don’t have to do what NTIA asks or what the executive order asks and I don’t think we should,” maintained Starks.

Section 230

Tech Groups, Free Expression Advocates Support Twitter in Landmark Content Moderation Case

The Supreme Court’s decision could dramatically alter the content moderation landscape.

Photo of Supreme Court Justice Clarence Thomas courtesy of Stetson University

WASHINGTON, December 8, 2022 — Holding tech companies liable for the presence of terrorist content on their platforms risks substantially limiting their ability to effectively moderate content without overly restricting speech, according to several industry associations and civil rights organizations.

The Computer & Communications Industry Association, along with seven other tech associations, filed an amicus brief Tuesday emphasizing the vast amount of online content generated on a daily basis and the existing efforts of tech companies to remove harmful content.

A separate coalition of organizations, including the Electronic Frontier Foundation and the Center for Democracy & Technology, also filed an amicus brief.

The briefs were filed in support of Twitter as the Supreme Court prepares to hear Twitter v. Taamneh in 2023, alongside the similar case Gonzalez v. Google. The cases, brought by relatives of ISIS attack victims, argue that social media platforms allow groups like ISIS to publish terrorist content, recruit new operatives and coordinate attacks.

Both cases were initially dismissed, but an appeals court in June 2021 overturned the Taamneh dismissal, holding that the case adequately asserted its claim that tech platforms could be held liable for aiding acts of terrorism. The Supreme Court will now decide whether an online service can be held liable for “knowingly” aiding terrorism if it could have taken more aggressive steps to prevent such use of its platform.

The Taamneh case hinges on the Anti-Terrorism Act, which says that liability for terrorist attacks can be placed on “any person who aids and abets, by knowingly providing substantial assistance.” The case alleges that Twitter did this by allowing terrorists to utilize its communications infrastructure while knowing that such use was occurring.

Gonzalez is more directly focused on Section 230, a provision under the Communications Decency Act that shields platforms from liability for the content their users publish. The case looks at YouTube’s targeted algorithmic recommendations and the amplification of terrorist content, arguing that online platforms should not be protected by Section 230 immunity when they engage in such actions.

Supreme Court Justice Clarence Thomas wrote in 2020 that the “sweeping immunity” granted by current interpretations of Section 230 could have serious negative consequences, and suggested that the court consider narrowing the statute in a future case.

Experts have long warned that removing Section 230 could have the unintended impact of dramatically increasing the amount of content removed from online platforms, as liability concerns would incentivize companies to err on the side of over-moderation.

Without some form of liability protection, platforms “would be likely to use necessarily blunt content moderation tools to over-restrict speech or to impose blanket bans on certain topics, speakers, or specific types of content,” the EFF and other civil rights organizations argued.

Platforms are already self-motivated to remove harmful content because failing to do so risks driving away their user base, CCIA and the other tech organizations said.

There is an immense amount of harmful content to be found online, and moderating it is a careful, costly and iterative process, the CCIA brief said, adding that “mistakes and difficult judgement calls will be made given the vast amounts of expression online.”

Section 230

Narrow Majority of Supreme Court Blocks Texas Law Regulating Social Media Platforms

The decision produced an unusual court split. Justice Kagan sided with the dissenters but did not join Justice Alito’s dissent.

Caricature of Samuel Alito by Donkey Hotey used with permission

WASHINGTON, May 31, 2022 – On a narrow 5-4 vote, the Supreme Court of the United States on Tuesday blocked a Texas law that Republicans had argued would address the “censorship” of conservative voices on social media platforms.

Texas H.B. 20 was written by Texas Republicans to combat perceived bias against conservative viewpoints voiced on Facebook, Twitter, and other social media platforms with at least 50 million active monthly users.

The bill was drafted at least in part as a reaction to President Donald Trump’s ban from social media. Immediately following the January 6 riots at the United States Capitol, Trump was banned from several platforms and online services at once, including Amazon, Facebook, Twitter, Reddit, and myriad other websites.

See also Explainer: With Florida Social Media Law, Section 230 Now Positioned In Legal Spotlight, Broadband Breakfast, May 25, 2021

Close decision on First Amendment principles

A brief six-page dissent on the matter was released on Tuesday. Conservative Justices Samuel Alito, Neil Gorsuch, and Clarence Thomas dissented, arguing that the law should have been allowed to stand. Justice Elena Kagan also agreed that the law should be allowed to stand, though she did not join the dissent penned by Alito and did not elaborate further.

The decision came on an emergency application to vacate a one-sentence order of the Fifth Circuit Court of Appeals, which had stayed a federal district court ruling that blocked the law. In other words, the law passed by the Texas legislature and signed by Gov. Greg Abbott is once again precluded from going into effect.

Tech lobbying group NetChoice – in addition to many entities in Silicon Valley – argued that the law would prevent social media platforms from moderating and addressing hateful and potentially inflammatory content.

In a statement, Computer & Communications Industry Association President Matt Schruers said, “We are encouraged that this attack on First Amendment rights has been halted until a court can fully evaluate the repercussions of Texas’s ill-conceived statute.”

“This ruling means that private American companies will have an opportunity to be heard in court before they are forced to disseminate vile, abusive or extremist content under this Texas law. We appreciate the Supreme Court ensuring First Amendment protections, including the right not to be compelled to speak, will be upheld during the legal challenge to Texas’s social media law.”

In a statement, Public Knowledge Legal Director John Bergmayer said, “It is good that the Supreme Court blocked HB 20, the Texas online speech regulation law. But it should have been unanimous. It is alarming that so many policymakers, and even Supreme Court justices, are willing to throw out basic principles of free speech to try to control the power of Big Tech for their own purposes, instead of trying to limit that power through antitrust and other competition policies. Reining in the power of tech giants does not require abandoning the First Amendment.”

In his dissent, Alito pointed out that the plaintiffs argued “HB 20 interferes with their exercise of ‘editorial discretion,’ and they maintain that this interference violates their right ‘not to disseminate speech generated by others.’”

“Under some circumstances, we have recognized the right of organizations to refuse to host the speech of others,” he said, referencing Hurley v. Irish-American Gay, Lesbian and Bisexual Group of Boston, Inc.

“But we have rejected such claims in other circumstances,” he continued, pointing to PruneYard Shopping Center v. Robins.

Will Section 230 be revamped on a full hearing by the Supreme Court?

“It is not at all obvious how our existing precedents, which predate the age of the internet, should apply to large social media companies, but Texas argues that its law is permissible under our case law,” Alito said.

Alito argued that there is a distinction between compelling a platform to host a message and refraining from discriminating against a user’s speech “on the basis of viewpoint.” He said that H.B. 20 adopted the latter approach.

Alito went on, arguing that the bill only applied to “platforms that hold themselves out as ‘open to the public,’” and “neutral forums for the speech of others,” and thus, the targeted platforms are not spreading messages they endorse.

Alito added that because the bill only targets platforms with more than 50 million users, it only targets entities with “some measure of common carrier-like market power and that this power gives them an ‘opportunity to shut out [disfavored] speakers.’”

Chief Justice John Roberts and Justices Stephen Breyer, Sonia Sotomayor, Brett Kavanaugh, and Amy Coney Barrett voted affirmatively – siding with NetChoice LLC’s emergency application – to block H.B. 20 from being enforced.

Section 230

Parler Policy Exec Hopes ‘Sustainable’ Free Speech Change on Twitter if Musk Buys Platform

Parler’s Amy Peikoff said she hopes Twitter will follow in her social media company’s footsteps.

Screenshot of Amy Peikoff

WASHINGTON, May 16, 2022 – A representative from a growing conservative social media platform said last week that she hopes Twitter, under new leadership, will emerge as a “sustainable” platform for free speech.

Amy Peikoff, chief policy officer of social media platform Parler, said as much during a Broadband Breakfast Live Online event Wednesday, in which she wondered about the implications of platforms banning accounts for views deemed controversial.

The social media world has been captivated by the lingering possibility that SpaceX and Tesla CEO Elon Musk could buy Twitter, which the billionaire has criticized for making decisions he said infringe on free speech.

Before Musk’s bid for the company, Parler saw a surge in member sign-ups after former President Donald Trump was banned from Twitter for comments the platform saw as encouraging the Capitol riots on January 6, 2021, a move Peikoff criticized. (Trump also criticized the move.)

Peikoff said she believes Twitter should be a free speech platform just like Parler and hopes for “sustainable” change with Musk’s promise.

“At Parler, we expect you to think for yourself and curate your own feed,” Peikoff told Broadband Breakfast Editor and Publisher Drew Clark. “The difference between Twitter and Parler is that on Parler the content is controlled by individuals; Twitter takes it upon itself to moderate by itself.”

She recommended “tools in the hands of the individual users to reward productive discourse and exercise freedom of association.”

Peikoff criticized Twitter for permanently banning Donald Trump following the insurrection at the U.S. Capitol on January 6, and recounted the struggle Parler had in obtaining access to hosting services on AWS, Amazon’s web services platform.

Screenshot of Amy Peikoff

While she defended the role of Section 230 of the Telecom Act for Parler and others, Peikoff criticized what she described as Twitter’s collusion with the government. Section 230 provides immunity from civil suits for comments posted by others on a social media network.

For example, Peikoff cited a July 2021 statement by former White House Press Secretary Jen Psaki raising concerns about “misinformation” on social media. When Twitter takes action to stifle anti-vaccination speech at the behest of the White House, Peikoff argued, it crosses the line into a form of censorship by social media giants that is, in effect, a form of “state action.”

Conservatives censored by Twitter or other social media networks that are undertaking such “state action” are wrongfully being deprived of their First Amendment rights, she said.

“I would not like to see more of this entanglement of government and platforms going forward,” Peikoff said, urging instead that policymakers “leave human beings free to information and speech.”

Screenshot of Drew Clark and Amy Peikoff during Wednesday’s Broadband Breakfast’s Online Event

Our Broadband Breakfast Live Online events take place on Wednesday at 12 Noon ET. Watch the event on Broadband Breakfast, or REGISTER HERE to join the conversation.

Wednesday, May 11, 2022, 12 Noon ET – Mr. Musk Goes to Washington: Will Twitter’s New Owner Change the Debate About Social Media?

The acquisition of social media powerhouse Twitter by Elon Musk, the world’s richest man, raises a host of issues about social media, free speech, and the power of persuasion in our digital age. Twitter already serves as the world’s de facto public square. But it hasn’t been without controversy, including the platform’s decision to ban former President Donald Trump in the wake of his tweets during the January 6 attack on the U.S. Capitol. Under new management, will Twitter become more hospitable to Trump and his allies? Does Twitter have a free speech problem? How will Mr. Musk’s acquisition change the debate about social media and Section 230 of the Telecommunications Act?

Guests for this Broadband Breakfast for Lunch session:

  • Amy Peikoff, Chief Policy Officer, Parler
  • Drew Clark (host), Editor and Publisher, Broadband Breakfast

Amy Peikoff is the Chief Policy Officer of Parler. After completing her Ph.D., she taught at universities (University of Texas, Austin, University of North Carolina, Chapel Hill, United States Air Force Academy) and law schools (Chapman, Southwestern), publishing frequently cited academic articles on privacy law, as well as op-eds in leading newspapers across the country on a range of issues. Just prior to joining Parler, she founded and was President of the Center for the Legalization of Privacy, which submitted an amicus brief in United States v. Facebook in 2019.

Drew Clark is the Editor and Publisher of BroadbandBreakfast.com and a nationally respected telecommunications attorney. Drew brings experts and practitioners together to advance the benefits provided by broadband. Under the American Recovery and Reinvestment Act of 2009, he served as head of a State Broadband Initiative, the Partnership for a Connected Illinois. He is also the President of the Rural Telecommunications Congress.

Illustration by Mohamed Hassan used with permission

WATCH HERE, or on YouTube, Twitter and Facebook.

As with all Broadband Breakfast Live Online events, the FREE webcasts will take place at 12 Noon ET on Wednesday.

SUBSCRIBE to the Broadband Breakfast YouTube channel. That way, you will be notified when events go live. Watch on YouTube, Twitter and Facebook.

See a complete list of upcoming and past Broadband Breakfast Live Online events.

