
FCC’s Brendan Carr Urges Changes to Section 230, Says His Approach Supports First Amendment


Photo of Brendan Carr by Gage Skidmore, February 2018, used with permission

September 17, 2020 – Federal Communications Commissioner Brendan Carr on Wednesday advocated greater transparency by social media and internet platforms at an event called “Reboot Conversations: The Right’s Tech Realignment, Section 230, and What Comes Next.”

Carr said he believed that Section 230 needed to be changed because it is “skewing the landscape to favor a certain business model.”

When Section 230 was passed as part of the Telecommunications Act of 1996, Carr said, there was minimal content moderation on message boards and services such as CompuServe. That was a concern at the time because prior court precedents had made providers of interactive computer services that moderated heavily vulnerable to liability.

“Now we’ve created far too much incentive for content moderation than was originally intended,” he said.

Carr acknowledged that Section 230 does not specify how much or how little moderation platforms should employ. But he said that subsection (c)(2), which shields platforms from liability for “good Samaritan” removal of objectionable content, goes too far.

Some tech platforms deploy third parties to fact-check users’ posts. Carr said that most of these fact-checkers focus on political posts.

This is the crux of the debate, Carr said. Some argue that fact-checkers help prevent misinformation from spreading to the public. But Carr said users should be able to decide whether they want their feeds fact-checked, and he suggested an opt-in or opt-out button for fact-checking.

He argued that this approach would empower users while maintaining First Amendment rights.

Tech platforms should also be more explicit about their political leanings, he said.

Such transparency, he said, would make it easier to hold platforms accountable for how much, or how little, moderation they perform.

It would also give platforms the freedom to curate content without having to politically neuter themselves, an outcome Carr said he did not favor.

“We need transparency – that doesn’t require neutralizing platforms politically,” he said.

Carr also argued that implementing transparency would combat the threat of “deplatforming,” as when a user is kicked off Twitter or Facebook. Such an action, he said, has much greater consequences today than it did in 1996.

Reporter Liana Sowa grew up in Simsbury, Connecticut. She studied editing and publishing as a writing fellow at Brigham Young University, where she mentored upperclassmen on neuroscience research papers. She enjoys reading, journaling, marathon running and stilt walking.

