
FCC’s Brendan Carr Urges Changes to Section 230, Says His Approach Supports First Amendment

Liana Sowa

Photo of Brendan Carr by Gage Skidmore from February 2018, used with permission

September 17, 2020 – Federal Communications Commissioner Brendan Carr on Wednesday advocated for greater transparency from social media and internet platforms at an event called “Reboot Conversations: The Right’s Tech Realignment, Section 230, and What Comes Next.”

Carr said he believed Section 230 needed to be changed because it is “skewing the landscape to favor a certain business model.”

When Section 230 was passed as part of the Telecommunications Act of 1996, Carr said, there was minimal content moderation on message boards and services like CompuServe, in part because prior court precedent had made providers of interactive computer services that moderated heavily vulnerable to liability.

“Now we’ve created far too much incentive for content moderation than was originally intended,” he said.

Carr acknowledged that Section 230 does not specify how much or how little moderation platforms should employ. But he said that Section 230(c)(2), which protects platforms from being sued for acting as a “Good Samaritan” in removing objectionable content, goes too far.

Some tech platforms deploy third parties to fact-check users’ posts. Carr said that most of these fact checkers focus on political posts.

This is the crux of the debate, said Carr. Some say that fact checkers help prevent misinformation from spreading to the public. But Carr believes that users should be able to decide whether they want their feeds fact-checked, and he suggested an opt-in or opt-out button for fact-checking.

He argued that this approach would empower users while maintaining First Amendment rights.

Tech platforms should also be more explicit about their political leanings, he said.

This transparency would make it easier to hold platforms accountable for how much or how little moderation they perform.

It would also leave platforms free to curate content without having to politically neutralize themselves, a requirement Carr said he opposed.

“We need transparency – that doesn’t require neutralizing platforms politically.”

Carr also argued that such transparency would combat the threat of “deplatforming,” as when a user is kicked off Twitter or Facebook. Such an action has much greater consequences today than it did in 1996.

Reporter Liana Sowa grew up in Simsbury, Connecticut. She studied editing and publishing as a writing fellow at Brigham Young University, where she mentored upperclassmen on neuroscience research papers. She enjoys reading, journaling, marathon running and stilt-walking.
