Section 230

Social Media Platforms Should Increase Algorithm Transparency, Say Broadband Breakfast Live Online Panelists

Elijah Labby

July 16, 2020 — Social media companies need to “open up” their algorithms, said participants in a Broadband Breakfast panel on Wednesday.

The event, titled “Public Input on Platform Algorithms: The Role of Transparency and Feedback in High-Tech,” featured a discussion of the role transparency should play in the algorithms social media companies use.

Nicol Turner-Lee, director of the Brookings Institution Center for Technology Innovation, said that tech companies must be more transparent in their practices so that users can decode the content that the platforms present to them.

Turner-Lee said that transparency is important for reducing discrimination in all applications of algorithms, pointing to algorithms used in place of exams to award high school diplomas.

“They use an algorithm in lieu of an in-person test to make the determination as to a student’s ability to qualify for the diploma,” she said. “There was a huge decline among students, and potentially discriminatory outputs.”

Screenshot of Broadband Breakfast Live Online panelists

Harold Feld, senior vice president of Public Knowledge, agreed that such race-based algorithmic discrimination is not rare.

“There’s been a fair amount of documentation that shows that there is a bias,” he said. “Posts by African Americans are much more likely to be considered violent or dangerous than those by white users.”

However, Feld said that repealing Section 230 of the Communications Decency Act would not solve this problem.

“You could take away 230 and nothing that we’ve said about the harms would make a damn bit of difference,” he said.

Nathalie Maréchal, a senior policy analyst at Ranking Digital Rights, discussed several suggestions for social media transparency.

“What we’re looking for in this area is first for companies to be clear about their rules for ad content … and second, we’re looking for companies to shape their users’ online experiences with the objective of those algorithms or what data is used,” she said.

Section 230 has become a flash point in an ongoing discussion about the role and rules of content moderation online. When several of President Donald Trump’s tweets were flagged as both glorifying violence and as being misleading, Trump hit back by signing an executive order attempting to strip Section 230 of its power.

It is unclear what power, if any, the executive order will have.

Elijah Labby was a Reporter with Broadband Breakfast. He was born in Pittsburgh, Pennsylvania and now resides in Orlando, Florida. He studies political science at Seminole State College, and enjoys reading and writing fiction (but not for Broadband Breakfast).
