Big Tech

Facebook’s Oversight Board Defends Against Critics Who Call It a Shield For Company


Screenshot taken from South by Southwest event

March 16, 2021 – A member of Facebook’s Oversight Board, which was created to make content moderation decisions for the social media giant, defended the body Tuesday against critics who say it was set up as a regulatory shield for the company.

The board’s head of communications, Dax Hunter-Torricke, said at the South by Southwest conference Tuesday that “the board was not created to be reputational or a regulation shield for Facebook. Board members are a team of experts in politics, journalism, law, and several other areas that are very independent and don’t have their careers tied to Facebook. They feel no problem speaking against it.”

The conference, which began Tuesday, has so far featured discussion of social media’s influence on American life, including how the platforms helped fuel the January 6 Capitol riot.

The panel heard accounts of incitement to hatred, of the platform’s role in genocide, and of its contribution to election manipulation, problems that speakers said have made a separate content moderation institution necessary. The company regularly purges its platform of fringe entities.

The board said it is required to explain how it reaches its content moderation decisions, and pointed to that requirement as one way it ensures its rulings are not swayed by the public or by the company.

Hunter-Torricke added that the status quo on content moderation is broken and that the Oversight Board aims to improve it through transparent decision-making, a departure from how Facebook has historically operated.

The board has already been called on to decide content moderation cases, and it has overturned Facebook’s original decision in 80 percent of them.

Reporter Samuel Triginelli was born in Brazil and grew up speaking Portuguese and English, and later learned French and Spanish. He studied communications at Brigham Young University, where he also worked as a product administrator and UX/UI designer. He wants a world with better internet access for all.

