Creating Institutions for Resolving Content Moderation Disputes Out-of-Court

Private institutions may become the primary method for resolving content moderation disputes, says expert.


WASHINGTON, March 9, 2023 – John Samples, a member of Meta’s oversight board, suggested at Broadband Breakfast’s Big Tech & Speech Summit Thursday that out-of-court dispute resolution institutions may become the preferred method for settling content moderation disputes.

Meta’s oversight board was created by the company to support free speech by upholding or reversing Facebook’s content moderation decisions. It operates independently of the company and has 40 members from around the world.

The European Union’s Digital Services Act, which came into force in November 2022, requires platforms to remove illegal content and to ensure that users can contest the removal of their content. It clarifies that platforms are liable for users’ unlawful behavior only if they are aware of it and fail to remove it.

The act specifies that illegal speech includes speech that harms the electoral system, hate speech, and speech that infringes fundamental rights. The appeals process allows citizens to bring complaints directly to the company, to national courts, or to out-of-court dispute resolution institutions, the last of which do not yet exist in Europe.

According to Samples, the Act opens the way for private organizations like the oversight board to play a part in moderation disputes. “Meta has a tremendous advantage here as a first mover,” said Samples, “and the model of the oversight board may well spread to Europe and perhaps other places.”

The United States may adopt European processes in the future as Europe takes the lead in moderating big tech, Samples claimed. “It would largely be a private system,” he said, one that could unify and centralize social media moderation across platforms and around the world.

The private option of self-regulation has worked well, said Samples. “It may well be expanding throughout much of the world. If it goes to Europe, it could go throughout.”

Only one percent of the content Meta currently reviews for moderation is restricted, either by taking it down or by reducing the size of the audience exposed to it, Samples said. The oversight board primarily rules against Meta’s decisions and accepts comments from independent interests.
