Panelists, Including Facebook Executive, Call For Increased Content Moderation
Em McPhie
WASHINGTON, July 22, 2019 — The primary responsibility of moderating online platforms lies with the platforms themselves, making Section 230 protections essential, said panelists at New America’s Open Technology Institute on Thursday.
Although some policymakers are attempting to solve the problem of digital content moderation, Open Technology Institute Director Sarah Morris noted that the First Amendment limits the government’s ability to regulate speech, leaving platforms to handle “the vast majority of decision making.”
“Washington lawmakers don’t have the capacity to address these challenges,” said Francella Ochillo, executive director of Next Century Cities.
It’s up to tech companies to do more than they are currently doing to tackle hate speech on their platforms, said David Snyder, executive director of the First Amendment Coalition.
Facebook Public Policy Manager Shaarik Zafar acknowledged that the tech community needs to do a better job both of enforcing its policies and of creating those policies in the first place.
Content moderation is an extremely difficult process, he said. Although algorithms are fairly good at detecting terrorism and child exploitation, other issues can be more difficult, such as distinguishing journalists and activists who raise awareness of atrocities from people who glorify violent extremism.
No single solution can eliminate hate speech, and any solution found will have to be frequently revisited and updated, said Ochillo. But that doesn’t mean that platforms and others shouldn’t make a significant effort, she said, pointing out that people suffer real-world secondary effects from hateful content posted online.
Zafar emphasized Facebook’s commitment to onboarding additional civil rights expertise as the platform continues to tackle the problem of hate speech.
He also highlighted Facebook’s recently announced external oversight board, which will be made up of a diverse group of experts with experience in content, privacy, free expression, human rights, safety, and other relevant disciplines.
Facebook would defer to the board on difficult content moderation questions, said Zafar, and would follow its recommendations even when company executives disagree.
But as companies take steps to fine-tune and enforce their terms of service, transparency is of the utmost importance, Snyder said.
Content moderation algorithms should be made public so that independent researchers can test them for bias, suggested Sharon Franklin, OTI’s director of surveillance and cybersecurity policy.
Franklin also highlighted the Santa Clara Principles, a set of guidelines for transparency and accountability in content moderation. The principles call on companies to publish the numbers of posts removed and accounts suspended, provide notice to users whose content or account is removed, and create a meaningful appeal process.
Section 230 of the Communications Decency Act, by allowing platforms to moderate content without incurring liability, has spurred innovation and made it possible for individuals and companies to reach massive audiences through social media, said Zafar.
Without those protections, he continued, companies might choose to forgo content moderation altogether, leaving all sorts of hate speech, misinformation, and spam on the platforms to the point that they might actually become unusable.
The other potential danger of repealing the law would be companies erring on the side of caution and over-enforcing their policies, said Franklin. Section 230 actually leads to less censorship, she said, because it allows for nuanced content moderation.
The Open Technology Institute supports Section 230 and is very concerned about the recent attacks that have been made on it, Franklin added.
Section 230 is “far from perfect,” said Snyder, but it is much better than any of the proposals to modify it, and far better than not having it at all.
Facebook and other platforms give voice to a wide range of ideologies, and people from all backgrounds are able to successfully gain significant followings, said Zafar, emphasizing that the company’s purpose is to serve everybody.
(Photo of New America event by Emily McPhie.)