Without Section 230, Digital Platforms ‘Would Not Be Able to Exist,’ Panelists Say
Em McPhie
WASHINGTON, July 29, 2019 — Policy experts and tech executives emphatically defended the controversial Section 230 of the Communications Decency Act at an Internet Governance Forum conference on Thursday.
The statute, which has been the target of attacks from both sides of the aisle in recent weeks, paves the way for digital platforms to moderate content posted by users by giving them certain legal immunities.
“If we were held liable for everything that the users potentially posted…we fundamentally would not be able to exist,” said Jessica Ashooh, Reddit’s director of policy. “And that’s where this becomes a competitive issue, because at a time when we’re talking about antitrust investigations and we’re wondering if the biggest players are too big, the last thing we want to do is make a law that makes it harder for smaller companies to compete.”
The debate about content moderation is wrongly cast in binary terms, Ashooh continued, explaining that there are many steps between leaving content up and taking it down. For example, Reddit uses a quarantine policy for material that doesn’t technically violate its content policy but could reasonably be called highly offensive or upsetting.
The policy also applies to certain content that is verifiably false, such as forums dedicated to Holocaust denial or anti-vaccination propaganda. Quarantined content is hidden behind a warning screen and excluded from user recommendations.
The First Amendment limits the government’s ability to counter the spread of disinformation, making it essential that Section 230 enables digital platforms to do so, said Emma Llansó, Director of the Free Expression Project at the Center for Democracy & Technology.
Without Section 230’s protection, content moderation would quickly become binary, Ashooh and Llansó warned.
The debate over content moderation is difficult because there is so much criticism in both directions, said Jeff Kosseff, professor of cybersecurity law at the U.S. Naval Academy.
Many argue that platforms are not doing enough moderation, a viewpoint that gained traction among Democrats when a doctored video of House Speaker Nancy Pelosi spread rapidly across the internet.
Others, including many Republicans, argue that platforms use excessive content moderation to silence opposing political viewpoints, citing the oft-repeated claim that Section 230 is premised on platform neutrality.
Section 230 is actually premised on the absence of neutrality, Kosseff said. Instead, it gives platforms the ability to implement and enforce their own content guidelines.
While some content is clearly criminal and should be taken down, there is plenty of content that falls into a gray area. The same words could potentially be viewed by some as valid political discourse and by others as violent hate speech.
Section 230 is important because it enables platforms to decide where to draw the line, Kosseff said. Although there’s no perfect way of making these difficult decisions, they are the responsibility of private companies, not the federal government.
We will “almost certainly” see changes to the statute in the near future, likely before the Senate passes privacy legislation, said IBM Technology Policy Executive Ryan Hagemann. The question, he added, is not whether the law should be amended, but how.
And Congress is starting to take note of the issue. Following a June proposal from Sen. Josh Hawley, R-Mo., to strip large tech companies of Section 230 immunity, Rep. Ed Case, D-Hawaii, on Thursday formally requested that the House Energy and Commerce Committee launch a probe of the law.
“When the statute was passed in 1996, it had an important role to play in fostering the internet’s growth,” Case told the committee. “But today’s massive internet platforms that offer services cannot be allowed to knowingly facilitate law breaking in our states and localities by hiding behind CDA 230 immunity.”
(Photo of conference by Emily McPhie.)