Trump Justice Department Says Section 230 Changes Needed to Target ‘Bad Samaritans’ and Enforce Transparency
Liana Sowa
September 1, 2020 – Justice Department attorney Lauren Willard said that the Trump administration wanted to change Section 230 to incentivize big tech platforms to address criminal activity and to provide transparency when they take down lawful content.
Speaking Tuesday at an American Bar Association webinar, “Communications Decency Act Section 230 Under Review,” Willard joined a panel discussion analyzing the administration’s recent executive order and other proposals on the topic.
Attorney General William Barr and other legal experts opposed to the power of Silicon Valley tech giants have increasingly been critical of Section 230. Rather than protecting “good Samaritan” behavior that would clean up indecency and harassment on the internet, they argue that the law has turned the platforms into “bad Samaritans.”
See “Attorney General Bill Barr Calls for ‘Recalibrated’ Section 230 as Justice Department Hosts Tech Immunity Workshop,” Broadband Breakfast, February 19, 2020
At the Tuesday event, Willard said that the administration wants to change Section 230 to add more specific language clarifying what is meant by criminal activity, including activities that are “unlawful, promote terrorism [and] promote self-harm.”
Tech platforms that engage in or solicit this sort of criminal activity would lose the benefits of Section 230 immunity under Trump’s proposal, she said.
As part of the administration’s push for transparency from tech companies, the Justice Department wants to narrow the window in which tech companies can claim immunity for removing unlawful content.
The administration also wants tech companies, when they remove allegedly lawful content, to specify which section of the platform’s terms of service the content violated.
Computer and Communications Industry Association President Matthew Schruers said that the Justice Department proposals were misguided.
Section 230, he said, supports two different yet complementary goals: It shields platforms from liability for inappropriate content posted by users and, at the same time, protects them from being sued for trying to remove that content.
Lauren Willard of the Justice Department
As a result of this balance, Section 230 has “created a vibrant internet economy that is no doubt the envy of the world,” he said. That internet economy would be put at risk by the Justice Department’s proposals.
Schruers also disagreed with the transparency proposal. Asking platforms to recite the relevant sections of their terms of service when moderating lawful activity is infeasible and unwieldy, he said.
Given the volume of user-generated content that platforms must moderate on a daily basis, requiring them to cite sections of their terms of service would simply result in less moderation, because of the time it would take to explain how each piece of objectionable content violated a corresponding term of service.
David Vladeck, co-director of Georgetown University’s Institute for Public Representation, said that the internet was hard to police. Even if the platforms moderated perfectly, individual internet users would continue to post discriminatory advertisements, spread misinformation, and engage in online sex trafficking on other, lower-visibility platforms.
Vladeck and Schruers agreed that Congress will probably not address these issues because of the U.S.’s leadership position in social media and the internet.