

Facebook, Google, Twitter Register to Lobby Congress on Section 230

Companies also want to discuss cybersecurity, net neutrality, taxes and privacy.

Photo: Facebook CEO Mark Zuckerberg

August 3, 2021 — The largest social media companies have registered to lobby Congress on Section 230, according to lobbying records.

Facebook, Google, and Twitter filed new paperwork late last month to discuss the internet liability provision under the Communications Decency Act, which protects these companies from legal trouble for content their users post.

Facebook’s registration specifically mentions the SAFE TECH Act, an amendment to the provision proposed earlier this year by Sens. Amy Klobuchar, D-Minnesota, Mark Warner, D-Virginia, and Mazie Hirono, D-Hawaii, which would largely keep the provision’s protections except for content the platforms are paid for.

A separate Facebook registration included discussion on the “repeal” of the provision.

Other issues included in the Menlo Park-based company’s registration are privacy, data security, online advertising, and general regulations on the social media industry.

Google also wants to discuss taxes and cybersecurity, as security issues take center stage following high-profile attacks and as international proposals for a new tax regime on tech companies emerge.

Twitter’s registration also lists content moderation practices, data security, misinformation, and net neutrality, as the Federal Communications Commission is being urged to restore Obama-era rules enforcing the principle that content cannot be given preferential treatment on networks.

Section 230 has gripped Congress

Critics of social media have been clamoring for retaliatory measures against the technology companies, which have taken increasingly strong action against users who violate their policies.

Those discussions picked up steam when, at the beginning of the year, former President Donald Trump was banned from Twitter, and then from Facebook and other platforms, for allegedly stoking the Capitol Hill riot on January 6. (Trump has since filed a lawsuit as a private citizen against the social media giants for his removal.)

Since the Capitol riot, a number of proposals have been put forward to amend — in some cases completely repeal — the provision to address what some Republicans are calling outright censorship by social media companies. Even Florida tried to take matters into its own hands when it enacted a law penalizing social media companies that ban politicians. That law has since been put on hold by the courts.

The social media giants, and their allies in the industry, have pressed the importance of the provision, which they say has allowed once-fledgling companies like Facebook to become what they are today. And some representatives think reform of the law could lean more toward amendment than outright repeal. But lawyers have warned about a shift in attitude toward those liability protections, as more judges in courts across the country hold big technology companies accountable for harm caused by the platforms.

Assistant Editor Ahmad Hathout has spent the last half-decade reporting on the Canadian telecommunications and media industries for leading publications. He started the scoop-driven news site downup.io to make Canadian telecom news more accessible and digestible. Follow him on Twitter @ackmet


Repealing Section 230 Would Be Harmful to the Internet As We Know It, Experts Agree

While some advocate for a tightening of language, other experts believe Section 230 should not be touched.

Photo: Rep. Ken Buck, R-Colo., speaking on the floor of the House

WASHINGTON, September 17, 2021—Republican representative from Colorado Ken Buck advocated for legislators to “tighten up” the language of Section 230 while preserving the “spirit of the internet” and enhancing competition.

There is common ground in supporting efforts to minimize speech advocating imminent harm, said Buck, even though he noted that Republican and Democratic critics tend to approach changing Section 230 from vastly different directions.

“Nobody wants a terrorist organization recruiting on the internet or an organization that is calling for violent actions to have access to Facebook,” Buck said. He followed up that statement, however, by stating that the most effective way to combat “bad speech is with good speech” and not by censoring “what one person considers bad speech.”

Antitrust not necessarily the best means to improve competition policy

For companies that are not technically in violation of antitrust policies, improving competition through other means would have to be the answer, said Buck. He pointed to Parler as a social media platform that is an appropriate alternative to Twitter.

Though some Twitter users did flock to Parler, particularly during and around the 2020 election, the newer social media company has a reputation for allowing objectionable content that would be unable to thrive on more mainstream platforms.

Buck also set himself apart from some of his fellow Republicans—including Donald Trump—by clarifying that he does not want to repeal Section 230.

“I think that repealing Section 230 is a mistake,” he said. “If you repeal Section 230, there will be a slew of lawsuits.” Buck explained that without the protections afforded by Section 230, big companies will likely find a way to sufficiently address these lawsuits, and the only entities that will be harmed will be the alternative platforms that were meant to serve as competition.

More content moderation needed

Daphne Keller of the Stanford Cyber Policy Center argued that it is in the best interest of social media platforms to enact various forms of content moderation, and address speech that may be legal but objectionable.

“If platforms just hosted everything that users wanted to say online, or even everything that’s legal to say—everything that the First Amendment permits—you would get this sort of cesspool or mosh pit of online speech that most people don’t actually want to see,” she said. “Users would run away and advertisers would run away and we wouldn’t have functioning platforms for civic discourse.”

Even companies like Parler and Gab—which pride themselves on being unyielding bastions of free speech—have begun to engage in content moderation.

“There’s not really a left-right divide on whether that’s a good idea, because nobody actually wants nothing but porn and bullying and pro-anorexia content and other dangerous or garbage content all the time on the internet,” she added.

She explained that this is a double-edged sword, because while consumers seem to value some level of moderation, companies moderating their platforms have a huge amount of influence over what their consumers see and say.

What problems do critics of Section 230 want addressed?

Internet Association President and CEO Dane Snowden stated that most of the problems surrounding the Section 230 discussion boil down to a fundamental disagreement over the problems that legislators are trying to solve.

Changing the language of Section 230 would impact not just the tech industry: “[Section 230] impacts ISPs, libraries, and universities,” he said. “Things like self-publishing, crowdsourcing, Wikipedia, how-to videos—all those things are impacted by any kind of significant neutering of Section 230.”

Section 230 was created to give users the ability and security to create content online without fear of legal reprisals, he said.

Another significant supporter of the status quo was Chamber of Progress CEO Adam Kovacevich.

“I don’t think Section 230 needs to be fixed. I think it needs [a better] publicist,” Kovacevich said, adding that policymakers need to gain a better appreciation for Section 230. “If you took away 230, you’d give companies two bad options: either turn into Disneyland or turn into a wasteland.”

“Either turn into a very highly curated experience where only certain people have the ability to post content, or turn into a wasteland where essentially anything goes because a company fears legal liability,” Kovacevich said.



Judge Rules Exemption Exists in Section 230 for Twitter FOSTA Case

Latest lawsuit illustrates the increasing fragility of Section 230 legal protections.

Photo: Twitter CEO Jack Dorsey.

August 24, 2021—A California court has allowed a lawsuit against Twitter to proceed from two victims of sex trafficking, who allege the social media company initially refused to remove content that exploited the underage plaintiffs and then went viral.

The anonymous plaintiffs allege that they were manipulated into making pornographic videos of themselves through another social media app, Snapchat, after which the videos were posted on Twitter. When the plaintiffs asked Twitter to take down the posts, it refused, and it was only after the Department of Homeland Security got involved that the social media company complied.

At issue in the case is whether Twitter had any obligation to remove the content immediately under Section 230 of the Communications Decency Act, which shields platforms from legal liability for content their users post.

Court’s finding

The court ruled Thursday that the case should proceed after finding that Twitter knew such content was on the site, had to have known it was sex trafficking, and refused to do anything about it immediately.

“The Court finds that these allegations are sufficient to allege an ongoing pattern of conduct amounting to a tacit agreement with the perpetrators in this case to allow them to post videos and photographs it knew or should have known were related to sex trafficking without blocking their accounts or the Videos,” the decision read.

“In sum, the Court finds that Plaintiffs have stated a claim for civil liability under the [Trafficking Victims Protection Reauthorization Act] on the basis of beneficiary liability and that the claim falls within the exemption to Section 230 immunity created by FOSTA.”

The Stop Enabling Sex Traffickers Act and the Allow States and Victims to Fight Online Sex Trafficking Act, passed together in 2018 as the package law SESTA-FOSTA, amended Section 230 to exclude the enforcement of federal and state sex trafficking laws from its intermediary protections.

The court dismissed other claims the plaintiffs made against the company, but found the trafficking claim met the relatively low bar to move the case forward.

The arguments

The plaintiffs allege that Twitter violated the TVPRA because it knew about the videos, benefitted from them, and did nothing to address the problem before they went viral.

Twitter argued that FOSTA, as applied to the CDA, only narrowly applies to websites that are “knowingly assisting and profiting from reprehensible crimes;” the plaintiffs allegedly fail to show that the company “affirmatively participated” in such crimes; and the company cannot be held liable “simply because it did not take the videos down immediately.”

Experts asserted companies may hesitate to bring Section 230 defense in court

The case is yet another instance of U.S. courts poking holes in technology companies’ arguments that, under Section 230, they cannot be held liable for content on their platforms. The provision is currently the subject of heated debate in Washington over whether to reform it or abolish it entirely.

A number of state judges in Texas and California, for example, have ruled against Amazon and its Section 230 defense in specific cases. Experts on a panel in May said that if courts keep ruling against the defense, a deluge of lawsuits against companies may follow.

And last month, citing some of these cases, lawyers argued that big tech companies may begin to shy away from bringing the 230 defense to court in fear of awakening lawmakers to changing legal views on the provision that could ignite its reform.



Companies May Hesitate Bringing Section 230 Arguments in Court Fearing Political Ramifications: Lawyers

Legal experts say changing views on Section 230 will make platforms less willing to employ that defense in future cases.

Photo: Carrie Goldberg, founder of C.A. Goldberg law firm

July 14, 2021—Legal experts are speculating that companies may shy away from testing Section 230 arguments in future court cases because recent legal decisions against the defense could influence political action on amending the intermediary liability provision.

Section 230 of the Communications Decency Act offers online platforms immunity from civil liability based on content their users post on their websites. But recent decisions by various courts that have ruled against the companies’ Section 230 defenses and held them liable for incidents could have a lasting effect on how companies approach these cases.

“People are being a lot more thoughtful when they use a 230 defense, and sometimes not using one at all, because they realize that that just won’t bode well for their future cases,” Michele Lee, assistant general counsel and the head of litigation at social media company Pinterest, said at a conference hosted by the Federal Communications Bar Association on Tuesday.

“The number of companies that operate within this space, frankly, aren’t that many. And I think people are thinking much more long term than just the cases that are in front of them.”

Legal experts at the conference argued that firms will be increasingly selective about the cases in which they elect to employ a Section 230 defense. The more attention the defense receives, they argue, the more political scrutiny it draws, which could reignite discussion about reform.

Debate about what to do with Section 230 has gripped Capitol Hill for many months, with discussions reaching their climax after former President Donald Trump was banned from several platforms at the start of the year for comments he made on the services that allegedly stoked the Capitol riot on January 6.

Since then, several proposed amendments were put forth, including from Sen. Amy Klobuchar, D-Minnesota, who proposed to keep Section 230 protections largely the same except for paid content.

And last month, Sen. Marco Rubio, R-Florida, introduced his own proposed legislation, which would “halt Big Tech’s censorship of Americans, defend free speech on the internet, and level the playing field to remove unfair protections that shield massive Silicon Valley firms from accountability.”

Legal precedent and policy: two vehicles for change

The concern for companies that provide platforms for the flow of information is that they could lose certain liability protections through legislation or a change in precedent. Historically, those protections did not take up much mental real estate for Congress or the White House, and they were routinely upheld in court.

But that tide may be shifting.

In May, a court ruled against popular messaging company Snapchat’s Section 230 defense, finding that it could be held civilly liable for creating a dangerous product. The case followed the death of a 20-year-old Snapchat user who crashed his car in 2020 while using a filter on the app that rewarded fast driving.

The car reached 120 miles per hour at one point, and the crash also killed two teenage passengers. Two of the victims’ parents sued Snapchat for wrongful death, claiming that the filter’s reward system encouraged reckless driving.

The case was initially thrown out of court on Section 230 grounds, but the Ninth Circuit Court of Appeals revived it, reversing the ruling in favor of the victims and holding that Snapchat could be liable for creating an inherently dangerous product.

Carrie Goldberg, founder of C.A. Goldberg, a victims’ rights law firm, said Tuesday that this ruling offers a “small window of online platform accountability,” in which platforms might be held liable for published content when that content demonstrates a harm to the public.

Goldberg referenced another case out of Texas last month, where the state’s supreme court ruled that Facebook could be held liable after three plaintiffs filed separate suits against the company, alleging that they became victims of sex trafficking after being lured in by people they met on Facebook and Instagram.

Facebook claimed immunity through Section 230, but the court sided with the plaintiffs, saying the provision does not “create a lawless no-man’s-land on the Internet.” The court made a further clarification that Section 230 protects online platforms from the words or actions of others, but “[h]olding internet platforms accountable for their own misdeeds is quite another thing.”

This particular case may apply only within Texas, however, and have little impact on the rest of the country, as part of it was fought using a Texas-specific statute that allows civil lawsuits “against those who intentionally or knowingly benefit from participation in a sex-trafficking venture.”

In May, observers noted that a number of these legal decisions reversing course on Section 230 matters could open the floodgates to other lawsuits across the country.

