Lawyers: FTC ‘Censorship’ Inquiry Itself Violates First Amendment

Critics say the agency is acting beyond its authority by probing how platforms police speech.

Photo from an event hosted Thursday, May 15, 2025, by TechFreedom and the Competitive Enterprise Institute. From left to right: Berin Szóka, president and founder of TechFreedom; Scott Wilkens, senior counsel at the Knight First Amendment Institute; Bob Corn-Revere, chief counsel at the Foundation for Individual Rights and Expression; Jonathan Emord, constitutional and administrative lawyer; and Casey Mattox, vice president for legal strategy at Stand Together.

WASHINGTON, May 21, 2025 – Federal regulators are weighing whether social media companies’ content moderation practices violate competition or consumer protection laws. But First Amendment lawyers and civil society advocates warn that the Federal Trade Commission is walking into constitutionally dangerous territory.

During a panel hosted Thursday by TechFreedom and the Competitive Enterprise Institute, legal experts said the FTC’s inquiry into how platforms enforce their content rules could violate longstanding legal precedent that shields editorial discretion from government interference.

The FTC issued a Request for Information in February seeking details on how platforms make decisions about what speech to host, remove, or amplify. Chairman Andrew Ferguson has framed the inquiry as a potential antitrust issue, warning in March that what he called censorship by dominant platforms could fall under the FTC’s authority to address anticompetitive or deceptive conduct.

Pushing back on 'anticompetitive' framing

But panelists pushed back on that framing, emphasizing that any federal attempt to influence platform moderation decisions – even through informal mechanisms like public inquiries – risks chilling speech and infringing on First Amendment protections.

“The First Amendment is intentionally designed to deprive the government of any power over speech and press,” said Jonathan Emord, a constitutional attorney and former FCC litigator. “Yet, we’ve constantly, up to the present moment, seen extraordinary examples of government censorship.”

Bob Corn-Revere, a First Amendment attorney and partner at Davis Wright Tremaine, said the FTC may be acting ultra vires, or beyond its statutory authority. 

“Ferguson hasn’t been coy about the motivation behind this proceeding,” Corn-Revere said, pointing to a recent interview in which the FTC chairman cited concerns about platforms suspending accounts of politicians tied to the January 6 Capitol riots. “There’s really no question about what the FTC is hoping to do,” he added.

Panelists cited the Supreme Court’s 2024 ruling in Moody v. NetChoice, consolidated with NetChoice v. Paxton, which reaffirmed that private platforms have a constitutional right to make content moderation decisions, similar to the editorial freedom granted to newspapers.

Emord further argued that under the Supreme Court’s decision in Loper Bright Enterprises v. Raimondo, which overturned the Chevron doctrine, the FTC lacks a statutory basis to regulate content moderation at all. “There is no foundation in the FTC Act – Sections 5, 12, or 45 – for the agency to engage in this kind of inquiry,” he said.

A second panel challenged the FTC inquiry's assumptions

A separate panel of researchers hosted Monday by TechFreedom challenged the FTC’s assumption that conservative content was being disproportionately censored. 

Citing both a 2022 Nature study by MIT’s David Rand and a March 2025 analysis led by Dean Jackson of the National Endowment for Democracy, experts said higher suspension rates for conservative users correlate with a greater tendency to share low-quality or misleading information, a pattern that held even when the quality ratings came from conservative raters.

Jackson’s study, which compiled input from 14 researchers, further found that platform algorithms often amplify conservative content rather than suppress it, complicating claims of political bias.

“Policymakers must recognize that some partisan disparities in enforcement are inevitable even under neutral rules aimed at curbing the spread of false information,” Jackson said.

Other panelists emphasized that the FTC can address real harms in ways that respect constitutional boundaries. Lisa Macpherson, senior policy analyst at Public Knowledge, argued the commission should focus on promoting competition and transparency without interfering in content decisions.

“The Commission can use its Section 5 powers related to competition and consumer protection to help ensure free expression in technology, and do so in a content-neutral, constitutionally compatible way,” she said.

Macpherson urged the FTC to pursue pro-competition measures such as data portability and interoperability, which could shrink the dominance of major platforms and open the door for smaller competitors.

On the consumer protection front, she recommended that the FTC ensure platforms provide clear terms of service, robust appeal processes, and meaningful human interaction, particularly when automated systems are used to enforce moderation rules. Platforms that misrepresent or fail to enforce their own community standards, she said, should be held accountable.

Finally, Macpherson said the FTC should push for privacy-protected access to platform data for independent researchers in order to better understand the true impact of algorithms and moderation policies.

Comments on the FTC’s inquiry were due Wednesday, May 21, 2025. As of publication, more than 3,300 submissions had been filed in the FTC’s docket.
