Frances Haugen, U.S. House Witnesses Say Facebook Must Address Social Harms

The former Facebook employee-turned-whistleblower said the company must be held accountable for the social harm it causes.

Facebook whistleblower Frances Haugen

WASHINGTON, December 2, 2021 – Facebook whistleblower Frances Haugen told the House Subcommittee on Communications and Technology on Wednesday that the panel must act to investigate the social harms Facebook causes consumers.

Haugen said Congress should be concerned about how Facebook’s products are used to influence vulnerable populations.

Haugen’s testimony, delivered at Wednesday’s subcommittee hearing, urged lawmakers to impose accountability and transparency safeguards on Facebook to prevent it from misleading the public. It comes on the heels of her first testimony in October, before the Senate Subcommittee on Consumer Protection, Product Safety, and Data Security, in which she urged Congress to force Facebook to make its internal research public, arguing the company cannot be trusted to act on that research itself.

That testimony came after she leaked documents to the Wall Street Journal and the Securities and Exchange Commission suggesting that Facebook knew about the negative mental health effects its photo-sharing app Instagram had on teen users but allegedly did nothing to combat them.

“No efforts to address these problems are ever going to be effective if Facebook is not required to share data in support of its claims or be subject to oversight of its business decisions,” Haugen said Wednesday. “The company’s leadership keeps vital information from the public, the U.S. government, its shareholders, and governments around the world. The documents I have provided prove that Facebook has repeatedly misled us about what its own research reveals about the safety of children, its role in spreading hateful and polarizing messages, and so much more.”

Facebook’s impact on communities of color

Among the social harms that advocates highlighted, lawmakers were particularly interested in Facebook’s negative impact on communities of color. Rashad Robinson, president of online racial justice organization Color of Change, expressed frustration at technology companies’ disregard for the truth.

“I have personally negotiated with leaders and executives at Big Tech corporations like Facebook, Google, Twitter and Airbnb, including Mark Zuckerberg, over a number of years,” Robinson said. “I sat across the table from him, looking into his eyes, experiencing firsthand the lies, evasions, ignorance and complete lack of accountability to any standard of safety for Black people and other people of color.”

Robinson recalled during the height of the national racial justice protests in 2020 that Zuckerberg told him that the harms Black people were experiencing on Facebook “weren’t reflected in their own internal data.” Now, Robinson said, “we know from the documents shared by Frances Haugen and others that his internal researchers were, in fact, sounding alarms at the exact same time.”

Robinson also pointed to Facebook’s own data showing that the company disables Black users’ accounts more often than white users’ for less extreme content, “often for just talking about the racism they face,” he said.

To foster real solutions for social media consumer protection, Robinson suggested that lawmakers reform Section 230 of the Communications Decency Act to hold companies accountable for minimizing the adverse impact of the content from which they profit.

Currently, Section 230 shields online platforms from liability for harmful content posted by their users. Conservative critics say the law should be repealed because it gives social media companies too much power to censor conservative voices, while proponents argue that the law is necessary in some capacity because it allows for the free exchange of thoughts and ideas.

Robinson said reforming Section 230 to impose liability on companies for content on their sites would “protect people against Big Tech design features that amplify or exploit content that is clearly harmful to the public.”

These recommendations came as the House considered four social media consumer protection bills on Wednesday: H.R. 2154, the “Protecting Americans from Dangerous Algorithms Act”; H.R. 3184, the “Civil Rights Modernization Act of 2021”; H.R. 3421, the “Safeguarding Against Fraud, Exploitation, Threats, Extremism, and Consumer Harms Act” or the “SAFE TECH Act”; and H.R. 5596, the “Justice Against Malicious Algorithms Act of 2021.”