Pressed by Congress, Big Tech Defends Itself and Offers Few Solutions After Capitol Riot
March 26, 2021 – The heads of the largest social media companies largely defended their platforms, reiterated what they’ve done, and offered few solutions to the problems that ail them during a congressional hearing Thursday.
But under harsh questioning from the House Energy and Commerce Committee, none of the CEOs of Google, Facebook or Twitter was given more than 30 to 60 seconds to respond to questions on any given topic.
The hearing focused on misinformation on social media in the wake of the January 6 Capitol riot. The CEOs said dealing with the problem of dis- and misinformation on their platforms is more difficult than people think.
“The responsibility here lies with the people who took the actions to break the law and do the insurrection,” Facebook CEO Mark Zuckerberg said in response to a question about whether the platforms were to blame for the riot.
“Secondarily, also, the people who spread that content, including the president, but others as well, with repeated rhetoric over time, saying that the election was rigged and encouraging people to organize. I think those people bear the primary responsibility as well,” Zuckerberg said.
Zuckerberg added that “polarization was rising in America long before social networks were even invented,” blaming the “political and media environment that drives Americans apart.”
A ‘complex question’ of fault
Asked who was at fault for the riot, Google CEO Sundar Pichai called it a “complex question.” Twitter CEO Jack Dorsey, however, was more direct: “Yes, but you also have to take into consideration a broader ecosystem; it’s not just about the technology platforms we use,” he said.
It was the first time Zuckerberg, Dorsey and Pichai appeared on Capitol Hill since the January 6 insurrection at the U.S. Capitol. The hearing was spurred by the riot and the turbulent presidential election that concluded in Joe Biden’s win and Donald Trump’s ban from Twitter and Facebook. For several months, Congress has turned its eye toward the social media companies and possible Section 230 reform to address alleged problems in the tech industry.
“Our nation is drowning in misinformation driven by social media. Platforms that were once used to share [photos of] kids with grandparents are all-too-often havens of hate, harassment and division,” said Rep. Mike Doyle, D-Pa., chairman of the Communications and Technology subcommittee, who led the hearing. Doyle alleged the platforms “supercharged” the riot.
Both Democratic and Republican members of the committee laid out a variety of grievances during the five-hour meeting, and while they didn’t all share the same concerns, all agreed that something needs to be done.
“I hope you can take away from this hearing how serious we are, on both sides of the aisle, to see many of these issues that trouble Americans addressed,” Doyle said.
Congressional concerns
On the left side of the political aisle, the main criticism of the tech giants was the spread of misinformation and extremism, including falsehoods about COVID-19 vaccines, climate change and the 2020 presidential election, which Trump alleged was rigged against him.
“It is not an exaggeration to say that your companies have fundamentally and permanently transformed our very culture, and our understanding of the world,” said Rep. Jan Schakowsky, D-Illinois. “Much of this is for good, but it is also true that our country, our democracy, even our understanding of what is ‘truth’ has been harmed by the proliferation and dissemination of misinformation and extremism,” she said.
“Unfortunately, this disinformation and extremism doesn’t just stay online, it has real-world, often dangerous and even violent consequences, and the time has come to hold online platforms accountable,” said Rep. Frank Pallone, D-N.J.
From the right, Republican members voiced concerns about excessive censorship, easy access to opioids, and the harm they said social media does to children.
“I’m deeply concerned by your decisions to operate your companies in a vague and biased manner, with little to no accountability, while using Section 230 as a shield for your actions and their real-world consequences,” said Rep. Bob Latta, R-Ohio. “Your companies had the power to silence the president of the United States, shut off legitimate journalism in Australia, shut down legitimate scientific debate on a variety of issues, dictate which articles or websites are seen by Americans when they search the internet,” he said.
“Your platforms are my biggest fear as a parent,” said Rep. Cathy McMorris Rodgers, R-Washington, expressing frustration over the impact that social media has on children. “It’s a battle for their development, a battle for their mental health, and ultimately, a battle for their safety,” she said, citing a rise of teen suicides since 2011. “I do not want you defining what is true for them, I do not want their future manipulated by your algorithms,” she said.
Platforms say it’s challenging, reiterate initiatives
In response to the many criticisms, Zuckerberg made clear that while moderating content is central to addressing misinformation, it is important to protect speech as much as possible while taking down illegal content, which he said can be a huge challenge. Bullying, for example, hurts the victim, but there is not always a clear line for simply censoring speech, he said.
Pichai said that Google’s mission is about organizing and delivering information to the world and allowing free expression while also combatting misinformation. But it’s an evolving challenge, he said, because approximately 15 percent of Google searches each day are new, and 500 hours of video are uploaded to YouTube every minute. To reinforce that point, he cited the fact that 18 months ago no one had heard of COVID-19, and in 2020 ‘coronavirus’ was the top trending search.
Dorsey expressed a similar sentiment about the evolving challenge of balancing freedom of expression with content moderation. “We observe what’s happening on our service, we work to understand the ramifications, and we use that understanding to strengthen our operations. We push ourselves to improve based on the best information we have,” he said.
The best way to face new challenges is to narrow down the problem in order to have the greatest impact, Dorsey said. Disinformation, for example, is a broad concept, so Twitter focused on disinformation that leads to offline harm, he said. The company concentrated on three specific categories: manipulated media, public health and civic integrity.
“Ultimately, we’re running a business, and a business wants to grow the number of customers it serves. Enforcing a policy is a business decision,” Dorsey said.
Dorsey noted Twitter’s new Bluesky project, a decentralized internet protocol that various social media companies would be able to use, rather than one owned by a single company. He said it would improve the social media environment by spurring innovation around business models, recommendation algorithms, and moderation controls placed in the hands of individuals instead of private companies. But others already working in a similar technology space say the project is not without its problems.
On Section 230 reform
On the question of changing Section 230 of the Communications Decency Act, which grants social media companies immunity from liability for user-generated content, Zuckerberg suggested two specific changes: platforms need to issue transparency reports about harmful content, and they need better moderation of content that is clearly illegal. These changes should only affect large social media platforms, he said, but he did not specify where the line between a large and a small platform would fall.
Dorsey said those may be good ideas, but that it could be difficult to determine what counts as a large or small platform, and that such stipulations may incentivize the wrong things.
When asked about Instagram’s new version for children, Zuckerberg confirmed it was in the planning stage and many details were still being worked out.
Several Democrats raised concerns about minority populations, citing as one example the March 16 shooting in Atlanta that killed eight people, including several Asian American women. Rep. Doris Matsui, D-Calif., asked why various hashtags such as #kungflu and #chinavirus were not removed from Twitter.
Dorsey responded that Twitter does take action against hate speech, but it can also be a challenge because it’s not always simple to distinguish between content that supports an idea and counter speech that condemns the support of that idea.
Multiple members asked the tech leaders about specific instances in which the platforms’ algorithms failed at content moderation. Democrats pointed to posts containing misinformation or hate speech that remained up, while Republicans cited conservative content that was removed.
Both Zuckerberg and Dorsey said their systems are not perfect and that it is not realistic to expect perfection. Some content will always slip past the automated systems and have to be addressed individually, Zuckerberg said.
In response to Rep. Steve Scalise, R-La., referencing a 2020 New York Post story about Hunter Biden that was taken down, Dorsey acknowledged that the company has made mistakes in some instances.
Editor’s Note: This story has been revised to add a second paragraph that more accurately captures the fact that, while the tech executives offered few solutions, they were given little opportunity to do so by members of Congress. Additionally, the word “secondarily” was added back into Facebook CEO Mark Zuckerberg’s statement about who bore responsibility for the insurrection.