Big Tech
Panelists Recommend More Concentrated Focus on Federal Privacy Legislation
The two parties have opposite aims for what to do about Section 230.
WASHINGTON, March 9, 2023 – The partisan stalemate over addressing Section 230 concerns in Washington gives lawmakers an opportunity to focus on crafting federal privacy legislation, according to panelists at Broadband Breakfast’s Big Tech and Speech Summit on Thursday.
The Democrats and the Republicans have taken opposite positions on what to do with the liability provision of the Communications Decency Act, which shields technology platforms from the legal consequences of what their users post. That division – over whether platforms should moderate more or less content – coupled with the Republicans taking back the House, means the issue may not be resolved in a timely manner, if ever.
That’s why lawmakers should instead focus on federal privacy legislation, to avoid the negative effects of a patchwork of state laws with differing interpretations of privacy, according to panelists at the event Thursday.

Photo of Subcommittee Chair Gus Bilirakis at Big Tech & Speech Summit Thursday by Tim Su
“You cannot have 50 different regimes to manage the privacy and data breach regulations of the companies,” said Steve DelBianco, president and CEO of NetChoice, a trade association for free speech and enterprise. “I am not so worried about Section 230 because the two parties that run this country have completely opposite aims in mind for 230.” NetChoice has been one of the main opponents to social media laws in Florida and Texas that would restrict certain moderation practices by tech platforms.
Dane Snowden, senior advisor at telecom law firm Wilkinson Barker Knauer, also noted that there’s no common definition of the Section 230 problem. “The challenge that we have right now is there’s not a common definition of the problem that you’re trying to fix…until you have that, you’re going to have both parties going in opposite directions on 230.
“I think privacy is the number one thing we should focus on – we need to have a national privacy framework.” Snowden illustrated the problem by using the example of a product that must go through multiple jurisdictions to get to its destination. He said this is unfeasible when those jurisdictions have different laws.
But Eli Noam, the director of Columbia University’s Institute for Tele-Information – who gave a keynote speech on the use of artificial intelligence for the metaverse – said there may be some upside to state privacy laws, because they would allow states to explore and experiment with privacy rules.
On Section 230, Amy Peikoff, head of policy and legal at social media company Parler, said she’s glad there’s a stalemate, because “any amendment that would come forth right now would make it worse.”
Earlier this month, members of the House Innovation, Data and Commerce subcommittee reiterated their support for federal privacy legislation and discussed how to build on the American Data Privacy and Protection Act, which was introduced before the midterm-induced turnover in Congress.
The ADPPA addressed algorithmic bias testing, limits on targeted advertising to kids, and a preemption provision that would allow the federal law to supersede state laws.
Subcommittee Chair Gus Bilirakis opened the summit with remarks about the need to amend Section 230 to address problems associated with kids’ use of social media, including suicidal ideation.
To watch the full videos join the Broadband Breakfast Club below. We are currently offering a Free 30-Day Trial: No credit card required!
Social Media
Congress Grills TikTok CEO Over Risks to Youth Safety and China
House lawmakers presented a united front against TikTok as calls for a national ban gain momentum.

WASHINGTON, March 24, 2023 — TikTok CEO Shou Zi Chew faced bipartisan hostility from House lawmakers during a high-profile hearing on Thursday, struggling to alleviate concerns about the platform’s safety and security risks amid growing calls for the app to be banned from the United States altogether.
For more than five hours, members of the House Energy and Commerce Committee lobbed criticisms at TikTok, often leaving Chew little or no time to address their critiques.
“TikTok has repeatedly chosen the path for more control, more surveillance and more manipulation,” Chair Cathy McMorris Rodgers, R-Wash., told Chew at the start of the hearing. “Your platform should be banned. I expect today you’ll say anything to avoid this outcome.”
“Shou came prepared to answer questions from Congress, but, unfortunately, the day was dominated by political grandstanding,” TikTok spokesperson Brooke Oberwetter said in a statement after the hearing.
In a viral TikTok video posted Tuesday, and again in his opening statement, Chew noted that the app has over 150 million active monthly users in the United States. TikTok has also become a place where “close to 5 million American businesses — mostly small businesses — go to find new customers and to fuel their growth,” he said.
But McMorris Rodgers argued that the platform’s significant reach only “emphasizes the urgency for Congress to act.”
Lawmakers condemn TikTok’s impact on youth safety and mental health
One of the top concerns highlighted by both Republicans and Democrats was the risk TikTok poses to the wellbeing of children and teens.
“Research has found that TikTok’s addictive algorithms recommend videos to teens that create and exacerbate feelings of emotional distress, including videos promoting suicide, self-harm and eating disorders,” said Ranking Member Frank Pallone, D-N.J.
Chew emphasized TikTok’s commitment to removing explicitly harmful or violative content. The company is also working with entities such as the Boston Children’s Hospital to find models for content that might harm young viewers if shown too frequently, even if the content is not inherently negative — for example, videos of extreme fitness regimens, Chew explained.
In addition, Chew listed several safeguards that TikTok has recently implemented for underage users, such as daily default time limits and the prevention of private messaging for users under 16.
However, few lawmakers seemed interested in these measures, with some noting that they appeared to lack enforceability. Others emphasized the tangible costs of weak safety policies, pointing to multiple youth deaths linked to the app.
Rep. Gus Bilirakis, R-Fla., shared the story of a 16-year-old boy who died by suicide after being served hundreds of TikTok videos glorifying suicidal ideation, self-harm and depression — even though such content was unrelated to his search history, according to a lawsuit filed by his parents against the platform.
At the hearing, Bilirakis underscored his concern by playing a series of TikTok videos with explicit descriptions of suicide, accompanied by messages such as “death is a gift” and “Player Tip: K!ll Yourself.”
“Your company destroyed their lives,” Bilirakis told Chew, gesturing toward the teen’s parents. “Your technology is literally leading to death, Mr. Chew.”
Watch Rep. Bilirakis’ keynote address from the Big Tech & Speech Summit.
Other lawmakers noted that this death was not an isolated incident. “There are those on this committee, including myself, who believe that the Chinese Communist Party is engaged in psychological warfare through TikTok to deliberately influence U.S. children,” said Rep. Buddy Carter, R-Ga.
TikTok CEO emphasizes U.S. operations, denies CCP ties
Listing several viral “challenges” encouraging dangerous behaviors and substance abuse, Carter questioned why TikTok “consistently fails to identify and moderate these kinds of harmful videos” — and claimed that no such content was present on Douyin, the version of the app available in China.

Screenshot of Rep. Buddy Carter courtesy of CSPAN
Chew urged legislators to compare TikTok’s practices with those of other U.S. social media companies, rather than a version of the platform operating in an entirely different regulatory environment. “This is an industry challenge for all of us here,” he said.
Douyin heavily restricts political and controversial content in order to comply with China’s censorship regime, while the U.S. currently grants online platforms broad immunity from liability for third-party content.
In response to repeated accusations of CCP-driven censorship, particularly regarding the Chinese government’s human rights abuses against the Uyghur population, Chew maintained that related content “is available on our platform — you can go and search it.”
“We do not promote or remove content at the request of the Chinese government,” he repeatedly stated.
A TikTok search for “Uyghur genocide” on Thursday morning primarily displayed videos that were critical of the Chinese government, Broadband Breakfast found. The search also returned a brief description stating that China “has committed a series of ongoing human rights abuses against Uyghurs and other ethnic and religious minorities,” drawn from Wikipedia and pointing users to the U.S.-based website’s full article on the topic.
TikTok concerns bolster calls for Section 230 reform
Although much of the hearing was specifically targeted toward TikTok, some lawmakers used those concerns to bolster an ongoing Congressional push for Section 230 reform.
“Last year, a federal judge in Pennsylvania found that Section 230 protected TikTok from being held responsible for the death of a 10-year-old girl who participated in a blackout challenge,” said Rep. Bob Latta, R-Ohio. “This company is a picture-perfect example of why this committee in Congress needs to take action immediately to amend Section 230.”
In response, Chew referenced Latta’s earlier remarks about Section 230’s historical importance for online innovation and growth.
“As you pointed out, 230 has been very important for freedom of expression on the internet,” Chew said. “[Free expression] is one of the commitments we have given to this committee and our users, and I do think it’s important to preserve that. But companies should be raising the bar on safety.”
Rep. John Curtis, R-Utah, asked whether TikTok’s use of algorithmic recommendations should forfeit the company’s Section 230 protections — echoing the question at the core of Gonzalez v. Google, which was argued before the Supreme Court in February.
Other inquiries were more pointed. Chew declined to answer a question from Rep. Randy Weber, R-Texas, about whether “censoring history and historical facts and current events should be protected by Section 230’s good faith requirement.”
Weber’s question seemed to incorrectly suggest that the broad immunity provided by Section 230(c)(1) is conditioned on the “good faith” referenced in part (c)(2)(A) of the statute.
Ranking member says ongoing data privacy initiative is unacceptable
Chew frequently pointed to TikTok’s “Project Texas” initiative as a solution to a wide range of data privacy concerns. “The bottom line is this: American data, stored on American soil, by an American company, overseen by American personnel,” he said.
All U.S. user data is now routed by default to Texas-based company Oracle, Chew added, and the company aims to delete legacy data currently stored in Virginia and Singapore by the end of the year.
Several lawmakers pointed to a Thursday Wall Street Journal article in which China’s Commerce Ministry reportedly said that a sale of TikTok would require exporting technology, and therefore would be subject to approval from the Chinese government.
When asked if Chinese government approval was required for Project Texas, Chew replied, “We do not believe so.”
But many legislators remained skeptical. “I still believe that the Beijing communist government will still control and have the ability to influence what you do, and so this idea — this ‘Project Texas’ — is simply not acceptable,” Pallone said.
Free Speech
Additional Content Moderation for Section 230 Protection Risks Reducing Speech on Platforms: Judge
People will migrate away from platforms with overly stringent content moderation measures.

WASHINGTON, March 13, 2023 – Requiring companies to moderate more content as a condition of Section 230 legal liability protections runs the risk of alienating users from platforms and discouraging communications, a judge of the U.S. Court of Appeals for the District of Columbia Circuit argued last week.
“The criteria for deletion are vague and difficult to parse,” Douglas Ginsburg, a Ronald Reagan appointee, said at a Federalist Society event on Wednesday. “Some of the terms are inherently difficult to define and policing what qualifies as hate speech is often a subjective determination.”
“If content moderation became very rigorous, it is obvious that users would depart from platforms that wouldn’t run their stuff,” Ginsburg added. “And they will try to find more platforms out there that will give them a voice. So, we’ll have more fragmentation and even less communication.”
Ginsburg noted that the large technology platforms already moderate a massive amount of content, adding that further moderation would be quite challenging.
“Twitter, YouTube and Facebook remove millions of posts and videos based on those criteria alone,” Ginsburg noted. “YouTube gets 500 hours of video uploaded every minute – 30,000 minutes of video coming online every minute. So the task of moderating this is obviously very challenging.”
John Samples, a member of Meta’s Oversight Board – which provides direction for the company on content – suggested Thursday that out-of-court dispute institutions for content moderation may become the preferred method of settlement.
The United States may adopt European processes in the future as Europe takes the lead in regulating big tech, Samples claimed.
Such a system “would largely be a private system,” he said, and could unify and centralize social media moderation across platforms and around the world. He was referring to the European Union’s Digital Services Act, which took effect in November 2022 and requires platforms to remove illegal content and ensure that users can contest removals of their content.
Antitrust
Panel Disagrees on Antitrust Bills’ Promotion of Competition
Panelists disagree on the effects of two antitrust bills intended to promote competition.

WASHINGTON, March 10, 2023 – In a fiery debate Thursday, panelists at Broadband Breakfast’s Big Tech and Speech Summit disagreed on the effect of bills intended to promote competition and innovation in the Big Tech platform space, particularly for search engines.
One such innovation is new artificial intelligence technology being designed to pull everything a user searches for into a single page, said Cheyenne Hunt-Majer, big tech accountability advocate with Public Citizen. It is built to keep users on the site and will drastically change competition in the search engine space, she said, touting the advancement of two bills currently awaiting a Senate vote.

Photo of Adam Kovacevich of Chamber of Progress, Berin Szoka of TechFreedom, Cheyenne Hunt-Majer of Public Citizen, Sacha Haworth of Tech Oversight Project, Christine Bannan of Proton (left to right)
The first, the American Innovation and Choice Online Act, would prohibit tech companies from self-preferencing their own products on their platforms over third-party competition. The second, the Open App Markets Act, would prevent app stores from requiring private app developers to use the app stores’ in-app payment system.
Hunt-Majer said she believes that the bills would benefit consumers by kindling more innovation in big tech. “Perfect should not be the enemy of change,” she said, claiming that Congress must start somewhere, even if the bills are not perfect.
“We are seeing a jump ahead in a woefully unprepared system to face these issues and the issues it is going to pose for a healthy market of competition and innovation,” said Hunt-Majer.
It is good for consumers to be able to find other ways to search that Google isn’t currently providing, agreed Christine Bannan, U.S. public policy manager at privacy-focused email service Proton. The fundamental goal of these bills is directly at odds with the interests of big companies, she said, which underscores their importance in curbing anti-competitive behavior.
No need to rewrite or draft new laws for competition
But Berin Szoka, president of non-profit technology organization TechFreedom, said that while competition concerns are valid, the Federal Trade Commission is best equipped to handle such disputes without the need to rewrite or draft new laws. Congress must legislate carefully to avoid unintended consequences that fundamentally harm businesses, and no legislation to date has met that bar, he said.
Both bills contain broad anti-discrimination provisions that would affect Big Tech partnerships, Szoka continued.
Not all experts believe that AI will replace search engines, however. Google has already adopted specialized search results that directly answer search queries, such as math problems, instead of resulting in several links to related webpages, said Adam Kovacevich, CEO of Chamber of Progress, a center-left tech policy coalition.
Kovacevich said he believes that some search queries demand direct answers while others demand a wide range of sources, answers, and opinions. He predicts that there will be a market for both AI and traditional search engines like Google.