Social Media

Twitter Takeover by Elon Musk Forces Conflict Over Free Speech on Social Networks

Transparency laws in Calif. and N.Y. are the ‘liberal’ counterpart to the ‘conservative’ speech laws in Texas and Florida.

WASHINGTON, November 23, 2022 — As the Supreme Court prepares to hear two cases that may decide the future of content moderation, panelists on a Broadband Breakfast Live Online panel disagreed over the steps that platforms can and should take to ensure fairness and protect free speech.

Mike Masnick, founder and editor of Techdirt, argued that both sides of the aisle were attempting to control speech in one way or another, pointing to laws in California and New York as the liberal counterpoints to the laws in Texas and Florida that are headed to the Supreme Court.

“They’re not as blatant, but they are nudging companies to moderate in a certain way,” he said. “And I think those are equally unconstitutional.”

Censorship posed a greater threat to the ideal of free speech than would a law forcing platforms to carry certain content, said Bret Swanson, a nonresident senior fellow at the American Enterprise Institute.

“Free speech and pluralism, as an ethos for the country and really for the West, are in fact more important than the First Amendment,” he said.

At the same time, content moderation legislation is stalled by a sharp partisan divide, said Mark MacCarthy, a nonresident senior fellow in governance studies at the Brookings Institution’s Center for Technology Innovation.

“Liberals and progressives want action to remove lies and hate speech and misinformation from social media and the conservatives want equal time for conservative voices, so there’s a logjam gridlock that can’t move,” he said. “I think it might be broken if, as I predict, the Supreme Court says that the only way you can regulate social media companies is through transparency.”

Twitter’s past and current practices raise questions about bias and free speech

While talking about Elon Musk’s controversial changes to Twitter’s content moderation practices, panelists also discussed the impact of Musk’s rhetoric surrounding the topic more broadly.

“Declaring yourself as a free speech site without understanding what free speech actually means is something that doesn’t last very long,” Masnick said.

When a social media company like Twitter or Parler declares itself to be a “free speech site,” it is really just sending a signal to some of the worst people and trolls online to begin harassment, abuse and bigotry, he said.

That is not a sustainable business model, Masnick argued.

But Swanson took the opposite view. He called Musk’s acquisition of Twitter “a real seminal moment in the history and the future of free speech” and an antidote to “the most severe collapse of free speech maybe in American history.”

MacCarthy said he didn’t believe the oft-repeated assertion that Twitter was biased against conservatives before Musk took over. “The only study I’ve seen of political pluralism on Twitter — and it was done by Twitter itself back when they had the staff to do that kind of thing — suggested that Twitter’s amplification and recommendation engines actually favored conservative tweets over liberal ones.”

Masnick agreed, pointing to other academic studies: “They seemed to bend over backwards to often allow conservatives to break the rules more than others,” he said.

Randolph May, president of The Free State Foundation, said that he was familiar with the studies but disagreed with their findings.

Citing the revelations from the laptop of Hunter Biden, a story that the New York Post broke in October 2020 about Joe Biden’s son, May said: “To me, that was a consequential censorship action. Then six months later, before a congressional committee, [Twitter CEO] Jack Dorsey said, ‘Oops, we made a big mistake when we took down the New York Post stories.’”

Multiple possibilities for the future of content moderation

Despite his criticism of current practices, May said he did not believe platforms should eliminate content moderation practices altogether. He drew a distinction between topics subject to legitimate public debate and those posts that encourage terrorism or facilitate sex trafficking. Those kinds of posts should be subject to moderation practices, he said.

May made three suggestions for better content moderation practices: First, platforms should establish a presumption that they will not censor or downgrade material without clear evidence that their terms of service have been violated.

Second, platforms should work to enable tools that facilitate personalization of the user experience.

Finally, the current state of Section 230 immunity should be replaced with a “reasonableness standard,” he said.

Other panelists disagreed with the subjectivity of such a reasonableness standard. MacCarthy highlighted the Texas social media law, which bans discrimination based on viewpoint. “Viewpoint is undefined: What does that mean?” he asked.

“Does it mean you can’t get rid of Nazi speech, you can’t get rid of hate speech, you can’t get rid of racist speech? What does it mean? No one knows. And so here’s a requirement of government that no one can interpret. If I were the Supreme Court, I’d declare that void for vagueness in a moment.”

MacCarthy predicted that the Supreme Court would reject the content-based provisions in the Texas and Florida laws while upholding the transparency standard, opening the door, he argued, for bipartisan transparency legislation.

But to Masnick, even a transparency requirement alone would be an unsatisfactory result: “How would conservatives feel if the government said, ‘Fox News needs to be transparent about how they make their editorial decision making?’”

“I think everyone would recognize immediately that that is a huge First Amendment concern,” he said.

Our Broadband Breakfast Live Online events take place on Wednesday at 12 Noon ET. Watch the event on Broadband Breakfast, or REGISTER HERE to join the conversation.

Wednesday, November 23, 2022, 12 Noon ET – Elon and Ye and Donald, Oh My!

With Elon Musk finally taking the reins at Twitter after a tumultuous acquisition process, what additional new changes will come to the world’s de facto public square? The world’s richest man has already reinstated certain banned accounts, including that of former president Donald Trump. Trump has made his own foray into the world of conservative social media, as has politically polarizing rapper Ye, formerly Kanye West, currently in the process of purchasing right-wing alternative platform Parler. Ye is no stranger to testing the limits of controversial speech. With Twitter in the hands of Musk, Parler in the process of selling and Trump’s Truth Social sort-of-kind-of forging ahead in spite of false starts, is a new era of conservative social media upon us?

Panelists

  • Mark MacCarthy, Nonresident Senior Fellow in Governance Studies, Center for Technology Innovation, Brookings Institution
  • Mike Masnick, Founder and Editor, Techdirt
  • Randolph May, President, The Free State Foundation
  • Bret Swanson, Nonresident Senior Fellow, American Enterprise Institute
  • Drew Clark (moderator), Editor and Publisher, Broadband Breakfast

Panelist resources:

Mark MacCarthy is a Nonresident Senior Fellow in Governance Studies at the Center for Technology Innovation at Brookings. He is also adjunct professor at Georgetown University in the Graduate School’s Communication, Culture, & Technology Program and in the Philosophy Department. He teaches courses in the governance of emerging technology, AI ethics, privacy, competition policy for tech, content moderation for social media, and the ethics of speech. He is also a Nonresident Senior Fellow in the Institute for Technology Law and Policy at Georgetown Law.

Mike Masnick is the founder and editor of the popular Techdirt blog as well as the founder of the Silicon Valley think tank, the Copia Institute. In both roles, he explores the intersection of technology, innovation, policy, law, civil liberties, and economics. His writings have been cited by Congress and the EU Parliament. According to a Harvard Berkman Center study, his coverage of the SOPA copyright bill made Techdirt the most linked-to media source throughout the course of that debate.

Randolph May is founder and president of The Free State Foundation, an independent, non-profit, free market-oriented think tank founded in 2006. He has practiced communications, administrative, and regulatory law as a partner at major national law firms. From 1978 to 1981, May served as Assistant General Counsel and Associate General Counsel at the Federal Communications Commission. He is a past Chair of the American Bar Association’s Section of Administrative Law and Regulatory Practice.

Bret Swanson is president of the technology research firm Entropy Economics LLC, a nonresident senior fellow at the American Enterprise Institute, a visiting fellow at the Krach Institute for Tech Diplomacy at Purdue University and chairman of the Indiana Public Retirement System (INPRS). He writes the Infonomena newsletter at infonomena.substack.com.

Drew Clark (moderator) is CEO of Breakfast Media LLC, the Editor and Publisher of BroadbandBreakfast.com and a nationally-respected telecommunications attorney. Under the American Recovery and Reinvestment Act of 2009, he served as head of the State Broadband Initiative in Illinois. Now, in light of the 2021 Infrastructure Investment and Jobs Act, attorney Clark helps fiber-based and wireless clients secure funding, identify markets, broker infrastructure and operate in the public right of way.

Social media controversy has centered on Elon Musk’s Twitter, Ye’s new role at Parler, and former U.S. President Donald Trump

WATCH HERE, or on YouTube, Twitter and Facebook.

As with all Broadband Breakfast Live Online events, the FREE webcasts will take place at 12 Noon ET on Wednesday.

SUBSCRIBE to the Broadband Breakfast YouTube channel. That way, you will be notified when events go live. Watch on YouTube, Twitter and Facebook.

See a complete list of upcoming and past Broadband Breakfast Live Online events.

Reporter Em McPhie studied communication design and writing at Washington University in St. Louis, where she was a managing editor for the student newspaper. In addition to agency and freelance marketing experience, she has reported extensively on Section 230, big tech, and rural broadband access. She is a founding board member of Code Open Sesame, an organization that teaches computer programming skills to underprivileged children.

Social Media

Senate Commerce Committee Passes Two Bills To Protect Children Online

The bills failed to make headway in a previous Congress.

Screenshot of Sen. Edward Markey, D-Mass., during the markup Thursday

WASHINGTON, July 27, 2023 – The Senate Commerce Committee on Thursday swiftly passed two pieces of legislation aimed at protecting the safety and privacy of children online, exactly one year after the same bills passed the committee but failed to advance further.

The first bill to clear the committee was the Kids Online Safety Act, which requires social media sites to put in place safeguards protecting users under the age of 17 from content that promotes harmful behaviors, such as suicide and eating disorders. KOSA was first introduced in 2022 by Sen. Richard Blumenthal, D-Conn., and Sen. Marsha Blackburn, R-Tenn. It previously won bipartisan support but ultimately failed to become law.

The current version of the bill was reintroduced in May, gained traction in several hearings, and picked up more than 30 co-sponsors. Several changes were made to the text, including a specific list of online harms and certain exemptions for support services, such as substance abuse groups, that might unintentionally suffer from the bill’s requirements.

The bill was also amended Thursday to include a provision proposed by Sen. John Thune, R-S.D., that would require companies to disclose the use of algorithms for content filtering and give users the choice to opt out.

Critics of the bill, however, said the revised version largely resembled the original and failed to address previously raised concerns, including provisions that would require tech companies to collect more data to filter content and verify user age, as well as potential infringements on children’s free speech.

Sen. Ted Cruz, R-Texas, supported the bill but agreed that more work needs to be done before it moves to the floor. Since the committee’s last markup of KOSA, several states have approved measures on children’s online safety that might be inconsistent with the bill’s provisions, he noted, proposing a preemption provision to ensure the bill would be enforced regardless of state laws.

The Children and Teens’ Online Privacy Protection Act, or COPPA 2.0, introduced by Sen. Edward Markey, D-Mass., and Sen. Bill Cassidy, R-La., was the second bill passed out of the committee. It expands on existing legislation, in effect since 2000, that protects children from harmful marketing. The bill would make it illegal for websites to collect data on children under the age of 16, outlaw marketing specifically aimed at kids, and allow parents to erase their kids’ information from the websites.

“It is time for Congress to meet this moment and to act with the urgency that these issues demand,” said Sen. Markey.

The pair of bills is among many seeking to protect children from online harms, none of which has yet made headway in Congress.

Free Speech

UK’s Online Safety Bill Likely to Impact American User Experience

The bill will affect the tone and content of discussion on U.S.-owned platforms that wish to continue offering UK services.

Screenshot of Amy Peikoff of BitChute

WASHINGTON, July 21, 2023 – The United Kingdom’s Online Safety Bill will impact American-based users’ experience on various platforms, said panelists at a Broadband Breakfast Live Online event Wednesday.

The Online Safety Bill is the UK’s response to concerns about the negative impact of various internet platforms and applications. The core of the bill addresses illegal content and content that is harmful to children. It places a duty of care on internet sites, including social media platforms, search engines, and online shopping centers, to provide risk assessments for their content, prevent access to illegal content, protect privacy, and prevent children from accessing harmful content. 

The legislation would apply to any business with a substantial user base in the UK and could have unforeseen impacts on the end-user experience, said Amy Peikoff, Chief Policy Officer of UK-based video-streaming platform BitChute.

Even though it is not U.S. legislation, it will affect the tone and content of discussion on U.S.-owned platforms that wish to continue offering their services in the jurisdictions where it is enacted, said Peikoff. Already, the European Union’s Digital Services Act is affecting Twitter, which is “throttling its speech” to turn out statistics saying a certain percentage of its content is “healthy,” she claimed.

Large social media companies as we know them are finished, Peikoff said.  

Ofcom, the UK’s communications regulator, will be responsible for providing guidelines and best practices as well as conducting investigations and audits. It will be authorized to seize revenue if a company fails to adhere to the law, and may enact rules requiring companies to provide user data to the agency and/or screen user messages for harmful content.

Peikoff claimed that the legislation could set off a chain of events, “namely, that platforms like BitChute would be required to affirmatively, proactively scan every single piece of content – comments, videos, whatever posted to the platform – and keep a record of any flags.” She added that U.S.-based communication would not be exempt.

Meta-owned WhatsApp, a popular messaging app, has warned that it will exit the UK market if the legislation requires it to release data about its users or screen their messages, claiming that doing so would “compromise” the privacy of all users and threaten the encryption on its platform. 

Matthew Lesh, director of public policy and communications at the UK think tank Institute of Economic Affairs, said that the bill is a “recipe for censorship on an industrial, mechanical scale.” He warned that many companies will choose to simply block UK-based users from using their services, harming UK competitiveness globally and discouraging investors.  

In addition, Lesh highlighted privacy concerns introduced by the legislation. By levying fines on platforms hosting harmful content accessible to children, the bill may push companies to screen for children by requiring users to present government-issued IDs, a major privacy concern for users.

The primary issue with the bill and similar policies, said Lesh, is that they apply the same moderation requirements to all online platforms, which can limit certain speech and stifle healthy discussion and interaction across political lines.

The bill is currently in the final committee stage in the House of Lords, the UK’s second chamber of parliament. Following its passage there, the bill will return to the House of Commons, where it will either be amended or accepted and become law. General support in the UK’s parliament suggests the bill will be implemented sometime next year.

This follows considerable debate in the United States over content moderation, much of it centered on possible reform of Section 230. Section 230 protects platforms from being treated as the publisher or speaker of information originating from third parties, shielding them from liability for those posts.

Wednesday, July 19, 2023 – The UK’s Online Safety Bill

The UK’s Online Safety Bill, which seeks to make the country “the safest place in the world to be online,” has seen as much upheaval as the nation itself in the last four years. Four prime ministers, one Brexit and one pandemic later, it’s just a matter of time until the bill finally passes the House of Lords and eventually becomes law. Several tech companies, including WhatsApp, Signal and Wikipedia, have argued against its age limitations and breach of end-to-end encryption. Will this legislation serve as a model for governments worldwide to regulate online harms? What does it mean for the future of U.S. social media platforms?

Panelists

  • Amy Peikoff, Chief Policy Officer, BitChute
  • Matthew Lesh, Director of Public Policy and Communications, Institute of Economic Affairs
  • Drew Clark (moderator), Editor and Publisher, Broadband Breakfast

Panelist resources

Amy Peikoff is Chief Policy Officer for BitChute. She holds a BS in Math/Applied Science and a JD from UCLA, as well as a PhD in Philosophy from University of Southern California, and has focused in her academic work and legal activism on issues related to the proper legal protection of privacy. In 2020, she became Chief Policy Officer for the free speech social media platform, Parler, where she served until Parler was purchased in April 2023.

Matthew Lesh is the Director of Public Policy and Communications at the Institute of Economic Affairs. Matthew often appears on television and radio, is a columnist for London’s CityAM newspaper, and a regular writer for publications such as The Times, The Telegraph and The Spectator. He is also a Fellow of the Adam Smith Institute and the Institute of Public Affairs.

Drew Clark is CEO of Breakfast Media LLC. He has led the Broadband Breakfast community since 2008. An early proponent of better broadband, better lives, he initially founded the Broadband Census crowdsourcing campaign for broadband data. As Editor and Publisher, Clark presides over the leading media company advocating for higher-capacity internet everywhere through topical, timely and intelligent coverage. Clark also served as head of the Partnership for a Connected Illinois, a state broadband initiative.

Illustration from the Spectator

Free Speech

New Tool Measures Economic Impact of Internet Shutdowns

The calculator is being called a ‘major step forward’ for those pushing back against such shutdowns.

Photo of a protest in Frankfurt, Germany by M K

July 10, 2023 – NetLoss, a new measuring tool launched by the Internet Society, shows the impact of internet shutdowns on economies including those of Iraq, Sudan and Pakistan, where government-mandated outages have cost millions of dollars in a matter of hours or days.

NetLoss, launched on June 28, calculated that a four-hour shutdown in Iraq in July, implemented by the government to prevent cheating during high school exam season, resulted in an estimated loss of $1.6 million. In May, a shutdown in Pakistan cost more than $13 million over the span of four days, while a five-day internet outage in Sudan in April cost the economy more than $4 million and resulted in the loss of 560 jobs.

NetLoss is unique among internet assessment tools in that it also includes subsequent economic impacts on the unemployment rate, foreign direct investment, and the risk of future shutdowns, claimed the advocacy group Internet Society. It provides data on both ongoing and anticipated shutdowns, drawing from a historical dataset of more than 90 countries dating back to 2019.

“The calculator is a major step forward for the community of journalists, policymakers, technologists and other stakeholders who are pushing back against the damaging practice of Internet shutdowns,” said Andrew Sullivan, CEO of the Internet Society. “Its groundbreaking and fully transparent methodology will help show governments around the world that shutting down the Internet is never a solution.”

The tool relies on open-access databases, including the Internet Society Pulse’s Shutdown data, the World Bank’s economic indicators, the Armed Conflict Location and Event Data Project’s civil unrest data, Yale University’s election data, and other relevant socioeconomic factors. To stay up to date with real-time changes, the data will be updated quarterly.

According to the press release, internet shutdowns worldwide peaked in 2022 with governments increasingly blocking internet services due to concerns over civil unrest or cybersecurity threats. These disruptions are extremely damaging to the economy, read the document, as they impede online commercial activities and expose companies and the economy to financial and reputational risks.
