Big Tech

Broadband Roundup: FTC Against Public Utility Broadband Regulation, Holder Slaps Tech Industry, and FCC Video News

WASHINGTON, October 2, 2014 – Federal Trade Commissioner Maureen Ohlhausen warned that reclassifying broadband as a public utility under Title II of the Communications Act would put internet service providers beyond the legal reach of the FTC, the Washington Post reported. The item was previously reported in Broadband Breakfast. Currently, the FTC cannot bring actions against common carriers, and Title II reclassification would place many more companies under that regime.

Ohlhausen said she is less worried about the loss of FTC power than about consumers’ loss of FTC protection. While Ohlhausen and the FTC have not explicitly stated how best to protect the open internet, her comments are a testament to the Commission’s conviction in the power of antitrust law to oversee internet service providers.

Holder Slaps Tech Sector on Mobile Device Encryption

Departing Attorney General Eric Holder said that new forms of device encryption, including those expected to be included in Apple’s iOS 8 and Google’s Android L, could put children at increased risk, the Washington Post reported. Holder warned that the inability to decrypt mobile device data could jeopardize investigations into time-sensitive crimes like kidnapping.

“It is fully possible to permit law enforcement to do its job while still adequately protecting personal privacy,” Holder said. “When a child is in danger, law enforcement needs to be able to take every legally available step to quickly find and protect the child and to stop those that abuse children. It is worrisome to see companies thwarting our ability to do so.”

Many privacy advocates claim that companies like Google and Apple are simply adding the same type of encryption that has historically been found in personal computers. Many individuals now use their personal mobile devices as their “computer.”

FCC Overturns Sports Blackout Rule, Wants Comcast Programming Contracts

In a unanimous decision, the Federal Communications Commission on Tuesday announced that it would repeal the sports blackout rule, which commissioners called “outdated.” The rule prohibited satellite and cable operators from airing sports events that were blacked out on local broadcast stations, often preventing local consumers from watching their teams’ games.

Separately, the FCC is asking media companies for their programming agreements with Comcast in order to aid in its review of the cable company’s proposed merger with Time Warner Cable. The information in the agreements, along with additional documents and data pertaining to the deal negotiations, would allow the Commission to assess the leverage the combined cable company would have over its media partners, the agency said.

However, the agency has received pushback from media companies like CBS, 21st Century Fox, Disney, and Viacom. They say their programming contracts contain “extremely sensitive business data and information, and highly proprietary and scrupulously protected terms and conditions,” reported The Wall Street Journal.

Antitrust

FTC Chair Warns Artificial Intelligence Industry of Vigorous Enforcement

The FTC’s statute on consumer protection that ‘prohibits unfair deceptive practices’ extends to AI, said Khan.

WASHINGTON, October 2, 2023 – The chair of the Federal Trade Commission warned the artificial intelligence industry Wednesday that the agency is prepared to clamp down on any monopolistic practices, even as she proposed simpler, clearer rules for market participants.

“We’re really firing on all cylinders to make sure that we’re meeting the moment and the enormous and urgent need for robust and vigorous enforcement,” Lina Khan said at the AI and Tech Summit hosted by Politico on Wednesday.

Khan emphasized that the FTC’s consumer protection statute “prohibits unfair deceptive practices” and that the provision extends to AI development.

The comments come as artificial intelligence products advance at a brisk pace. The advent of new chatbots – such as those from OpenAI and Google that are driven by the latest advances in large language models – means individuals can use AI to create content from basic text prompts.

Khan said that working with Congress to bring “more simplicity in rules” to all businesses and market participants could promote a more level playing field for competitors.

“It’s no secret that there are defendants that are pushing certain arguments about the FTC’s authority,” Khan said. “Historically we’ve seen that the rules that are most successful oftentimes are ones that are clear and that are simple and so a regime where you have bright line rules about what practices are permitted, what practices are prohibited, I think could provide a lot more clarity and also be much more administrable.”

Khan’s comments came the day before the agency and 17 states filed an antitrust lawsuit against Amazon, accusing the e-commerce giant of using anticompetitive practices and unfair strategies to sustain its dominance in online commerce.

“Obviously we don’t take on these cases lightly,” Khan said. “They are very resource intensive for us and so we think it’s a worthwhile use of those resources given just the significance of this market, the significance of online commerce, and the degree to which the public is being harmed and being deprived of the benefits of competition.”

Since being sworn in in 2021, Khan has led an FTC that has filed lawsuits against tech giants Meta, Microsoft, and X, formerly known as Twitter.

Social Media

Senate Commerce Committee Passes Two Bills To Protect Children Online

The bills failed to make headway in a previous Congress.

Screenshot of Sen. Edward Markey, D-Mass., during the markup Thursday

WASHINGTON, July 27, 2023 – The Senate Commerce Committee on Thursday swiftly passed two pieces of legislation aimed at protecting the safety and privacy of children online, exactly one year after the same bills passed the committee but failed to advance further.

The first bill to clear the committee was the Kids Online Safety Act, which would require social media sites to put in place safeguards protecting users under the age of 17 from content that promotes harmful behaviors, such as suicide and eating disorders. KOSA was first introduced in 2022 by Sen. Richard Blumenthal, D-Conn., and Sen. Marsha Blackburn, R-Tenn. It previously won bipartisan support but ultimately failed to become law.

The current version of the bill was reintroduced in May, gained traction in several hearings, and picked up more than 30 co-sponsors. Several changes were made to the text, including a specific list of online harms and exemptions for support services, such as substance abuse groups, that might otherwise be unintentionally burdened by the bill’s requirements.

The bill was also amended Thursday to include a provision proposed by Sen. John Thune, R-S.D., that would require companies to disclose their use of algorithms for content filtering and give users the choice to opt out.

Critics of the bill, however, said the revised version largely resembled the original and failed to address concerns raised earlier, including sections that would require tech companies to collect more data to filter content and verify user age, as well as potential infringement on children’s free speech.

Sen. Ted Cruz, R-Texas, supported the bill but agreed that more work needs to be done before it moves to the floor. Since the committee’s last markup of KOSA, several states have approved measures concerning children’s online safety that might be inconsistent with the bill’s provisions, he noted, proposing a preemption provision to ensure the bill would be enforced consistently regardless of state laws.

The Children and Teens’ Online Privacy Protection Act, or COPPA 2.0, introduced by Sens. Edward Markey, D-Mass., and Bill Cassidy, R-La., was the second bill passed out of the committee. It expands on existing legislation, in effect since 2000, that protects children from harmful marketing. The bill would make it illegal for websites to collect data on children under the age of 16, outlaw marketing aimed specifically at kids, and allow parents to erase their children’s information from websites.

“It is time for Congress to meet this moment and to act with the urgency that these issues demand,” said Sen. Markey.

The bills are among many pieces of legislation that seek to protect children from online harms, none of which has made headway in Congress so far.

Free Speech

UK’s Online Safety Bill Likely to Impact American User Experience

The bill will affect the tone and content of discussion on U.S.-owned platforms that wish to continue offering UK services.

Screenshot of Amy Peikoff of BitChute

WASHINGTON, July 21, 2023 – The United Kingdom’s Online Safety Bill will impact American users’ experience on various platforms, said panelists at a Broadband Breakfast Live Online event Wednesday.

The Online Safety Bill is the UK’s response to concerns about the negative impact of various internet platforms and applications. The core of the bill addresses illegal content and content that is harmful to children. It places a duty of care on internet sites – including social media platforms, search engines, and online marketplaces – to provide risk assessments for their content, prevent access to illegal content, protect privacy, and prevent children from accessing harmful content.

The legislation would apply to any business with a substantial user base in the UK, with unforeseen impacts on the end-user experience, said Amy Peikoff, chief policy officer of the UK-based video-streaming platform BitChute.

Even though it is not U.S. legislation, it will affect the tone and content of discussion on U.S.-owned platforms that wish to continue offering their services in the jurisdictions where the legislation is enacted, said Peikoff. Already, the European Union’s Digital Services Act is affecting Twitter, which is “throttling its speech” to produce statistics showing a certain percentage of its content is “healthy,” she claimed.

Large social media companies as we know them are finished, Peikoff said.  

Ofcom, the UK’s communications regulator, will be responsible for providing guidelines and best practices, as well as conducting investigations and audits. It will be authorized to levy revenue-based fines on companies that fail to adhere to the law, and it may enact rules requiring companies to provide user data to the agency or to screen user messages for harmful content.

Peikoff claimed that the legislation could set off a chain of events, “namely, that platforms like BitChute would be required to affirmatively, proactively scan every single piece of content – comments, videos, whatever posted to the platform – and keep a record of any flags.” She added that U.S.-based communication would not be exempt.

Meta-owned WhatsApp, a popular messaging app, has warned that it will exit the UK market if the legislation requires it to release data about its users or screen their messages, claiming that doing so would “compromise” the privacy of all users and threaten the encryption on its platform. 

Matthew Lesh, director of public policy and communications at the UK think tank Institute of Economic Affairs, said that the bill is a “recipe for censorship on an industrial, mechanical scale.” He warned that many companies will choose to simply block UK-based users from using their services, harming UK competitiveness globally and discouraging investors.  

In addition, Lesh highlighted privacy concerns introduced by the legislation. Because it levies fines on platforms that host harmful content accessible by children, companies may have to screen for children by requiring users to present government-issued IDs – a major privacy concern for users.

The primary issue with the bill and similar policies, said Lesh, is that they apply the same moderation requirements to all online platforms, which can limit certain speech and stifle healthy discussion and interaction across political lines.

The bill is currently in the final stages of committee consideration in the House of Lords, the second chamber of the UK’s parliament. It will then return to the House of Commons, where the Lords’ amendments will either be accepted or revised before the bill becomes law. General support for the bill in the UK’s parliament suggests that it will be implemented sometime next year.

This follows considerable debate in the United States regarding content moderation, much of it centered on possible reform of Section 230. That provision protects platforms from being treated as the publisher or speaker of information originating from a third party, shielding them from liability for third-party posts.

Our Broadband Breakfast Live Online events take place on Wednesday at 12 Noon ET. Watch the event on Broadband Breakfast, or REGISTER HERE to join the conversation.

Wednesday, July 19, 2023 – The UK’s Online Safety Bill

The UK’s Online Safety Bill, which seeks to make the country “the safest place in the world to be online,” has seen as much upheaval as the nation itself in the last four years. Four prime ministers, one Brexit and one pandemic later, it’s just a matter of time until the bill finally passes the House of Lords and eventually becomes law. Several tech companies, including WhatsApp, Signal, and Wikipedia, have argued against its age-limitation requirements and its breach of end-to-end encryption. Will this legislation serve as a model for governments worldwide to regulate online harms? What does it mean for the future of U.S. social media platforms?

Panelists

  • Amy Peikoff, Chief Policy Officer, BitChute
  • Matthew Lesh, Director of Public Policy and Communications, Institute of Economic Affairs
  • Drew Clark (moderator), Editor and Publisher, Broadband Breakfast

Panelist resources

Amy Peikoff is Chief Policy Officer for BitChute. She holds a BS in Math/Applied Science and a JD from UCLA, as well as a PhD in Philosophy from University of Southern California, and has focused in her academic work and legal activism on issues related to the proper legal protection of privacy. In 2020, she became Chief Policy Officer for the free speech social media platform, Parler, where she served until Parler was purchased in April 2023.

Matthew Lesh is the Director of Public Policy and Communications at the Institute of Economic Affairs. Matthew often appears on television and radio, is a columnist for London’s CityAM newspaper, and a regular writer for publications such as The Times, The Telegraph and The Spectator. He is also a Fellow of the Adam Smith Institute and Institute of Public Affairs.

Drew Clark is CEO of Breakfast Media LLC. He has led the Broadband Breakfast community since 2008. An early proponent of better broadband, better lives, he initially founded the Broadband Census crowdsourcing campaign for broadband data. As Editor and Publisher, Clark presides over the leading media company advocating for higher-capacity internet everywhere through topical, timely and intelligent coverage. Clark also served as head of the Partnership for a Connected Illinois, a state broadband initiative.

Illustration from the Spectator

WATCH HERE, or on YouTube, Twitter and Facebook.

As with all Broadband Breakfast Live Online events, the FREE webcasts will take place at 12 Noon ET on Wednesday.

SUBSCRIBE to the Broadband Breakfast YouTube channel. That way, you will be notified when events go live. Watch on YouTube, Twitter and Facebook.

See a complete list of upcoming and past Broadband Breakfast Live Online events.
