
FCC Net Neutrality Workshop Examines Importance of Transparency


WASHINGTON, January 21, 2010 – At a workshop on Tuesday, the Federal Communications Commission explored consumer choice, users’ control of their online experience and the importance of transparency.

The agency’s fourth Open Internet workshop, on “Consumers, Transparency and the Open Internet,” focused on the sixth principle of network neutrality: transparency.

That principle states: “subject to reasonable network management, a provider of broadband internet access service must disclose such information concerning network management and other practices as is reasonably required for users and content, application, and service providers to enjoy the protections specified in this part.”

In opening remarks, FCC Chairman Julius Genachowski said the sixth principle is the most important because it gives consumers the information they need to make the best decisions, and it reduces government involvement in disputes by making more information directly and publicly available.

“When we talk about internet consumers and users, we mean not only an individual consumer subscribing to a fixed or mobile broadband service, but also an engineer in a garage or at a start-up company who is developing and deploying a new application over the internet,” Genachowski said.

Approximately 120,000 people and organizations submitted filings on the FCC’s notice of proposed rulemaking on its “open Internet” policy.

Workshop moderators – Joel Gurin, chief of the FCC’s Consumer and Governmental Affairs Bureau, and Julius Knapp, chief of the agency’s Office of Engineering and Technology – framed the discussion by asking what kind of information consumers might need about network management practices, what tools they have, how the information should be presented and what role the public should play in disclosing policies and practices.

One commenter said: “A large and diverse group believe that transparency can go far to preserve the Internet’s openness.” He commended the variety of stakeholders that weighed in with positive comments on transparency and highlighted the constructive joint filing of Google and Verizon.

FCC Commissioner Michael Copps said he was happy to see the FCC returning to its role as a consumer protection agency.

Commissioner Mignon Clyburn asked whether service providers are offering consumers the services and plans most beneficial to them.  She was not convinced that consumers, content providers and application developers receive the information they need.  She raised concerns about actual speeds being a fraction of advertised speeds and about service providers not disclosing when they block internet traffic.

“Once there is disclosure of all plans and services, only then can there be a thriving marketplace,” she said.

Commissioner Meredith Attwell Baker said that “while there was divergence in the best way to keep the internet open, transparency is a path of more common ground.”

Federal Trade Commission Chairman Jon Leibowitz described the need for FCC involvement in promoting transparency and nondiscrimination.

“Transparency and an open internet are critical for consumers and innovation,” he said, but cautioned that “absent effort by the FCC, the open internet will not be a given.”

He reiterated that transparency and disclosure will enable consumers to pick affordable technology that fits their needs.

Konrad von Finckenstein of the Canadian Radio-television and Telecommunications Commission spoke about the experience that led to the CRTC’s decision in October to create net neutrality rules based partly on the FCC’s principles.

Von Finckenstein distinguished two kinds of network management practices: economic practices based on regulating usage, and technical practices such as deep packet inspection, or DPI.

If the CRTC receives a complaint about a service provider, it assesses the complaint against the following criteria:

  • Is the network management technique designed to prevent a certain practice?
  • Is it narrowly tailored?
  • Does it cause as little harm as possible to the customer?
  • Is there any less intrusive way to achieve the same result?
  • In addition, the technique must be advertised 30 days in advance and explained so that the average customer can understand it: why it will be used, when it will occur, what type of traffic is affected and how it will affect the user’s internet experience, including the effect on speed.

If the service provider knows that content will be degraded, the company must obtain prior approval from the commission.

Von Finckenstein added that Canada also requires that data gathered in administering a network management practice be destroyed as soon as it is no longer needed, in order to preserve customer privacy.

Sascha Meinrath, director of the Open Technology Initiative at the New America Foundation, said that since the onset of network privatization in 1995, there has been a steady removal of information and data from the public domain.

“Data collection and transparency has disappeared…willful ignorance has led to bad policy actions and inaction…disempowered users allowed for a dysfunctional market,” he said.

He believes the FCC has a responsibility to fix this problem, and he pointed the agency and others to New America’s Measurement Lab, a platform for deploying internet research tools.

Jay Monahan, vice president and general counsel of Vuze, said that if a service provider were to block peer-to-peer traffic or label it “non-time-sensitive,” that would seriously affect his legitimate business interests.  Without full disclosure from providers, he said, consumers assume a problem is his company’s fault rather than the provider’s.

Monahan believes that transparency must be backed by network management principles and any technique used to slow or delay network traffic must be disclosed.

Parul Desai, vice president of the Media Access Project, argued that transparency rules are necessary because disclosure of network management practices is critical to letting users and innovators form realistic expectations of their internet experience.

She said disclosure is necessary to determine whether a particular management practice is designed to address legitimate congestion and traffic management issues.

“We need clear and conspicuous rules,” said Desai, adding that the CRTC model for addressing consumer complaints is a good model to follow.

Ron Dicklin, co-founder and CTO of Root Wireless, said consumers want to make educated decisions about their wireless services.  He added that most of today’s frustration is due to mismatched expectations.

Former FCC Chief Economist Gerald Faulhaber reminded the agency that the current market is consumer-centric, with consumers driving decisions.  He explained that consumer-centric markets have three features: competition, transparency and judicious antitrust protection.

Faulhaber added that “successful competition requires transparency from all sides of the market including ISPs, application and content providers and backbone developers.”  He warned that information asymmetry could lead to market failure, and that regulation would be required to ensure transparency if a market failure occurs.

Faulhaber defined transparency as credible information immediately available at the time of purchase: it must be easy to understand and not buried on a website or label.  He likened the form of the information to a nutrition label or a Food and Drug Administration prescription label.

Nicholas Weaver of the International Computer Science Institute at the University of California, Berkeley, demonstrated the broadband survey tool Netalyzr, which helps individuals learn more about their networks.  He compared the results of tests he ran at a Starbucks with tests run at the commission.  The results tell users about bugs, latency, bandwidth, buffering and other potential issues with the network.

David Young, vice president of regulatory affairs at Verizon, agreed with other panelists that robust transparency and disclosure are essential to improving the end user’s experience.

He said Netalyzr is a great, detailed tool, but simpler tools are available too.  Third-party evaluations that could compare services side by side would be useful, he added, but no one has yet performed a real technical analysis.

Young suggested that the North American Network Operators Group should be tasked with creating a sounding board for the best practices when it comes to disclosure and network management.  He thinks that NANOG and not the FCC should determine what is best to disclose to consumers and how to disclose it.

Network operator executive Fernando Laguarda echoed the idea that a company’s success depends on customer satisfaction.  Laguarda, a vice president with Time Warner Cable, said Time Warner gives its customers very clear information regarding its plans, billing and termination policies; the company also provides web assistance, local offices and 24-hour telephone service centers.  Laguarda does not believe that the transparency principle needs to be codified.

Laguarda cautioned that if the agency did mandate disclosure, it should allow for flexibility and give users as much information as possible.

He said there has been too much focus on disclosure as such, rather than on getting useful information to customers.

“There are unfortunate consequences if the companies make too much detail available,” he said.

Joel Kelsey, policy analyst for Consumers Union, said he was pleased with the FCC’s comprehensive approach toward improving wireline and wireless disclosure.

Kelsey said disclosure of network management practices is particularly important in giving consumers an accurate representation of the internet service they can expect, and in ensuring that network management practices are narrowly tailored to address a legitimate purpose without interfering with consumer access to a best-efforts network.

“As a matter of good consumer disclosure policy, the FCC should stop ISPs from describing binding terms and conditions within a multi-page legal document in eight-point font,” he said.

He also stressed the need for clarity over detail: “Clarity of information on network practices is a function of information design.”

Kelsey said consumers have the ability to learn and understand more sophisticated terms (such as octane levels, caloric intake or credit scores) as soon as the government provides an industry standard and mandates consistent disclosure of information.

Kelsey ended his testimony with examples of what a government mandate for meaningful network management disclosure should require:

1. Any limits imposed on a subscriber’s upstream or downstream traffic, including blocking, delaying, de-prioritizing or prioritizing traffic, or inserting traffic into the stream;

2. Technical details of the methods used;

3. Thresholds that trigger certain network management practices, an estimate of the percentage of users affected, and the duration of the practice. Examples include time of day, network congestion levels and user bandwidth consumption;

4. Any technology that inspects the content of Internet traffic, other than the processing of basic addressing information;

5. Differences in how the network is being allocated to different uses, including “managed services.” This includes the amount of capacity dedicated to internet traffic and, if capacity is shared, how it is shared.

The panelists then tackled issues surrounding the concept of “transparency.”

Weaver stressed the point that there should be two levels of disclosure, “a high level and a lower level.”

Meinrath said: “Security through obscurity is a great way to undermine the security of networks.”  He believes there is almost nothing that should not be disclosed.  Weaver disagreed, saying he did not believe ISPs should be forced to expose certain detailed algorithms and techniques.

Kelsey asked the agency to focus on disclosure, saying that what counts as reasonable versus unreasonable network management could differ between wireless and wireline providers.  Kelsey also agreed with the idea of having two levels of disclosure.

There was disagreement over when, and whether, deep packet inspection should be used.

Weaver said DPI is more appropriate on some networks than on others, adding that it is acceptable for an ISP to use DPI to cut out spam.  Meinrath countered that DPI is almost never a good idea: until we can define what spam is, he said, he does not want anyone else filtering his e-mail.

Faulhaber stated that throughout the history of telecom, networks have been subject to variability.  He would like to see providers release information that says “in your neighborhood, at your service tier, customers have received at least X levels of service during the busiest hours of the week.”

Dicklin and Desai liked the idea of progressive disclosure where consumers can dive into more detailed layers if they want to.

Young and Laguarda spoke for the providers and agreed on the need for collaboration to come up with a set of best practices.

Finally, the panelists were asked how they should think about wireless networks in terms of disclosure: Can the labels look the same?

Young, speaking for Verizon, said consumers should expect similar disclosure principles for wireline and wireless.  Dicklin added that a distinction needs to be made between fixed and mobile wireless, since expectations for mobile are different.

Kelsey pointed out that some major issues with wireless providers are blocked costs and switching fees, all of which must be clearly disclosed to consumers before purchase.

Weaver felt that for wireless providers, usage-based pricing should come with a default total cost per month, and when that ceiling is exceeded, service should be cut off instead of running up the bill.

Meinrath ended by saying that consumer empowerment can only be achieved through government oversight.  That oversight must include disclosure and documentation of real speeds and practices, and providers must be held accountable for the information they provide.

As Deputy Editor, Chris Naoum is curating expert opinions, and writing and editing articles on Broadband Breakfast issue areas. Chris served as Policy Counsel for Future of Music Coalition, Legal Research Fellow for the Benton Foundation and law clerk for a media company, and previously worked as a legal clerk in the office of Federal Communications Commissioner Jonathan Adelstein. He received his B.A. from Emory University and his J.D. and M.A. in Television Radio and Film Policy from Syracuse University.


New Broadband Mapping Fabric Will Help Unify Geocoding Across the Broadband Industry, Experts Say

Tim White


Photo of Lynn Follansbee from October 2019 by Drew Clark

March 11, 2021 – The Federal Communications Commission’s new “fabric” for mapping broadband service across America will not only help collect more accurate data, but also unify geocoding across the broadband industry, industry experts said during a Federal Communications Bar Association webinar Thursday.

Broadband service providers are not geocoding experts, said Lynn Follansbee of USTelecom, and they do not know where all the people are.

The new fabric dataset will be very useful for getting a granular look at what is and is not served, and for harmonizing geocoding, she said.

AT&T’s Mary Henze agreed. “We’re a broadband provider, we’re not a GIS company,” she said. A unified geocode across the whole field will go a long way toward finding missing spots in the company’s service area, she said.

The new Digital Opportunity Data Collection fabric is a major shift from the current Form 477 data that the FCC collects, which has been notoriously inaccurate. Efforts to improve broadband mapping have been ongoing for years, and in 2019 USTelecom, in partnership with CostQuest and other industry partners, created the fabric pilot program.

That pilot was instrumental in leading to the new FCC system, panelists said. It is called a “fabric” dataset because it is made up of other datasets that interlace like fabric, Follansbee explained.

The fabric brings new challenges, especially for mobile providers, said Chris Wieczorek of T-Mobile. A whole new set of reporting criteria for filling out the fabric will mean confusion for consumers and lots of work for the new task force, he said.

Henze said that without the fabric, closing the digital divide between those with broadband internet and those without has been impossible.

Digital Opportunity Data Collection expected to help better map rural areas

The new mapping can help in rural areas where the current geolocation for a resident may be a mailbox that is several hundred feet or farther away from the actual house that needs service, Follansbee said.

Rural areas aren’t the only places that will benefit, though. It can also help in dense urban areas where vertical location in a residential building is important to getting a good connection, said Wieczorek.

The fabric will also help from a financial perspective, because of the large amount of funding going around, said Charter Communications’ Christine Sanquist. The improved mapping can help identify where best to spend that funding for federal agencies, providers, and local governments, she said.

There is now more than $10 billion in new federal funding for broadband-related projects, including the $3.2 billion Emergency Broadband Benefit program created by the Consolidated Appropriations Act in December 2020 and the new $7.6 billion Emergency Connectivity Fund that is part of the American Rescue Plan, which President Joe Biden signed into law Thursday.

The new FCC task force for implementing the mapping system was created in February 2021 and is led by Jean Kiddoo at the FCC. No specific dates have been set for getting the system operational.



GOP Grills FCC on Improving Broadband Mapping Now, as Agency Spells Out New Rules

Tim White


Photo of former FCC Chairman Ajit Pai speaking at the March 2019 launch of USTelecom’s mapping initiative, by Drew Clark

March 11, 2021 – Federal Communications Commission Acting Chairwoman Jessica Rosenworcel has changed her stance on the timeline for updating the FCC’s broadband mapping data, and several House and Senate Republicans are wondering why.

“On March 10, 2020, you testified before the Senate Appropriations Committee’s Subcommittee on Financial Services and General Government that the FCC could ‘radically improve’ its broadband maps ‘within three-to-six months,’” read the letter, sent Monday to Rosenworcel from the GOP delegation.

“You repeated that statement the next day, testifying before the House Appropriations Committee’s FSGG Subcommittee that the agency could fix its maps in ‘just a few months.’”

“You can imagine our surprise and disappointment when the FCC recently suggested the new maps would not be ready until 2022,” the letter read, referring to the FCC’s open meeting on February 17, 2021.

“The United States faces a persistent digital divide. The pandemic has made connectivity more important than ever, yet millions of Americans continue to live without high-speed broadband. Any delay in creating new maps would delay funding opportunities for unserved households,” the letter read.

The letter requests Rosenworcel’s response by March 22, 2021, addressing why she changed the timeline, details of the timeline for developing the new maps, and how the FCC plans to spend the $98 million in funding for the updated mapping provided by the Consolidated Appropriations Act passed in December 2020, among other questions.

Digital Opportunity Data Collection order spells out rules for mapping

On January 19, 2021, as the final order before FCC Chairman Ajit Pai left his position, the FCC announced new rules for mobile and fixed broadband providers to submit data.

The agency began collecting data from service providers in 1996 with the Telecommunications Act, and at that time considered broadband connection speed to be at least 200 kilobits per second (Kbps).

While internet speeds have greatly improved since then, the January 19 order still uses the 200 Kbps speed as one benchmark measurement 25 years later.

The fact that many Americans still lack access to modern, high-speed broadband has become increasingly apparent during the COVID-19 pandemic, as many children lack a consistent connection to the internet for remote learning.

Improving broadband mapping has been a major challenge for the FCC for several years. Since the Telecommunications Act became law and the commission began gathering data on its Form 477, further efforts have been made to improve that data, including the National Broadband Plan in 2010 and the National Broadband Map in 2011, but many say the maps still need considerable work.

In August 2019, the FCC launched this new mapping initiative, dubbed “Digital Opportunity Data Collection.” It shifts how the agency gathers data from service providers using Form 477: providers will now be required to submit more granular information.

Then, in March 2020, Congress passed the Broadband Deployment Accuracy and Technological Availability (DATA) Act into law, further improving the way the FCC collects broadband mapping data. But it was not until the consolidated appropriations bill in December that Congress appropriated funds for the mapping effort.

New order returns to August 2019 principles

Under the new order, fixed broadband providers must submit data for services offered, specifying if they are for residents and/or businesses.

The order states: “This represents a change from the Commission’s proposal in the Second Order and Third Further Notice to collect data separately on residential and on business-and-residential offerings. We find that the approach we adopt will provide us with a more complete picture of the state of broadband deployment.”

Data for non-mass-market services do not need to be filed because, the FCC says, they fall outside the scope of the Broadband DATA Act. Services whose data will not need to be collected include those purchased by hospitals, schools, libraries, government entities and other enterprise customers.

The order requires providers to report connection speeds for broadband internet access. The FCC considers a download speed of at least 25 megabits per second (Mbps) and an upload speed of at least 3 Mbps to be “advanced telecommunications capability.” That matches the speed threshold used on Form 477 since 2015.

Companies must report the maximum advertised speeds in the geographic area if they’re higher than 25/3 Mbps. Although the median fixed broadband speed is much higher than that across America, as reported by Ookla for the fourth quarter of 2020, millions of Americans still lack quality access to the internet.

When providers report their speeds to the FCC under the new order, they must place the connection speed into one of two tiers if it falls below the 25/3 Mbps threshold. The first tier covers speeds between 200 Kbps and 10/1 Mbps, and the second tier covers speeds between 10/1 Mbps and 25/3 Mbps.
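
To make the tiering concrete, here is a minimal sketch in Python of how that reporting rule could be applied to an advertised speed. It is illustrative only: the function name and the handling of boundary values are assumptions, not the FCC’s actual filing logic.

```python
# Hypothetical sketch of the order's two reporting tiers below 25/3 Mbps.
# Not the FCC's filing software; boundary handling here is an assumption.

def speed_tier(down_mbps: float, up_mbps: float) -> str:
    """Classify an advertised speed per the order's reporting tiers."""
    if down_mbps >= 25 and up_mbps >= 3:
        return "at or above 25/3 Mbps: report maximum advertised speed"
    if down_mbps >= 10 and up_mbps >= 1:
        return "tier 2: between 10/1 Mbps and 25/3 Mbps"
    if down_mbps >= 0.2:  # 200 Kbps, the 1996-era benchmark
        return "tier 1: between 200 Kbps and 10/1 Mbps"
    return "below 200 Kbps: not broadband under the original benchmark"

print(speed_tier(12.0, 2.0))  # tier 2
print(speed_tier(5.0, 0.8))   # tier 1
```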

With the new order, fixed wireless providers that submit propagation maps are now required to also submit geographic coordinates—latitude and longitude—for their base stations that provide broadband to their consumers.

Previously, providers were required to submit data only on the spectrum used, the height of the base station and the type of radio technology. The order explains that also verifying the geographic coordinates of base stations will allow for more accurate mapping. Because geographic coordinates may be sensitive “for business or national security reasons,” the FCC will treat this new data as presumptively confidential.

Latency and signal strength information now required

The new order requires fixed broadband access providers to submit information on latency in their semiannual Digital Opportunity Data Collection filing. The information must detail whether the network round-trip latency for the maximum speed offered in a geographic area is at or below 100 milliseconds.

The agency chose the 100-millisecond threshold because it aligns with the requirement for the Connect America Fund Phase II program, which subsidizes companies that provide broadband access in unserved areas.

Mobile broadband providers are now required to submit signal-strength “heat maps” showing reference signal received power (RSRP) and received signal strength indicator (RSSI), two metrics for measuring 4G LTE and 5G mobile signal strength.

Covering only outdoor strength, the maps must include data for both pedestrians and drivers. Mobile providers must also submit 3G maps for areas without access to 4G or 5G connections. Due to various factors that affect signal strength, the FCC has not set a floor for minimum signal strength.

Additionally, all mobile and fixed broadband providers must have each submission certified for accuracy by a qualified engineer, in addition to the corporate officer certification. The engineer must be employed by the service provider and be directly responsible for, or have knowledge of, the submitted maps.

FCC verification processes, and the deployment of a broadband fabric

The order permits the FCC’s Office of Economics and Analytics and its Wireless Telecommunications Bureau to request additional information from mobile service providers – either infrastructure information or on-the-ground test data – to verify coverage in the areas where it is claimed. Companies must respond within 60 days of the request.

The order also directs OEA to verify mobile on-the-ground data submitted by state, local, and Tribal government entities that are responsible for mapping broadband service coverage. It also permits OEA to similarly verify data from third parties if that data is in the public interest for developing the coverage maps or to verify other data as submitted by providers.

The order also adopted a previous suggestion to implement systems for consumers, governmental or other entities to challenge coverage maps for both fixed broadband and mobile connections, disputing the data submitted by providers.

USTelecom and WISPA, trade associations representing telecom and wireless providers in the United States, have been working with CostQuest Associates on a “fabric” mapping system for years. The CostQuest system touts considerable improvement over the FCC’s current broadband mapping, and the fabric is based on granular, address-level data.

In this new order, the FCC took the first steps toward implementing such a system by adopting a definition of a “location” as a residential or business location at which fixed broadband access service is or can be installed, identified using geographic coordinates.
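
As a rough illustration of that definition, a coordinate-based location record might look like the sketch below. The field names and structure are assumptions for illustration; the order does not prescribe a data schema.

```python
# Illustrative only: field names are assumptions, not an FCC schema.
from dataclasses import dataclass

@dataclass
class BroadbandLocation:
    latitude: float    # geographic coordinates identify the location,
    longitude: float   # rather than a street address
    residential: bool  # a residential and/or business location ...
    business: bool
    serviceable: bool  # ... at which fixed service "is or can be installed"

# Example: a business location where fixed broadband can be installed
loc = BroadbandLocation(38.8899, -77.0091, residential=False,
                        business=True, serviceable=True)
```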

The commission declined to use street address data, at least until it is able “to determine the types of data and functionality that will be available through the procurement process.”



Broadband Breakfast Interview with BroadbandNow about Gigabit Coverage and Unreliable FCC Data

Broadband Breakfast Sponsor


December 27, 2020 – BroadbandNow’s new report on gigabit internet coverage in the United States picks up on aspects of the Federal Communications Commission’s unreliable broadband data.

FCC data shows that in 2016, only 4 percent of Americans had access to a gigabit connection. Fast forward to 2020, and – according to government statistics – 84 percent of Americans reportedly have that same luxury.

Except that it isn’t so.

In this interview, BroadbandNow Editor-in-Chief Tyler Cooper and Broadband Breakfast Editor and Publisher Drew Clark delve into the mechanics of understanding the availability of gigabit broadband networks.

As BroadbandNow notes in its report, progress on gigabit deployment in the U.S. has been greatly exaggerated. This is true of the state of the internet in general, as BroadbandNow has previously illustrated. However, the gigabit landscape is a subsection worth examining more closely, as it is the connectivity threshold that will be required to solve the speed and functionality divides of the near future.

This 18-minute question-and-answer session delves into the details: why gigabit connectivity is important, why the FCC is mismeasuring it, and how BroadbandNow has filled out our understanding of this benchmark level of broadband connectivity.

The full report is titled “Massive Gigabit ‘Coverage’ Increase Highlights How Unreliable Government Broadband Data Can Be.”

BroadbandNow Editor-in-Chief Tyler Cooper


