Broadband Data

Google Enters Free Speed Test Marketplace with Academic Collaboration

WASHINGTON, January 27, 2009 – Search giant Google is preparing to enter the market for free broadband speed tests, through a collaboration with the university research consortium PlanetLab, and the New America Foundation.

Drew Clark


Google is set to announce the collaboration on Wednesday, at an event at the New America Foundation in Washington, and keynoted by Vint Cerf, vice president and chief internet evangelist at Google.

Google follows BroadbandCensus.com, which launched in January 2008, in providing a free internet speed test to consumers.

BroadbandCensus.com’s speed test allows internet users to test actual speeds and compare them to the speeds promised by their internet providers.

Google and the other participants in the research consortium will be using the same speed test – the Network Diagnostic Tool of Internet2 – that was deployed by BroadbandCensus.com beginning in February 2008.

As with BroadbandCensus.com, Google apparently seeks to make the data publicly available, as a means of providing transparency into the operations of internet providers.

“Transparency has always been an essential component of the Internet’s success,” reads the press release announcing Wednesday’s event. “To remedy today’s information gap, researchers need resources to develop new analytical tools.”

“At this event, speakers will discuss the importance of advancing research in network measurement tools and introduce new developments that will benefit end-users, innovators, and policymakers,” reads the release.

The organizational framework for the speed tests and other network tools is to be called the Measurement Lab, and is expected to be hosted through PlanetLab at Princeton University.

Individuals also scheduled to speak at the event include Larry Peterson, chair of the Department of Computer Science at Princeton, and Princeton Professor Ed Felten, director of the Center for Information Technology Policy.

In addition to the NDT speed test, the Measurement Lab will allow internet users to use two additional tests: “Glasnost,” developed by the Max Planck Institute for Software Systems in Kaiserslautern and Saarbrücken, Germany, and the NPAD diagnostic service, Pathdiag, developed by the Pittsburgh Supercomputing Center.

According to the Max Planck Institute web site, Glasnost “creates a BitTorrent-like transfer between your machine and our server, and determines whether or not your [internet service provider] is limiting such traffic. This is a first step towards making traffic manipulation by ISPs more transparent to their customers.”
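The institute’s description does not specify how the determination is made. One plausible approach, assumed here purely for illustration, is to compare the throughput of the BitTorrent-like flow against a control flow of neutral data; the helper name and the 0.8 tolerance below are hypothetical, not Glasnost’s actual methodology.

```python
def throttling_suspected(bt_kbps: float, control_kbps: float,
                         tolerance: float = 0.8) -> bool:
    """Flag possible ISP interference when a BitTorrent-like transfer
    runs markedly slower than an otherwise identical control transfer.

    The control-flow comparison and the 0.8 tolerance are illustrative
    assumptions, not Glasnost's published cutoff.
    """
    if control_kbps <= 0:
        return False  # no baseline to compare against
    return bt_kbps < tolerance * control_kbps
```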

In fall 2007, tests conducted by the Electronic Frontier Foundation found that Comcast had been interfering with packet transfers by users of BitTorrent, a peer-to-peer file-sharing system. After a complaint, the FCC punished Comcast in August 2008.

Comcast’s system of network management – which the cable operator says it has discontinued – became Exhibit A in the battle over network neutrality, or the procedures by which broadband carriers can prioritize internet traffic.

Over the past several years, Google has opposed attempts by carriers to circumvent Net neutrality.

According to the Pittsburgh Supercomputing Center web site, NPAD’s Pathdiag “is designed to easily and accurately diagnose problems in the last-mile network and end-systems that are the most common causes of all severe performance degradation over long end-to-end paths.”

“Our goal is to make the test procedures easy enough and the report it generates clear enough to be suitable for end-users who are not networking experts,” the PSC web site continues.

Google, PlanetLab, New America Foundation and the software engineers that designed each of the three tools are involved in the new venture.

“We are listed as an advisory board” to the project, said Rich Carlson, a network engineer at Internet2. “Google is providing some rackspace. Google is providing the funding to purchase the hardware, and the network connectivity to connect [the tests] to the commercial internet.”

BroadbandCensus.com’s goal in allowing internet users to test their speeds is to provide a publicly available repository of data on local broadband speeds, prices, availability, reliability and competition.

In Taking the Broadband Census, individuals answer a brief questionnaire about their location, their carriers and the quality of service. They are also invited to comment on their carrier.

Information about all speed tests conducted on BroadbandCensus.com is publicly available immediately after the tests conclude, both by carrier and by ZIP code. All the content on BroadbandCensus.com is available under a Creative Commons Attribution Noncommercial License, allowing it to be republished and reused for free by academics and by local government agencies.

BroadbandCensus.com reported about its experience using the Internet2’s NDT speed test, and made a presentation about its findings at an Internet2/Joint Techs Conference in Lincoln, Neb., in July 2008.

Carlson said he believes that Google will also make its data publicly available. “My intention is to make that data available, as soon as possible.”

Carlson said that he and Internet2 believed it was important to “get the data collection started, and see what kind of community resources can be put to bear, to do some analysis” about internet traffic.

Other academic organizations, including Virginia Tech’s eCorridors Program, have also used the NDT speed test, which is open source software. Speed test data from eCorridors is also publicly available.

Google announced its interest in the speed test marketplace at the Supernova conference in June 2008, and the collaboration apparently took root after an invitation-only conference Google organized in Mountain View, Calif., in the summer of 2008.

More details are expected to be made available at the Wednesday New America Foundation event.

Google CEO Eric Schmidt is chairman of the New America Foundation, and Schmidt personally has made significant financial contributions to the think tank.

The Foundation has taken stances congruent with positions that Google has been pushing. For example, the think tank strongly advocated for the FCC to make vacant television channels available for unlicensed use by internet devices, a position endorsed by Google.

Editor’s Note

Internet2 provided technical direction about deploying a speed test to BroadbandCensus.com, and the eCorridors Program at Virginia Tech has provided encouragement and technical advice in taking the Broadband Census to a national audience. See BroadbandCensus.com supporters.

Drew Clark is the Editor and Publisher of BroadbandBreakfast.com and a nationally-respected telecommunications attorney at The CommLaw Group. He has closely tracked the trends in and mechanics of digital infrastructure for 20 years, and has helped fiber-based and fixed wireless providers navigate coverage, identify markets, broker infrastructure, and operate in the public right of way. The articles and posts on Broadband Breakfast and affiliated social media, including the BroadbandCensus Twitter feed, are not legal advice or legal services, do not constitute the creation of an attorney-client privilege, and represent the views of their respective authors.

New Broadband Mapping Fabric Will Help Unify Geocoding Across the Broadband Industry, Experts Say

Tim White


Photo of Lynn Follansbee from October 2019 by Drew Clark

March 11, 2021 – The Federal Communications Commission’s new “fabric” for mapping broadband service across America will not only help collect more accurate data, but also unify geocoding across the broadband industry, industry experts said during a Federal Communications Bar Association webinar Thursday.

Broadband service providers are not geocoding experts, said Lynn Follansbee of US Telecom, and they don’t know where all the people are.

The new fabric dataset is going to be very useful to get a granular look at what is and what is not served and to harmonize geocoding, she said.

AT&T’s Mary Henze agreed. “We’re a broadband provider, we’re not a GIS company,” she said. A unified geocode across the whole field will help a lot in finding missing spots in the company’s service area, she said.

The new Digital Opportunity Data Collection fabric is a major shift from the current Form 477 data that the FCC collects, which has been notoriously inaccurate for years. The effort to improve broadband mapping has been ongoing for years, and in 2019 US Telecom in partnership with CostQuest and other industry partners created the fabric pilot program.

That has been instrumental in leading to the new FCC system, panelists said. It is called a “fabric” dataset because it is made up of other datasets that interlace like fabric, Follansbee explained.

The fabric brings new challenges, especially for mobile providers, said Chris Wieczorek of T-Mobile. With a whole new set of reporting criteria to fill out the fabric, it will lead to confusion for consumers, and lots of work for the new task force, he said.

Henze said that without the fabric, closing the digital divide between those with broadband internet and those without has been impossible.

Digital Opportunity Data Collection expected to help better map rural areas

The new mapping can help in rural areas where the current geolocation for a resident may be a mailbox that is several hundred feet or farther away from the actual house that needs service, Follansbee said.

Rural areas aren’t the only places that will benefit, though. It can also help in dense urban areas where vertical location in a residential building is important to getting a good connection, said Wieczorek.

The fabric will also help from a financial perspective, because of the large amount of funding going around, said Charter Communications’ Christine Sanquist. The improved mapping can help identify where best to spend that funding for federal agencies, providers, and local governments, she said.

There is now more than $10 billion in new federal funding for broadband-related projects, including the $3.2 billion Emergency Broadband Benefit program in the Consolidated Appropriations Act of December 2020 and the new $7.6 billion Emergency Connectivity Fund in the American Rescue Plan that President Joe Biden signed into law Thursday.

The new FCC task force for implementing the new mapping system was created in February 2021 and is led by Jean Kiddoo at the FCC. No specific dates have been set yet for getting the system operational.

GOP Grills FCC on Improving Broadband Mapping Now, as Agency Spells Out New Rules

Tim White


Photo of former FCC Chairman Ajit Pai speaking at the March 2019 launch of US Telecom’s mapping initiative by Drew Clark

March 11, 2021 – Federal Communications Commission Acting Chairwoman Jessica Rosenworcel has changed her stance on the timeline for updating the FCC’s broadband mapping data, and several House and Senate Republicans are wondering why.

“On March 10, 2020, you testified before the Senate Appropriations Committee’s Subcommittee on Financial Services and General Government that the FCC could ‘radically improve’ its broadband maps ‘within three-to-six months,’” read the letter, sent Monday to Rosenworcel from the GOP delegation.

“You repeated that statement the next day, testifying before the House Appropriations Committee’s FSGG Subcommittee that the agency could fix its maps in ‘just a few months.’”

“You can imagine our surprise and disappointment when the FCC recently suggested the new maps would not be ready until 2022,” the letter read, referring to the FCC’s open meeting on February 17, 2021.

“The United States faces a persistent digital divide. The pandemic has made connectivity more important than ever, yet millions of Americans continue to live without high-speed broadband. Any delay in creating new maps would delay funding opportunities for unserved households,” the letter read.

The letter requests Rosenworcel’s response by March 22, 2021, including why she changed the timeline, details on the schedule for developing new maps, and how the FCC plans to spend the $98 million in funding provided for the updated mapping in the Consolidated Appropriations Act passed in December 2020.

Digital Opportunity Data Collection order spells out rules for mapping

On January 19, 2021, as the final order before FCC Chairman Ajit Pai left his position, the FCC announced new rules for mobile and fixed broadband providers to submit data.

The agency began collecting data from service providers in 1996 with the Telecommunications Act, and at that time considered broadband connection speed to be at least 200 kilobits per second (Kbps).

While internet speeds have greatly improved since then, the January 19 order still uses the 200 Kbps speed as at least one benchmark measurement 25 years later.

The fact that many Americans still lack access to modern, high-speed broadband has become increasingly apparent during the COVID-19 pandemic, as many children lack a consistent connection to the internet for remote learning.

Improving broadband mapping has been a major obstacle for the FCC for several years. Since the Telecommunications Act became law and the commission began gathering data on its Form 477, further efforts have been made to improve that data, including the National Broadband Plan and the National Broadband Map in 2010 and 2011, but many say the maps still need considerable work.

In August 2019 the FCC launched this new mapping initiative, dubbed “Digital Opportunity Data Collection.” It shifts how the agency gathers data from service providers using Form 477: providers will now be required to submit more granular information.

Then, in March 2020, Congress passed the Broadband Deployment Accuracy and Technological Availability (DATA) Act into law, further improving the way the FCC collects broadband mapping data. It wasn’t until the consolidated appropriations bill in December that Congress appropriated funds for the mapping effort.

New order returns to August 2019 principles

Under the new order, fixed broadband providers must submit data for services offered, specifying if they are for residents and/or businesses.

The order states: “This represents a change from the Commission’s proposal in the Second Order and Third Further Notice to collect data separately on residential and on business-and-residential offerings. We find that the approach we adopt will provide us with a more complete picture of the state of broadband deployment.”

Data for non-mass-market services does not need to be filed, because the FCC says such services do not fall within the scope of the Broadband DATA Act. Services that will not need to be reported include those purchased by hospitals, schools, libraries, government entities, and other enterprise customers.

The order requires providers to report connection speeds for broadband internet access. The FCC considers a download speed faster than 25 megabits per second (Mbps) and an upload speed faster than 3 Mbps as “advanced telecommunications technology.” That also matches the speed threshold on Form 477, at least since 2015.

Companies must report the maximum advertised speeds in the geographic area if they’re higher than 25/3 Mbps. Although the median fixed broadband speed is much higher than that across America, as reported by Ookla for the fourth quarter of 2020, millions of Americans still lack quality access to the internet.

When providers report their speeds to the FCC under the new order, they must specify in two tiers the connection speed if it falls below the 25/3 Mbps threshold. The first tier is for speeds between 200 kbps and 10/1 Mbps, and the second tier falls between 10/1 Mbps and 25/3 Mbps.
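The reporting rules above amount to a simple classification of each reported speed. The sketch below is illustrative only; the function name and return strings are assumptions, and the order itself governs edge cases at the tier boundaries.

```python
def report_tier(down_mbps: float, up_mbps: float) -> str:
    """Classify a reported fixed-broadband speed under the order's
    two sub-25/3 Mbps reporting tiers (illustrative sketch only)."""
    if down_mbps >= 25 and up_mbps >= 3:
        # Reported as the maximum advertised speed for the area
        return "at or above 25/3 Mbps"
    if down_mbps >= 10 and up_mbps >= 1:
        return "tier 2: between 10/1 Mbps and 25/3 Mbps"
    if down_mbps >= 0.2:  # 200 kbps, the 1996-era benchmark
        return "tier 1: between 200 kbps and 10/1 Mbps"
    return "below 200 kbps"
```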

With the new order, fixed wireless providers that submit propagation maps are now required to also submit geographic coordinates—latitude and longitude—for their base stations that provide broadband to their consumers.

Previously, providers were required to submit data only on the spectrum used, the height of the base station and the type of radio technology. The order explains that verifying the geographic coordinates of base stations will allow for more accurate mapping. Because geographic coordinates may be sensitive “for business or national security reasons,” the FCC will treat this new data as presumptively confidential.

Latency and signal strength information now required

The new order requires fixed broadband access providers to submit information on latency in their semiannual Digital Opportunity Data Collection filing. The information must detail whether the network round-trip latency for the maximum speed offered in a geographic area is at or below 100 milliseconds.

The agency chose the 100 millisecond threshold because it aligns with the requirement for the Connect America Fund Phase II program, which subsidizes companies that provide broadband access in unserved areas.
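The order does not prescribe a measurement methodology for latency; as a rough illustration only, round-trip time can be estimated by timing TCP handshakes and comparing the median against the 100 ms threshold. The helper names and the handshake-timing approach are assumptions, not anything specified by the FCC.

```python
import socket
import statistics
import time

def round_trip_ms(host: str, port: int = 443, samples: int = 5) -> float:
    """Estimate round-trip latency by timing TCP handshakes.
    A crude stand-in for a real measurement methodology."""
    times = []
    for _ in range(samples):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=2):
            pass
        times.append((time.perf_counter() - start) * 1000)
    return statistics.median(times)

def meets_threshold(latency_ms: float) -> bool:
    # The order asks whether round-trip latency is at or below 100 ms,
    # matching the Connect America Fund Phase II requirement.
    return latency_ms <= 100.0
```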

Mobile broadband providers are now required to submit signal-strength “heat maps” showing reference signal received power and received signal strength indicator. Both of these metrics are ways of measuring 4G LTE and 5G mobile signal strength.

Covering only outdoor strength, the maps must include data for both pedestrians and drivers. Mobile providers must also submit 3G maps for areas without access to 4G or 5G connections. Due to various factors that affect signal strength, the FCC has not set a floor for minimum signal strength.

Additionally, all mobile and fixed broadband providers must have each submission certified for accuracy by a qualified engineer, in addition to the corporate officer certification. The engineer must be employed by the service provider and be directly responsible for, or have knowledge of, the submitted maps.

FCC verification processes, and the deployment of a broadband fabric

The order permits the FCC’s Office of Economics and Analytics and Wireless Telecommunications Bureau to request additional information from mobile service providers, consisting of either infrastructure information or on-the-ground test data for the areas where coverage is provided, to verify coverage claims. The companies must respond within 60 days of the request.

The order also directs OEA to verify mobile on-the-ground data submitted by state, local, and Tribal government entities that are responsible for mapping broadband service coverage. It also permits OEA to similarly verify data from third parties if that data is in the public interest for developing the coverage maps or to verify other data as submitted by providers.

The order also adopted a previous suggestion to implement systems for consumers, governmental or other entities to challenge coverage maps for both fixed broadband and mobile connections, disputing the data submitted by providers.

US Telecom and WISPA, trade associations representing telecom and wireless providers in the United States, have been working with CostQuest Associates on a “fabric” mapping system for years. The CostQuest system touts considerable improvement over the FCC’s current broadband mapping, and the Fabric is based on granular, address-level data.

In this new order, the FCC took the first steps toward implementing such a system by adopting the definition of a “location” as a residential or business location at which fixed broadband access service is or can be installed, identified using geographic coordinates.

The commission declined to use street address data, at least until it is able “to determine the types of data and functionality that will be available through the procurement process.”

Broadband Breakfast Interview with BroadbandNow about Gigabit Coverage and Unreliable FCC Data

Broadband Breakfast Sponsor


December 27, 2020 – Broadband Now’s new report on gigabit internet coverage in the United States picks up on aspects of the Federal Communications Commission’s unreliable broadband data.

FCC data shows that in 2016, only 4 percent of Americans had access to a gigabit connection. Fast forward to 2020, and – according to government statistics – 84 percent of Americans reportedly have that same luxury.

Except that it isn’t so.

In this interview with Broadband Now Editor-in-Chief Tyler Cooper, he and Broadband Breakfast Editor and Publisher Drew Clark delve into the mechanics of understanding the availability of gigabit broadband networks.

As Broadband Now notes in its report, progress on gigabit deployment in the U.S. has been greatly exaggerated. This is true for the state of the internet in general, as Broadband Now previously illustrated. However, the gigabit landscape is a subsection worth examining more closely, as it is the connectivity threshold that will be required to solve the speed and functionality divides of the near future.

This 18-minute question-and-answer delves into the details: Why gigabit connectivity is important, why the FCC is mismeasuring it, and how Broadband Now has filled out our understanding of this benchmark level of broadband connectivity.

The full report is titled, “Massive Gigabit “Coverage” Increase Highlights How Unreliable Government Broadband Data Can Be.”

Broadband Now Editor-in-Chief Tyler Cooper
