ARLINGTON, Va., September 28, 2009 – “Beer and Broadband Mapping” was the informal name given to the spirited discussion that capped the first day of the Telecommunications Policy Research Conference here at George Mason School of Law on Friday, September 25.
Blair Levin, executive director of the Federal Communications Commission’s Omnibus Broadband Initiative and a keynote panelist at the Friday evening event, joked that the real intelligence at the conference would be found among those spending their Friday night talking about broadband data.
The discussion, which was sponsored by The Benton Foundation, BroadbandCensus.com and the New America Foundation, began at around 8:30 p.m., and lasted for nearly an hour and a half. Many notable academics from TPRC, and from the Obama administration, attended the session.
Charles Benton, chairman of the Benton Foundation, began the discussion by noting the importance of broadband data disclosure, which he had emphasized in his opening statement at the U.S. Broadband Coalition on Thursday, September 24.
Drew Clark, executive director of BroadbandCensus.com, followed by presenting the company’s public and transparent map of Columbia, South Carolina, which shows broadband speeds, technologies, and providers. It is available at BroadbandCensusMaps.com. Clark referenced the major policy change of August 7, when the Notice of Funds Availability called for disclosure of carrier data at the census block level. He also discussed the importance of collecting actual speeds in addition to advertised speeds, and pointed out that actual speed data was not a requirement of the NoFA. Pricing data, he noted, is unfortunately also not required under the NoFA.
Michael Calabrese, vice president and director of the Wireless Future Program at the New America Foundation, began his opening remarks by stating that the Recovery Act and the Broadband Data Improvement Act set aside up to $350 million for state mapping data. There is much more data available for researchers and policy makers to layer over the inventory data, he said, but only if they have the proper funding. Examples include actual household speeds and data on connection reliability.
Because the language in the act is vague and requires the “mapping of broadband capability,” the money that does not go to the states for the collection of data should be distributed to the academic community for more in-depth and layered collection efforts, Calabrese said. An inventory of the public-sector fiber that is currently available is one of the possibilities under this broad “capabilities” language. Academics and other organizations need to ask what type of data they need – above and beyond the existing FCC Form 477 – in order to create a complete national broadband plan.
Before opening up the discussion, Clark, Benton and Calabrese reminded the crowd that there was no agenda behind the discussion, and that everyone should feel free to put their ideas and thoughts on the table. The following are some notes from the discussion.
The first couple of questions revolved around the carriers’ identities and whether there were any trade secrets, or whether the release of carrier data at the census block level would create any confidentiality concerns. Everyone agreed on the need for real connection speeds, as opposed to advertised speeds.
Others also wanted to find out whether there was a transparent HTTP cache, DNS manipulation, subtle traffic manipulation, jitter, or packet loss. The real question for the panel was: what data is truly necessary?
BroadbandCensus.com has been working along with the New America Foundation’s Measurement Lab to collect specific consumer data. Through the NDT tests deployed by both organizations, they have been able to gather valuable information from consumers, but are still trying to link that data with specific locations and the carriers themselves.
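Several of the metrics raised in the discussion can be approximated from repeated round-trip probes. The Python sketch below is a minimal illustration using timed TCP connections; it is not the NDT protocol, which M-Lab's servers implement with much richer throughput and diagnostic testing, and the host and port are whatever the caller supplies.

```python
import socket
import statistics
import time

def probe_rtt(host: str, port: int = 443, samples: int = 10, timeout: float = 2.0):
    """Time repeated TCP connections to a host.

    Returns (rtts_ms, loss_fraction), where loss_fraction is the share
    of probes that failed to connect within the timeout.
    """
    rtts, failures = [], 0
    for _ in range(samples):
        start = time.perf_counter()
        try:
            with socket.create_connection((host, port), timeout=timeout):
                rtts.append((time.perf_counter() - start) * 1000.0)
        except OSError:
            failures += 1
    return rtts, failures / samples

def jitter_ms(rtts):
    """Jitter as the mean absolute difference between consecutive RTT samples."""
    if len(rtts) < 2:
        return 0.0
    return statistics.mean(abs(a - b) for a, b in zip(rtts, rtts[1:]))
```

Linking numbers like these to a specific location and carrier, the harder problem the groups describe, requires data the probe itself cannot supply.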
Other attendees asked: how important is true pricing data? How does bundling affect the ability to obtain real prices? Prices also seem to change depending on household credit history and other similar factors, not just geographical community.
The members at the meeting were also interested in the state of broadband competition. They wanted to be able to combine usage and availability to see if competition works. There should be a national broadband data warehouse that includes data collected from the federal government, data from independent sources and crowd sourcing, several individuals said. This warehouse, however, brought up a concern about the coherence and standardization between data types. Data from different sources need to be comparable in order to be properly analyzed.
More data must also be collected on the demand side. What are the different barriers to adoption? A largely overlooked factor in the lack of adoption is the low level of computer ownership. It would also be helpful to know which organizations are addressing demand-side issues, and how effective they have been. Furthermore, in order to address the issues of uptake and how communities have benefited from broadband, researchers must look beyond household data. To learn what productivity advances the deployment of technology has created, there needs to be a survey of the business districts in the cities being served.
Finally, everyone agreed that they want to hear more positive stories. One way to do this is time-series data collection. Specific dates make it easy to track change and see the progress of areas that have been underserved and unserved. Where has there been success, and what can be learned from those areas?
The meeting ended on a high note, with all participants agreeing that joint action and collaboration are needed to make sure the government collects the correct data for a complete national broadband strategy.
BroadbandCensus.com was launched in January 2008, and uses “crowdsourcing” to collect the Broadband SPARC: Speeds, Prices, Availability, Reliability and Competition. The news on BroadbandCensus.com is produced by Broadband Census News LLC, a subsidiary of Broadband Census LLC that was created in July 2009.
A recent split of operations helps to clarify the mission of BroadbandCensus.com. Broadband Census Data LLC offers commercial broadband verification services to cities, states, carriers and broadband users. Created in July 2009, Broadband Census Data LLC produced a joint application in the NTIA’s Broadband Technology Opportunities Program with Virginia Tech’s eCorridors Program. In August 2009, BroadbandCensus.com released a beta map of Columbia, South Carolina, in partnership with Benedict-Allen Community Development Corporation.
Broadband Census News LLC offers daily and weekly reporting, as well as the Broadband Breakfast Club. The Broadband Breakfast Club has been inviting top experts and policy-makers to share breakfast and perspectives on broadband technology and internet policy since October 2008. Both Broadband Census News LLC and Broadband Census Data LLC are subsidiaries of Broadband Census LLC, and are organized in the Commonwealth of Virginia.
New Broadband Mapping Fabric Will Help Unify Geocoding Across the Broadband Industry, Experts Say
March 11, 2021 – The Federal Communications Commission’s new “fabric” for mapping broadband service across America will not only help collect more accurate data, but also unify geocoding across the broadband industry, industry experts said during a Federal Communications Bar Association webinar Thursday.
Broadband service providers are not geocoding experts, said Lynn Follansbee of US Telecom, and they don’t know where all the people are.
The new fabric dataset is going to be very useful to get a granular look at what is and what is not served and to harmonize geocoding, she said.
AT&T’s Mary Henze agreed. “We’re a broadband provider, we’re not a GIS company,” she said. A unified geocode across the whole field will help a lot in finding missing spots in the company’s service area, she said.
The new Digital Opportunity Data Collection fabric is a major shift from the current Form 477 data that the FCC collects, which has been notoriously inaccurate. The effort to improve broadband mapping has been ongoing for years, and in 2019 US Telecom, in partnership with CostQuest and other industry partners, created the fabric pilot program.
That pilot has been instrumental in leading to the new FCC system, panelists said. It is called a “fabric” dataset because it is made up of other datasets that interlace like fabric, Follansbee explained.
The fabric brings new challenges, especially for mobile providers, said Chris Wieczorek of T-Mobile. A whole new set of reporting criteria to fill out the fabric will mean confusion for consumers, and lots of work for the new task force, he said.
Henze said that without the fabric, closing the digital divide between those with broadband internet and those without has been impossible.
Digital Opportunity Data Collection expected to help better map rural areas
The new mapping can help in rural areas where the current geolocation for a resident may be a mailbox that is several hundred feet or farther away from the actual house that needs service, Follansbee said.
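That offset is easy to quantify: given latitude/longitude pairs for the geocoded mailbox and the structure itself, the great-circle (haversine) distance gives the gap. A minimal Python sketch, with hypothetical coordinates chosen purely for illustration:

```python
import math

def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in meters between two latitude/longitude points."""
    r = 6371000.0  # mean Earth radius in meters
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical rural parcel: the mailbox geocode at the road versus the house.
mailbox = (38.0300, -78.5000)
house = (38.0310, -78.5010)
gap_feet = haversine_m(*mailbox, *house) * 3.28084  # meters to feet
```

Here the two points differ by 0.001 degrees in each direction, which works out to a few hundred feet, precisely the kind of offset that can place a serviceable home in the wrong census block or outside a claimed coverage area.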
Rural areas aren’t the only places that will benefit, though. It can also help in dense urban areas where vertical location in a residential building is important to getting a good connection, said Wieczorek.
The fabric will also help from a financial perspective, because of the large amount of funding going around, said Charter Communications’ Christine Sanquist. The improved mapping can help identify where best to spend that funding for federal agencies, providers, and local governments, she said.
There is now more than $10 billion in new federal funding for broadband-related projects, including the recent $3.2 billion Emergency Broadband Benefit program in the Consolidated Appropriations Act of December 2020 and the new $7.6 billion Emergency Connectivity Fund in the American Rescue Plan that President Joe Biden signed into law Thursday.
The new FCC task force for implementing the new mapping system was created in February 2021, and is led by Jean Kiddoo at the FCC. No specific dates have been set yet for getting the system operational.
GOP Grills FCC on Improving Broadband Mapping Now, as Agency Spells Out New Rules
March 11, 2021 – Federal Communications Commission Acting Chairwoman Jessica Rosenworcel has changed her stance on the timeline for updating the FCC’s broadband mapping data, and several House and Senate Republicans are wondering why.
“On March 10, 2020, you testified before the Senate Appropriations Committee’s Subcommittee on Financial Services and General Government that the FCC could ‘radically improve’ its broadband maps ‘within three-to-six months,’” read the letter, sent Monday to Rosenworcel from the GOP delegation.
“You repeated that statement the next day, testifying before the House Appropriations Committee’s FSGG Subcommittee that the agency could fix its maps in ‘just a few months.’”
“You can imagine our surprise and disappointment when the FCC recently suggested the new maps would not be ready until 2022,” the letter read, referring to the FCC’s open meeting on February 17, 2021.
“The United States faces a persistent digital divide. The pandemic has made connectivity more important than ever, yet millions of Americans continue to live without high-speed broadband. Any delay in creating new maps would delay funding opportunities for unserved households,” the letter read.
The letter requests Rosenworcel’s response by March 22, 2021, addressing, among other questions, why she changed the timeline, details on the schedule for developing new maps, and how the FCC plans to spend the $98 million provided for the updated mapping as part of the Consolidated Appropriations Act that passed in December 2020.
Digital Opportunity Data Collection order spells out rules for mapping
On January 19, 2021, as the final order before FCC Chairman Ajit Pai left his position, the FCC announced new rules for mobile and fixed broadband providers to submit data.
The agency began collecting data from service providers in 1996 with the Telecommunications Act, and at that time considered broadband connection speed to be at least 200 kilobits per second (Kbps).
While internet speeds have greatly improved since then, the January 19 order still uses the 200 Kbps speed as at least one benchmark measurement 25 years later.
The fact that many Americans still lack access to modern, high-speed broadband has become increasingly apparent during the COVID-19 pandemic, as many children lack a consistent connection to the internet for remote learning.
Improving broadband mapping has been a major obstacle for the FCC for several years. Since the Telecommunications Act became law and the commission began gathering data on its Form 477, further efforts have been made to improve that data, including the National Broadband Plan and the National Broadband Map in 2010 and 2011, but many say that the maps still need considerable work.
In August 2019 the FCC launched this new mapping initiative, dubbed “Digital Opportunity Data Collection.” It shifts how the agency gathers data from service providers using Form 477: they will now be required to provide more granular information.
Then, in March 2020, Congress passed the Broadband Deployment Accuracy and Technological Availability (DATA) Act into law. It further improves the way the FCC collects broadband mapping data. It wasn’t until the consolidated appropriations bill in December that Congress appropriated funds for the mapping effort.
New order returns to August 2019 principles
Under the new order, fixed broadband providers must submit data for services offered, specifying if they are for residents and/or businesses.
The order states: “This represents a change from the Commission’s proposal in the Second Order and Third Further Notice to collect data separately on residential and on business-and-residential offerings. We find that the approach we adopt will provide us with a more complete picture of the state of broadband deployment.”
Data for non-mass-market services does not need to be filed, because the FCC says such services fall outside the scope of the Broadband DATA Act. Services that will not need to be reported include those purchased by hospitals, schools, libraries, government entities, and other enterprise customers.
The order requires providers to report connection speeds for broadband internet access. The FCC considers a download speed of at least 25 megabits per second (Mbps) and an upload speed of at least 3 Mbps to be “advanced telecommunications capability.” That also matches the speed threshold on Form 477, at least since 2015.
Companies must report the maximum advertised speeds in the geographic area if they’re higher than 25/3 Mbps. Although the median fixed broadband speed is much higher than that across America, as reported by Ookla for the fourth quarter of 2020, millions of Americans still lack quality access to the internet.
When providers report their speeds to the FCC under the new order, they must specify the connection speed in two tiers if it falls below the 25/3 Mbps threshold. The first tier is for speeds between 200 Kbps and 10/1 Mbps, and the second tier falls between 10/1 Mbps and 25/3 Mbps.
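As a sketch, that tiering could be expressed as follows in Python. The treatment of asymmetric offerings (say, 12 Mbps down but only 0.5 Mbps up) is an assumption here; the order itself governs the exact classification.

```python
def report_tier(down_mbps: float, up_mbps: float) -> str:
    """Slot a reported maximum speed into the order's reporting tiers.

    Assumption: an offering counts toward a tier only if BOTH its
    download and upload speeds meet that tier's floor.
    """
    if down_mbps >= 25 and up_mbps >= 3:
        return "at or above 25/3 Mbps"
    if down_mbps >= 10 and up_mbps >= 1:
        return "tier 2 (10/1 to 25/3 Mbps)"
    if down_mbps >= 0.2:
        return "tier 1 (200 Kbps to 10/1 Mbps)"
    return "below 200 Kbps"
```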
With the new order, fixed wireless providers that submit propagation maps are now required to also submit geographic coordinates—latitude and longitude—for their base stations that provide broadband to their consumers.
Previously, providers were required to submit data only on the spectrum used, the height of the base station, and the type of radio technology. The order explains that verifying the geographic coordinates of base stations as well will allow for more accurate mapping. Because geographic coordinates may be sensitive “for business or national security reasons,” the FCC will consider this new data presumptively confidential.
Latency and signal strength information now required
The new order requires fixed broadband access providers to submit information on latency in their semiannual Digital Opportunity Data Collection filing. The information must detail whether the network round-trip latency for the maximum speed offered in a geographic area is at or below 100 milliseconds.
The agency used the 100 millisecond threshold because it aligns with the requirement for the Connect America Fund Phase II program, which subsidizes companies that provide broadband access in areas where it is otherwise unavailable.
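In code, checking a set of measured round-trip times against that threshold might look like the sketch below. The order does not specify which statistic is compared, so the use of the median here is an assumption.

```python
import statistics

LATENCY_THRESHOLD_MS = 100.0  # aligns with the Connect America Fund Phase II requirement

def meets_latency_threshold(rtt_samples_ms) -> bool:
    """Return True if round-trip latency is at or below the 100 ms threshold.

    Assumption: the median of the measured samples is the statistic
    compared; the order itself does not spell this out.
    """
    return statistics.median(rtt_samples_ms) <= LATENCY_THRESHOLD_MS
```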
Mobile broadband providers are now required to submit signal-strength “heat maps” showing reference signal received power and received signal strength indicator. Both of these metrics are ways of measuring 4G LTE and 5G mobile signal strength.
Covering only outdoor strength, the maps must include data for both pedestrians and drivers. Mobile providers must also submit 3G maps for areas without access to 4G or 5G connections. Due to various factors that affect signal strength, the FCC has not set a floor for minimum signal strength.
Additionally, all mobile and fixed broadband providers must have each submission certified for accuracy by a qualified engineer, in addition to the corporate officer certification. The engineer must be employed by the service provider and be directly responsible for, or have knowledge of, the submitted maps.
FCC verification processes, and the deployment of a broadband fabric
The order permits the FCC’s Office of Economics and Analytics and Wireless Telecommunications Bureau to request additional information from mobile service providers, either infrastructure information or on-the-ground test data for the areas where coverage is provided, in order to verify submissions. Companies must respond within 60 days of the request.
The order also directs OEA to verify mobile on-the-ground data submitted by state, local, and Tribal government entities that are responsible for mapping broadband service coverage. It also permits OEA to similarly verify data from third parties if that data is in the public interest for developing the coverage maps or to verify other data as submitted by providers.
The order also adopted a previous suggestion to implement systems for consumers, governmental or other entities to challenge coverage maps for both fixed broadband and mobile connections, disputing the data submitted by providers.
US Telecom and WISPA, trade associations representing telecom and wireless providers in the United States, have been working with CostQuest Associates on a “fabric” mapping system for years. The CostQuest system touts considerable improvement over the FCC’s current broadband mapping. The Fabric is based on granular address-level data.
In this new order, the FCC took the first steps toward implementing such a system by adopting the definition of a “location” as a residential or business location at which fixed broadband access service is or can be installed, using geographic coordinates.
The commission declined to use street address data at least until it is able “to determine the types of data and functionality that will be available through the procurement process.”
Broadband Breakfast Interview with BroadbandNow about Gigabit Coverage and Unreliable FCC Data
December 27, 2020 – Broadband Now’s new report on gigabit internet coverage in the United States picks up on aspects of the Federal Communications Commission’s unreliable broadband data.
FCC data shows that in 2016, only 4 percent of Americans had access to a gigabit connection. Fast forward to 2020, and – according to government statistics – 84 percent of Americans reportedly have that same luxury.
Except that it isn’t so.
In this interview with Broadband Now Editor-in-Chief Tyler Cooper, he and Broadband Breakfast Editor and Publisher Drew Clark delve into the mechanics of understanding the availability of gigabit broadband networks.
As Broadband Now notes in its report, progress on gigabit deployment in the U.S. has been greatly exaggerated. This is true for the state of the internet in general, as Broadband Now previously illustrated. However, the gigabit landscape is a subsection worth examining more closely, as it is the connectivity threshold that will be required to solve the speed and functionality divides of the near future.
This 18-minute question-and-answer delves into the details: Why gigabit connectivity is important, why the FCC is mismeasuring it, and how Broadband Now has filled out our understanding of this benchmark level of broadband connectivity.
The full report is titled, “Massive Gigabit ‘Coverage’ Increase Highlights How Unreliable Government Broadband Data Can Be.”