WASHINGTON, February 10, 2010 – The head of the Federal Communications Commission’s internal “think tank” said Tuesday that the agency was taking a fresh look at all aspects of its broadband data-collection processes: collection, validation and analysis, and dissemination.
Speaking to a roomful of panelists and telecom officials who attended Tuesday’s Broadband Breakfast Club in spite of the snow, Office of Strategic Planning Chief Paul de Sa said that the agency was sensitive to the need to balance proprietary information with the desire for transparency in its data-collection processes.
In a keynote on the topic of “Setting the Table for the National Broadband Plan: Collecting and Using Broadband Data,” de Sa began by asking questions that frame the work of the agency on broadband data.
He outlined aspects of the data collection process: the staff must ask themselves whether they are collecting the right data, they must validate and analyze the data, and then create a process to disseminate the data. He added that there is also an inherent struggle between the principle of transparency and protecting proprietary data.
“We are not trying to solve problems by creating immediate policy,” said de Sa, after outlining aspects of the agency’s data-collection processing. Effective use of broadband data involves defining the problem to be solved, coming up with a hypothesis, analyzing the data and either creating policy or reassessing the hypothesis.
Some major policy issues to be addressed in the broadband plan, which will be released on March 17, 2010, include deployment, adoption, and choice and competition.
The data that the FCC is looking at to address these issues include the National Telecommunications and Information Administration’s mapping efforts, the FCC’s Form 477 data, which has been collected from carriers since 2000, periodic surveys, data from American Indian tribes, and crowdsourced data.
Drew Clark, Editor and Executive Director of BroadbandBreakfast.com, moderated the discussion and began by asking the panelists to explain what the Form 477 database is, and to discuss whether it was still a meaningful source of broadband information.
Michelle Connolly, associate professor of economics at Duke University, explained that the FCC requires broadband providers to report the number of subscribers within a given Zip code. Connolly also said that the requirement to provide such data was not enforced.
Connolly said that the Form 477 database was the only systematic nationwide data set. Additionally, the information is useful for assessing historical trends.
Connolly proposed that broadband providers be given identity numbers so that their coverage area over time could be tracked without revealing their identities.
Jeffery Campbell, senior director of technology and government affairs at Cisco Systems, said that the data about providers was readily discoverable. A consumer can go to any internet service provider’s web site, plug in their Zip code and see if they can receive service.
The more important question to focus on, said Campbell, is who does not have broadband at the household level.
But Connolly said that the cost of including address-level data would be too great a burden upon providers.
Broadband mapping expert Brian Webster said that measurements conducted using the U.S. national grid system allow coverage areas to be shown at levels even finer than the Census block without the need for cooperation from carriers. Address-level data, he said, was frequently imprecise.
(Editor’s Note: Webster and his WirelessMapping.com firm are partners with BroadbandCensus.com, the sister company of BroadbandBreakfast.com, in providing states, counties and broadband stimulus applicants with Census-block-level information about broadband providers, technologies, speeds and prices.)
John Horrigan, director of consumer research for the FCC’s omnibus broadband initiative, addressed the challenges of constructing and merging consistent data sets.
Horrigan said that expanding Form 477 would be analytically useful: “if there is harmonization in data sets then there is a position for rich data analysis.” Horrigan went on to discuss the survey of non-adopters and small businesses conducted in the fall of 2009. The data will be released soon after the national broadband plan, said Horrigan.
Clark asked the panelists their perspectives on coordinating with the NTIA on broadband data.
De Sa said that data collected from disparate sources could be merged into the same format; it will just take a lot of comparisons and hard work.
Connolly said that Congress went about broadband data collection in a strange way. “Why not have one company do this job properly as opposed to having a bunch of them do it wrong?”
Campbell agreed that it makes more sense to collect this data once and collect it the right way. “We need to rethink what we need to know.” He believes that the right data is out there, and efforts should be made to work with providers to find out how they are collecting their data.
Webster added that many private-sector data sets with valuable broadband information are available for purchase.
A member of the audience who represented a small wireless provider in a rural state worried that the disclosure of broadband data could spur anticompetitive behavior. He said that his company had submitted Form 477 data but had not complied with the NTIA’s mapping efforts for fear that such data would be made public.
Connolly and Campbell countered that withholding such data is unnecessary, because competing telephone incumbents can already observe the same information in the marketplace.
Connolly – leaving early to catch a flight back to North Carolina – ended her comments by posing a question to the panel about the importance of collecting data about small businesses, not just consumers.
Horrigan agreed: it is very useful to survey businesses and find out how small businesses are using broadband. Webster noted that when surveying businesses it is important to account for differences between industries, as some types of businesses are more broadband-driven than others.
Horrigan stressed the great research being done in the academic community. For these researchers, having low-cost access to data is very important.
Campbell said, however, that “we need data with a clear public purpose [focused on] whether consumers have affordable quality broadband available to them.”
If the FCC and the NTIA stray from such a clear public interest focus, and instead seek information that will be primarily of benefit to private-sector providers, then very few broadband providers will be willing to provide broadband data, he said.
The video-recording of the February 9 Broadband Breakfast Club will be available on BroadbandBreakfast.com, for FREE, within the week.