Evidence-Based Policy Making is Particularly Important in Managing Radio Frequency Spectrum
Liana Sowa
October 23, 2020 – Evidence-based policy making needs to be framed by the correct questions, agreed panelists at the Silicon Flatirons event on October 13 and 15.
In the first panel, “Evidence-based policy making in perspective,” Adam Scott, director general of spectrum policy at Innovation, Science and Economic Development in Canada, contrasted the questions, “Should we make broadband a human right?” with, “What are the social and economic benefits of connecting a community that hasn’t been connected yet?”
He asserted that the first question is more philosophical and doesn’t directly ask for data, while the second question can be answered very succinctly with data.
Marrying data and decision-making is the best way to think about evidence-based policy making, said Renee Gregory, senior regulatory affairs advisor at Google and moderator of a session on spectrum sharing. She was speaking about work by Thyaga Nandagopal of the National Science Foundation, who had discussed ways to innovate on the current spectrum allocation model.
Additionally, evidence-based policy making does not rely on data gathered to answer funded questions, said Blair Levin, nonresident senior fellow of the Metropolitan Policy Program at the Brookings Institution.
Levin did allow that making room for innovation sometimes made it difficult to base policies on evidence. He cited the theoretical work the FCC did when it designed the spectrum auctions, pointing out that the work wasn't evidence-based because nothing like it had been done before.
How legislators view evidence-based data
Kate O’Connor, member of the chief telecom counsel’s office for the U.S. House Energy and Commerce Committee, said that in a world of information overload, nearly every person could find information to support their position. Therefore, data needed to be considered holistically.
O’Connor said the communications space was unique because it was so new. The spectrum crunch is a lot different than in the past, and the private sector has more resources than the government in some cases.
There’s bipartisan consensus that the FCC hasn’t done a good job of collecting data, said Levin. He suggested having real experts used to looking at data examine the types of data needed for effective spectrum policy.
Scott Wallsten, president and senior fellow at the Technology Policy Institute, said that many of the FCC's data collection methods are antiquated. He said the agency should supplement its data with surveys like those conducted by the Bureau of Labor Statistics, and added that it would be nice to see the two agencies work together better. He also advocated for transparency in data submission, saying transparency allowed for contextual data interpretation.
Giulia McHenry, chief of the office of economics and analytics at the Federal Communications Commission, agreed that transparency helps to remove biases when examining evidence.
Others stress the need for enforcement in spectrum management
Dale Hatfield, spectrum policy initiative co-director and distinguished advisor at Silicon Flatirons, said in a later event that evidence-based policy making could prove futile without proper enforcement, and said the FCC should delegate some of its statutory power to private industry.
The better the hypothesis, the lower the cost and burden on a company like Hawkeye to help, said Chris Tourigny, electronics engineer at the Federal Aviation Administration, at the “Spectrum Sharing Policy among Active and Passive Service” panel on Thursday.
Panelists Jennifer Manner, senior vice president of regulatory affairs at EchoStar Corporation, and Ashley Zauderer, program director in the division of astronomical sciences at NSF, emphasized the need for being open to amending data along the way.
The importance of continued communication in policy making was also discussed. Stefanie Tompkins, vice president for research and technology transfer at the Colorado School of Mines, shared that years ago her team worked with a communications company to get a software package for multipath technology.
They found many of their signals bouncing and traveling farther than they thought they should, which "led to many middle of the night panic attacks." It turned out the communications company had rounded the speed of light, a cultural mismatch that led to a lot of mistakes. Tompkins said this experience applies to how we interpret facts.
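The scale of the mismatch in that anecdote is easy to quantify. Here is a minimal sketch, using an illustrative 100-microsecond propagation time (not a figure from the event), of how a rounded speed of light inflates a path length computed from signal travel time:

```python
# Sketch: error introduced by rounding the speed of light when
# converting signal travel time to distance (illustrative numbers).

C_TRUE = 299_792_458.0   # speed of light in vacuum, m/s (exact by definition)
C_ROUNDED = 3.0e8        # the commonly rounded value, m/s

travel_time = 100e-6     # hypothetical 100-microsecond propagation time

d_true = C_TRUE * travel_time       # ~29_979.25 m
d_rounded = C_ROUNDED * travel_time # 30_000.00 m
error_m = d_rounded - d_true        # ~20.75 m of spurious path length

# The relative error is ~0.069% per unit distance, which compounds
# into meter-scale discrepancies exactly where multipath analysis
# cares about meter-scale geometry.
rel_error = (C_ROUNDED - C_TRUE) / C_TRUE
```

Roughly 21 meters of phantom path length over a 30-kilometer hop is more than enough to make reflected signals appear to travel "longer than they should."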
David Redl, of Salt Point Strategies, moderated the first policy making panel.