Broadband Updates

Experts Review Reform and Standards at the FCC

WASHINGTON, March 8, 2010 – Panelists from the Federal Communications Commission, Capitol Hill, public interest groups and the private sector addressed issues of FCC reform and regulatory responsibility at “An FCC for the Internet Age: Reform and Standard-Setting,” a half-day conference sponsored by Public Knowledge, Silicon Flatirons and the Information Technology and Innovation Foundation.

Dale Hatfield of Silicon Flatirons opened the conference by stressing the need for an open, transparent process to encourage investment. Regulatory risk is undesirable, he said, but regulation is critical to our fundamental belief in government. He added that while it is important to protect investors, we run the danger of too much or too little regulation, with bad effects at either extreme.

“We need the right tool to stay closer to the optimum,” he said.

Public Knowledge Director Gigi Sohn moderated the first panel titled “The Present and Future of FCC Reform.” Sohn said the FCC has been “a little broken” in the past and asked the panelists to focus on agency reform by analyzing what has been done, what needs to be done and whether Congress should step in.

With regard to the reforms already made, Mary Beth Richards, special counsel on FCC reform, said the agency’s goal is to become a model of excellence in government through openness, transparency, public input and data-driven decisions.

She stressed that the FCC has made strides in seven areas, beginning with public safety and readiness; data collection, analysis and dissemination; and systems reform (licensing, comment filing and interface commonality).

The agency also has focused on how it communicates within and outside the agency, and Richards said there have been great strides in the area of social media. Additionally, the FCC has focused on its workforce and organization, its rules and procedures, and all things financial.

FCC General Counsel Austin Schlick focused on the reform of rules and processes, and highlighted three changes in that area.

“The over-arching principles are accessibility, transparency and efficiency,” he said. “Sometimes they are complementary and sometimes they are in tension, and it is our job to balance them as best we can.”

Schlick added that the first thing the agency did was to return to the model that the drafters of the Administrative Procedure Act intended, which is to provide the public with draft rules in the FCC’s Notices of Proposed Rulemaking (NPRMs) wherever possible.

As a tradeoff, this approach can lead to a loss of efficiency, he said, explaining that the agency will be using more Notices of Inquiry to gather preliminary knowledge to establish the content of draft rules.

The second change is what the agency calls the Procedures NPRM, which seeks to reform the operating procedures and rules of practice.

Schlick said the highlights of the NPRM streamline certain procedures and clear out stale items and backlogs, but perhaps most importantly they press toward a broader use of electronic filing and docketing. A big problem Schlick cited is that a number of large proceedings in the bureaus are undocketed and maintained as adjudications, making it difficult to get the comments filed in those proceedings.

Schlick highlighted a third change – the Ex Parte NPRM. The proceeding revises ex parte rules by proposing to require disclosure of every meeting addressing the merits of a proceeding, along with a summary of what was discussed. Additionally, the NPRM proposes reforms to the Sunshine Act and extends ex parte filing deadlines from one day to two to allow for more substantive filings.

Sohn turned to Matthew Hussey, who is the telecommunications legislative assistant for Sen. Olympia Snowe (R-Maine), to gauge the Hill’s reaction to the changes.

Hussey told other panelists that it seems like Sen. Jay Rockefeller (D-W. Va.), who heads the Committee on Commerce, Science and Transportation, takes FCC reform seriously.

The committee still has concerns about undue influence and the integrity of data collection, he said, later adding that it is important for the FCC to resolve its bottleneck issues because industry cannot wait. Undue delay within industry will erode potential for competition and advanced technological development, he said.

Mark Cooper, the director of research at the Consumer Federation of America, said the essence of democracy is established when the people write the rules that they want to live under.

“Change means changing the rules. Changing rules means having proceedings,” and proceedings naturally take a long time, he said.

Cooper also took issue with ex parte communications, which he believes “are an affront and insult to democracy and a denial of due process.” Certain parties are naturally much better situated to get those meetings than others. Why does the agency need everything explained to it by an army of lobbyists, he asked rhetorically. Cooper proposed that the FCC essentially abolish ex parte communications.

Nick Johnson, a former FCC commissioner and now a professor at the University of Iowa College of Law, agreed with Cooper. Johnson said all communications with commissioners should be done in writing, and that if a meeting is requested, it must occur in front of the full commission and be properly documented.

Susan Crawford, former National Economic Council member and now a law professor at the University of Michigan, said the work of the agency – especially in the area of net neutrality – is particularly exciting.

“When we see something, we make progress,” said Crawford, referring to the OpenInternet.gov website fully dedicated to that proceeding.

Addressing concerns about the ex parte procedures, Crawford said that having the members of the FCC meet more often as a commission might reduce the dependence on the ex parte system.

Schlick agreed with Crawford that to the extent that ex parte has become a substitute for other fact gathering processes, it is wrong, inefficient and not transparent.

Still, Schlick said he is a fan of the ex parte process because he has a lot of questions that are not normally addressed on the record.

He added that there is a need to lower the barriers of entry for ex parte communications and participation at the commission.

The OpenInternet proceeding took blog postings into the record, which proved controversial. Schlick added that in the ex parte NPRM they asked how they could take a construct that assumes a small professional record and apply it to everyday people on Twitter sending their thoughts to the commission.

Cooper countered: “It is hard to accept the proposition that a two-hour long dinner with the chairman is equal to a blog post.” His proposed compromise involves the use of an independent third party scribe who takes notes on and files the ex parte letter.

Sohn changed course and asked the panelists what they thought about the perception that the agency’s relationship with the White House is a little too cozy. She then asked whether the FCC should be an executive agency rather than an independent agency.

Crawford said there will always be political pressure due to appointments and congressional budget oversight, but that overall the agency does a good job of trying to remain independent. However, she cautioned that the real pressure on the agency comes from industry, not from politics.

The agency’s relationship with the telecom industry is far too centralized, she said, adding that the revolving door at the agency should be fixed.

She said FCC staff should not be able to work for the industry that they regulate. Crawford ended by referencing an article written by Kevin Murphy of Catholic Law School, who suggests that the FCC’s policy role should be taken away and given back to the administration, leaving the agency with the sole responsibility of regulating the industry. Crawford said such a split between policy and regulation at the FCC is an interesting idea.

Cooper noted that to decrease political influence, commissioners should be appointed either to life terms or to set term limits.

The realistic approach would be to limit commissioners to a single term, he said, adding that a former official’s ban on lobbying the commission should last as long as the time served at the agency.

He also said a former FCC employee should not be allowed face-to-face communications with current staff members and commissioners.

Schlick responded by clarifying that all employees are restricted from lobbying the agency on matters they worked on. Senior officials have a one-year ban on lobbying the agency.

Ethics pledge employees such as commissioners and some other senior employees have a two-year ban, he said. If they register as a lobbyist, they cannot work on the same issues they worked on at the commission.

Sohn brought up the Sunshine Act, which has been criticized as an impediment to honest decision-making. She asked whether proposals like the Stupak bill strike a balance between transparency and deliberative privilege.

That bill, named after Rep. Bart Stupak (D-Mich.), would allow more than two commissioners to meet privately at any time outside of a public meeting. However, the meetings would require the presence of a representative from the general counsel’s office, as well as a detailed transcription of the meeting.

Johnson said the Stupak bill addresses a serious problem, but he is troubled by the solution. He wants to see more deliberation between bodies and would like the fact-finding process outlined for the public. He is not persuaded that the language in the Stupak bill upholds the spirit of the APA.

Cooper wanted to bring the Sunshine discussion to the data discussion. He said that when the FCC commissions a study, it should be subject to a formal process of peer review, as stated in the guidelines offered by the Office of Management and Budget. Richards responded that in the data and systems reform area, there are many changes underway to make data more available to the public.

When asked about FCC academic studies, such as those done by the Berkman Center for Internet and Society at Harvard University, Schlick explained that the agency receives many of them as gifts.

When an audience member asked about the loss of engineering talent, Hussey asserted that Sen. Snowe has a strong interest in FCC reform on technology issues.

She has voiced concern about the reduction in engineering staff compared to the increase in complexity of technical issues. Hussey suggested that at least one commissioner should be an engineer. The senator also has introduced a bill to increase the engineering hires at the commission.

Another audience member asked why the agency does not use video conferencing to stream ex parte meetings.

Richards said it had been considered, but the agency is currently improving its own internal bandwidth. Crawford saw no difference between a full description of the meetings and streaming the meetings. Schlick, on the other hand, was firm in defending the practice: “if you stream ex parte, then it is not ex parte.”

The final question asked the panelists how the FCC is balancing the effort to increase online comment filing with the fact that so many low-income Americans do not have access to high-speed internet. Schlick responded that the question goes to the heart of the issue and of the National Broadband Plan due out this month.

As Deputy Editor, Chris Naoum is curating expert opinions, and writing and editing articles on Broadband Breakfast issue areas. Chris served as Policy Counsel for Future of Music Coalition, Legal Research Fellow for the Benton Foundation and law clerk for a media company, and previously worked as a legal clerk in the office of Federal Communications Commissioner Jonathan Adelstein. He received his B.A. from Emory University and his J.D. and M.A. in Television Radio and Film Policy from Syracuse University.

Broadband Data

U.S. Broadband Deployment and Speeds are Beating Europe’s, Says Scholar Touting ‘Facilities-based Competition’

WASHINGTON, June 10, 2014 – In spite of press reports to the contrary, U.S. broadband coverage is not falling behind European levels of service, academic Christopher Yoo said on Wednesday at the National Press Club.

“It seems like every other week there’s a new infographic or news story that talks about how the U.S. is falling behind in broadband speeds, we don’t have fiber to the home, and telecom companies are rolling in the profits while consumer prices soar,” said Doug Brake, telecommunications policy analyst with the Information Technology and Innovation Foundation, setting up the topic tackled by Yoo in his presentation.

On the contrary, said Yoo, the founding director of the Center for Technology, Innovation and Competition at the University of Pennsylvania, the U.S. led in many broadband metrics in 2011 and 2012. And, he said, it is precisely the absence of a “one size fits all” regulatory structure that has been driving technological innovation forward in the marketplace.

In other words, according to Yoo, the American approach to facilities-based competition – where cable companies and telephone companies compete through rival communications networks – has succeeded.

While the findings may be “surprising” to some, Yoo said they proved the importance of examining the best approach to broadband regulation based on “real world data.”

The notion that “fiber is the only answer” to affordable high-speed broadband is a misconception, he said. Countries emphasizing fiber over rival technologies – including Sweden and France – were among the worst broadband performers.

In the U.S., 82 percent of households received broadband at speeds of at least 25 Megabits per second (Mbps), versus 54 percent in Europe. In rural areas, the difference was even greater: 48 percent in the U.S., versus 12 percent in Europe. The five countries that did beat U.S. coverage of greater than 25 Mbps (including Denmark and the Netherlands) are compact, urbanized regions with greater population densities.

Additionally, even looking at fiber-based technologies, the U.S. is outperforming Europe, he said. Fiber coverage in the U.S. went from 17 percent in 2011 to 23 percent in 2012. In Europe, fiber coverage went from 10 percent in 2011 to 12 percent in 2012.

And, based on the measurement of telecommunications investment per household, the U.S. number is more than double that of Europe: $562 versus $244 in the old world.

And, he said, American users consumed 50 percent more bandwidth than Europeans in 2011 and 2012.

“The best measure of how much a network is really worth is how much you use it,” Yoo said. “It’s great to have a very fast car, but unless you use it, it’s not really doing very much for you.”

One area where the U.S. could see improvement is broadband adoption, Brake said. That points to a continued need to demonstrate the value of broadband to consumers.

Yoo agreed: “Availability is only a part of the question. There are plenty of people who have broadband available to them who are choosing not to adopt.”

Moderator Gerry Faulhaber added: “As regulators, we can mandate coverage, we can mandate buildout. What we can’t do is mandate people to use it.”

Keeping a series of tiered rates for broadband service is exactly what America’s broadband rollout needs, said Brake. Tiering not only encourages consumers to purchase internet service at lower introductory rates, it also efficiently places the burden on those who wish to pay more for higher-speed service, helping networks recoup their costs.

“Is it better to provide 75 to 100 Mbps to 80 to 90 percent of the population, or one Gigabit per second to 10 to 20 percent of the population?” he asked.

Blair Levin, former director of the FCC’s National Broadband Plan and now a communications and society fellow at the Aspen Institute, said that comparisons with Europe don’t change America’s objectives: to build deeper fiber, use broadband to improve the delivery of goods and services, and connect more users.

“Which activity is more productive – looking at oneself in the mirror and asking, ‘do these jeans make me look fat?’ or going to the gym? Focusing on actions that improve one’s condition is better than wondering about how one should appear relative to others,” said Levin.


Broadband Updates

Discussion of Broadband Breakfast Club Virtual Event on High-Capacity Applications and Gigabit Connectivity

WASHINGTON, September 24, 2013 – The Broadband Breakfast Club released the first video of its Broadband Breakfast Club Virtual Event, on “How High-Capacity Applications Are Driving Gigabit Connectivity.”

The dialogue featured Dr. Glenn Ricart, Chief Technology Officer, US Ignite; Sheldon Grizzle of GigTank in Chattanooga, Tennessee; Todd Marriott, Executive Director of UTOPIA, the Utah Telecommunications Open Infrastructure Agency; and Drew Clark, Chairman and Publisher, BroadbandBreakfast.com.

To register for the next Broadband Breakfast Club Virtual Event, “How Will FirstNet Improve Public Safety Communications?,” on Tuesday, October 15, 2013, at 11 a.m. ET/10 a.m. CT, please visit http://gowoa.me/i/XV8


#broadbandlive

Breakfast Club Video: ‘Gigabit and Ultra-High-Speed Networks: Where They Stand Now and How They Are Building the Future’

WASHINGTON, May 24, 2013 – Emphasizing the developing nature of broadband networks in the United States, speakers at the May 21 Broadband Breakfast Club event said that the recent achievement of ultra-high speed broadband networks has been a critical factor seeding transformative developments for organizations, individuals and communities. These developments, panelists said, were simply not possible before with slower speed networks.

Yet panelists at the event, “Becoming a Gigabit Nation: What Have We Learned About Ultra-High Speed Broadband?” also agreed that speed is not actually the most important factor in the maturing of these networks.

Successful deployment of such networks requires concerted efforts and continual upgrades involving community leadership, assessment of consumer needs and desires, infrastructure development, application development and successful assessment of usage patterns. All of these factors affect the success of such gigabit and high-speed networks, panelists said.

In other words, high-speed networks need to be developed in concert with proposed applications, which are in turn developed in the context of their communities or customer base.

As gigabit cities consultant David Sandel said, the gigabit and smart city transformation now being undertaken is 90 percent sociology and 10 percent infrastructure. Sandel, president of Sandel and Associates, works with St. Louis, Kansas City and other communities worldwide, and runs the Gigabit City Summit, a global forum of community leaders engaged in discussion of new forms of leadership for managing such networks.

Sandel said that new gigabit leadership must break out of traditional silos and engage in greater information exchange and collaboration. Less hierarchy, more inclusion and more communication facilitate the success of gigabit services and applications, he said.

What’s Happening Now

Sandel and other panelists gave examples of how 100-plus megabit per second and gigabit-level connectivity is already providing considerable benefits to cities that have it – even where the majority of a city’s consumers do not yet have needs for those levels of service.

For example, Sandel described the success of a two-mile gigabit main street in St. Louis, Missouri. This project has attracted a number of innovative businesses to the area. He said that such projects carry several benefits to an entire city, such as enabling the use of cloud services, driving up real estate values, and creating high-value jobs. In addition, the current relatively higher costs of gigabit service in communities can be partially offset by institutional and industrial uses.

Similarly, Sheldon Grizzle, founder and co-director of the Chattanooga-based GIGTANK, a technology start-up accelerator, said that the implementation of gigabit broadband by the local utility EPB has been a boon to its electrical grid. Power outages in the area have decreased by 60 percent, he said.

Grizzle said that Chattanooga, a small city of 170,000, sees itself as a good test case for gigabit networks. Its network now provides speeds of 50 Mbps to 50,000 subscribers. It also offers 1 Gbps symmetrical service (i.e., 1 Gbps upload and 1 Gbps download) for $300 a month, although the number of subscribers at that tier has been small. He attributed the relatively low demand for the gigabit offering to its high price point.

Grizzle said that GIGTANK has been recruiting application developers from around the world to build appropriate apps for the community, as Chattanooga’s gigabit network grows beyond its infancy.

Speed Issues

Notwithstanding high-profile gigabit build-outs in recent years, national broadband speeds have been steadily increasing through other means over the last several years, said Kevin McElearney, senior vice president of network engineering and technical operations for Comcast Cable.

McElearney said that, for example, Comcast has innovated on next-generation technologies every year, increasing network speeds 11 times over the last 11 years, and is now running terabit links over its backbone to allow capacity for new applications. He said that Comcast now provides up to 100 Mbps download capacity, with 70 percent of consumers electing 25 Mbps and 30 percent choosing higher-speed tiers.

McElearney said that Comcast sees the increasing use of multiple devices in households as the principal driver behind the demand for higher broadband speeds for consumers.

Application Development

William Wallace, Executive Director of U.S. Ignite, a developer of gigabit-ready digital experiences and applications, spoke of an “internet of immersive experience,” suggesting an internet experience completely different from prior experiences. Users will also be creating their own experiences, he said.

Wallace further noted that customizing network features around applications will help build in the greatest efficiencies. For example, different applications will be characterized by different speeds, security features, cloud storage locations, latencies, etc.

Scott Wallsten, vice president for research and senior fellow at the Technology Policy Institute, said that the focus on ultra-high broadband speeds is misplaced. Because internet speeds are already increasing consistently, Wallsten argued, policies focusing on speed are unnecessary. Instead, he said, greater attention should be paid to other metrics of broadband quality, such as latency and reliability.

Additionally, Wallsten stated that the government’s adoption programs should be focused on low-income inner-city non-adopters rather than rural high-speed development. He said that the Federal Communications Commission’s high cost fund portion of the Universal Service Fund has not been sufficient to pay for rural development. Instead, the best hope to help the most individuals get broadband is to focus on urban areas. Increased efficiencies in cities will offer a better chance for providers to lower costs and then expand network development in rural areas.

Sandel concluded that education is critical for successful gigabit network development, and that there should be a three-pronged approach: education for leaders on the impacts and benefits of gigabit networks and applications across all sectors, development of clear economic development models that draw lines to revenue flows, and policies for the inclusion of all populations so that everyone can participate.

