Panel Tackles Prickly Issue of FCC Regulation

WASHINGTON, March 9, 2010 – Public Knowledge, Silicon Flatirons and the Information Technology and Innovation Foundation last week sponsored a half-day conference to discuss the Federal Communications Commission and its efforts in reform, regulatory responsibility and standard setting. The second panel looked at the pros and cons of regulation, self-regulation and co-regulation.


The panel, “Regulatory Reform: Standard Setting and Mediating Institutions,” moderated by ITIF President Rob Atkinson, took a philosophical approach to regulatory responsibility, how to frame certain problems, and where the FCC should regulate, co-regulate or self-regulate.

Examining the Internet Ecosystem
Pierre DeVries, Silicon Flatirons’ senior adjunct fellow at the University of Colorado, started the panel by introducing the idea of an internet ecosystem. He questioned whether the two terms really fit together and why they are used so much. He compared the question of whether the internet is an ecosystem to that of whether a whale is an elephant. They clearly are not the same, but “if you want to know more about a whale, knowing a little about an elephant will help because they are both very large social mammals,” he explained.

The internet and ecosystems share two major features, according to DeVries. First, in both cases there is a responsibility to manage and regulate, but because both are so large it is very difficult to do so. Second, both are examples of complex adaptive systems, built up of subsystems that interact and adapt. He offered the example of the immune system: people can make changes in lifestyle or diet that help it along, but it does most of the work on its own.

DeVries then asked how these complex adaptive systems should be regulated. He asserted that it is important to rid ourselves of the “illusion of one right answer…it is not applicable to complex reality.” A physicist’s or economist’s approach to complex systems, he said, would look for an answer at the efficient minimum, which is unstable.

An ecosystem management approach instead looks for the point of resistance: there will be crashes and booms, but they stay within certain bounds and return toward the middle. The principles behind ecosystem management involve being flexible, delegating responsibility and preserving diversity.

Applying this theory to the panel’s topic, DeVries said the challenge in organizing regulatory institutions is “how do you keep learning, how do you learn as you go…We are seeing that principles are more appropriate than rules.”

Is Self-Regulation a Dirty Word?
Rick Whitt, Google’s Washington telecom and media counsel, has written on how new growth thinking affects policymakers and on adaptive policymaking. A key problem for policymakers, he said, is determining which framework and tools to employ in different situations. Policymakers often overlook tools they could use to achieve their agenda, especially those that rest outside the agency, he said.

Whitt added that while self-regulation has a dirty connotation, co-regulation means having a government backstop while offloading complex technical issues to experts in engineering and other bodies. Without any standards, players face maximum uncertainty about which ways of behaving are appropriate and which are not. Ideal regulation would balance adaptability, accountability and some form of enforceability, he said.

Atkinson contrasted the co-regulation and adaptive systems idea with the earlier panel, which touted rigid views on openness backed by numerous filings and notices of proposed rulemaking. He asked Kathryn Brown, Verizon senior vice president for public policy and corporate responsibility, whether those views are reconcilable.

Brown pointed to privacy and wireless technology compatibility as examples of how standards have been set in two very important areas without any actual rule of law or government regulation.

These solutions required a technical dialogue and an understanding of compatibility, Brown said. As the industry looks at how to govern this space through the rule of law, Verizon and Google have sat down to discuss principles for governance and what appropriate oversight would look like.

Kathleen Wallman, CEO of Wallman Strategic Consulting, asked whether the public interest can be protected outside of government regulation.

The standard-setting process is completely chaotic, she said.

“The public’s interest is things have to work…things have to be not too expensive…we want things to be dazzling and new on a regular and cyclical basis,” she said.

Wallman explained that the current standard setting process is moving toward “ad hoc-ism,” where companies come together opportunistically to set standards for what they need to do in the near future.

This ad hoc-ism works for the first two objectives, but how to protect the public’s interest in innovation is the real question.

She added: “Maybe there are places where we don’t need standard setting, just a platform to mix it up and figure it out,” and said she hoped the agency would work out many of its broadband issues that way.

Innovation in a Digital Age
Atkinson then asked Paul de Sa, chief of the FCC’s Office of Strategic Planning and Policy Analysis, about the future of innovation in the digital age.

De Sa began by noting that it is very difficult for the government to get valuable information from parties to a proceeding. He continued that the agency has been careful to define the problems it wants to solve before gathering data and burdening others. To set standards, the agency needs to examine the ecosystem, determine the problem, and then ask who should be setting the standards and how they should be set.

De Sa added that standards are important because they give users confidence and provide more information for better choices. He mentioned that downstream internet service standards ought to be set for delivery at certain speeds.

If there are no standards for these speeds, it is hard for consumers to make choices, he said. Furthermore, if infrastructure standards change constantly, it will be impossible for application developers to create new products. Standards that are too tight will constrain innovation, while a proliferation of different standards will force innovators to customize their products for each one, impeding the ability of small players and innovators to compete with those that have greater resources.

De Sa also noted the importance of facilitating interoperability. He added that it does not have to be done through the FCC but there needs to be a format and platform for setting standards.

“We don’t want to be asleep at the switch,” he said. “It would be a mistake to pretend that interests are always aligned, and that innovators and new entrants can always compete with incumbents in terms of resources and abilities to enter markets.”

Atkinson asked which kinds of problems are best suited to which kinds of approach.

DeVries noted that for engineers, a standard implies a defined problem for which they must decide on the best solution. He said self-regulation should be used “when there is a fair degree of homogeneity in the culture of people working on the problem.” At that point, “strong norms will lead to enforcement and therefore it is unlikely that there will be bad actors.”

“The next step up in terms of what we mean by standards is standards of behavior, norms, how different players interact and how they are going to divide up the pie,” he said. DeVries noted that while regulators traditionally thought this was their role, in a fast-paced industry it makes sense for companies to decide how they want to divide it up.

The Net Neutrality Equation
DeVries used the net neutrality debate as an example. At the engineering level, the issue is how Verizon, Comcast and AT&T run their networks and what constitutes reasonable network management.

DeVries continued that the discussion is then kicked up to what he called the commissars, who discuss how to divide up the rents. This, he believes, is where co-regulation works.

“Normally there is a conflict and a threat of a stick in the background: if you do not solve the problem, we will solve it for you,” he said.

For this part of the process to work, content providers and participants must be clear about the type of stick they want held over their heads. Regulators will then play a backstop role: if there is an unsolvable problem with well-demonstrated harm to a stable industry, they will need to step in and write the rules.

Brown said DeVries’ discussion overlooked the central force of users. Applications are changing in the face of consumers’ expectations, she said, which puts pressure on network providers.

She touted the partnership between Google and Verizon on the Droid smartphone: “There are no user manuals…the users will build their own experience.” Brown did not dispute that there are issues at the engineering level, but she thinks consumers now play a much larger role.

De Sa had some issues with co-regulation. It is not obvious, he believes, how to pin down what the in-between is in an internet ecosystem. Businesses and innovators would like to avoid Washington, he said.

“If co-regulation means lots of meetings then that inherently favors the larger player and excludes many of those without the resources,” argued de Sa.

Atkinson asked the panelists to consider disclosure of bandwidth practices and how the process would work on a co-regulation basis: Who is included? Who organizes and manages it? How do you deal with conflicts and bad actors?

Whitt wanted to clarify the notion of regulation. The common law process, which is basically what the agency uses, is a good idea as long as it is transparent and expeditious, he said. Co-regulation would be useful in defining network management and transparency.

He said either the agency can come up with a standard or rule that can then be developed through the complaint process, or a broad-based group of users, developers and industry can adopt standards of what transparency means to them.

Through the use of online tools, the sharing of ideas and discussion, this co-regulation would provide a much richer environment for standard setting. The FCC would either agree with the standards or determine whether they went too far, acting with a stick in case communications break down.

Whitt added that these co-regulation entities could be created through advisory groups for each issue, or there could be different groups setting acceptable standards from each of the players’ perspectives. Whitt believed that this is the tough question.

Wallman worried that in creating new advisory committees, “we would be creating another meeting to go to, a new barrier to overcome.”

Brown added that with the explosion of new technologies, regulation might become a whirlwind for anyone in this space. She did not want to see a government framework imposed on the ethos of freethinking and innovation.

DeVries countered that there are non-engineering problems that need to be addressed: “What does openness mean in practice? What is allowable price discrimination and what isn’t? When you give advance notice of terms, what is advanced, what is notice and what are terms?” These all need to be addressed through some form of regulation or standard setting, he said, and he is not sure that industry and competitors are solving these problems every day.

In response to DeVries, Brown stated that, with regard to market definition, she does not believe academia has caught up with reality as to who the real competitors are.

In response to Brown’s comment, Whitt clarified that Verizon and Google frame the issues surrounding the notion of the internet ecosystem very differently. Google does not believe that everyone in an internet ecosystem should be treated the same; likewise, it does not believe that the internet ecosystem is truly a self-regulating system.

Brown and Whitt agreed that the lowest-common-denominator approach is not what is needed; the metric has to be whether we can live with it and develop to the next level. They agreed that the government should not force parties together and that the public plays a large part in the regulatory equation.

De Sa ended by stating that he is optimistic about the productivity that will be created through Washington.

As Deputy Editor, Chris Naoum is curating expert opinions, and writing and editing articles on Broadband Breakfast issue areas. Chris served as Policy Counsel for Future of Music Coalition, Legal Research Fellow for the Benton Foundation and law clerk for a media company, and previously worked as a legal clerk in the office of Federal Communications Commissioner Jonathan Adelstein. He received his B.A. from Emory University and his J.D. and M.A. in Television Radio and Film Policy from Syracuse University.

U.S. Broadband Deployment and Speeds are Beating Europe’s, Says Scholar Touting ‘Facilities-based Competition’

WASHINGTON, June 10, 2014 – In spite of press reports to the contrary, U.S. broadband coverage is not falling behind European levels of service, academic Christopher Yoo said on Wednesday at the National Press Club.

“It seems like every other week there’s a new infographic or news story that talks about how the U.S. is falling behind in broadband speeds, we don’t have fiber to the home, and telecom companies are rolling in the profits while consumer prices soar,” said Doug Brake, telecommunications policy analyst with the Information Technology and Innovation Foundation, setting up the topic tackled by Yoo in his presentation.

On the contrary, said Yoo, the founding director of the Center for Technology, Innovation and Competition, the U.S. led in many broadband metrics in 2011 and 2012. And, he said, it is precisely the absence of a “one size fits all” regulatory structure that has been driving technological innovation forward in the marketplace.

In other words, according to Yoo, the American approach to facilities-based competition – where cable companies and telephone companies compete through rival communications networks – has succeeded.

While the findings may be “surprising” to some, Yoo said they proved the importance of examining the best approach to broadband regulation based on “real world data.”

The notion that “fiber is the only answer” to affordable high-speed broadband is a misconception, he said. Countries emphasizing fiber over rival technologies – including Sweden and France – were among the worst broadband performers.

In the U.S., 82 percent of households received broadband at speeds of at least 25 Megabits per second (Mbps), versus 54 percent in Europe. In rural areas, the difference was even greater: 48 percent in the U.S., versus 12 percent in Europe. The five countries that did beat U.S. coverage of greater than 25 Mbps (including Denmark and the Netherlands) are compact, urbanized regions with greater population densities.

Additionally, even looking at fiber-based technologies, the U.S. is outperforming Europe, he said. Fiber coverage in the U.S. went from 17 percent in 2011 to 23 percent in 2012. In Europe, fiber coverage went from 10 percent in 2011 to 12 percent in 2012.

And, based on the measurement of telecommunications investment per household, the U.S. number is more than double that of Europe: $562 versus $244 in the old world.

And, he said, American users consumed 50 percent more bandwidth than Europeans in 2011 and 2012.

“The best measure of how much a network is really worth is how much you use it,” Yoo said. “It’s great to have a very fast car, but unless you use it, it’s not really doing very much for you.”

One area where the U.S. could improve is broadband adoption, Brake said, which shows a continued need to demonstrate broadband’s value to consumers.

Yoo agreed: “Availability is only a part of the question. There are plenty of people who have broadband available to them who are choosing not to adopt.”

Moderator Gerry Faulhaber added: “As regulators, we can mandate coverage, we can mandate buildout. What we can’t do is mandate people to use it.”

Keeping a series of tiered rates for broadband service is exactly what America’s broadband rollout needs, said Brake. That not only encourages consumers to purchase internet service at lower introductory rates, it also efficiently places the burden on those who wish to pay more for higher-speed service, helping networks recoup their costs.

“Is it better to provide 75 to 100 Mbps to 80 to 90 percent of the population, or one Gigabit per second to 10 to 20 percent of the population?”

Blair Levin, former director of the FCC’s National Broadband Plan and now a communications and society fellow at the Aspen Institute, said that comparisons with Europe do not change America’s objective to build deeper fiber, use broadband to improve the delivery of goods and services, and connect more users.

“Which activity is more productive – looking at oneself in the mirror and asking, ‘do these jeans make me look fat?’ or going to the gym? Focusing on actions that improve one’s condition is better than wondering about how one should appear relative to others,” said Levin.

Discussion of Broadband Breakfast Club Virtual Event on High-Capacity Applications and Gigabit Connectivity

WASHINGTON, September 24, 2013 – The Broadband Breakfast Club released the first video of its Broadband Breakfast Club Virtual Event, on “How High-Capacity Applications Are Driving Gigabit Connectivity.”

The dialogue featured Dr. Glenn Ricart, Chief Technology Officer, US IGNITE; Sheldon Grizzle of GigTank in Chattanooga, Tennessee; Todd Marriott, Executive Director of UTOPIA, the Utah Telecommunications Open Infrastructure Agency; and Drew Clark, Chairman and Publisher, BroadbandBreakfast.com.


To register for the next Broadband Breakfast Club Virtual Event, “How Will FirstNet Improve Public Safety Communications?,” on Tuesday, October 15, 2013, at 11 a.m. ET/10 a.m. CT, please visit http://gowoa.me/i/XV8

Breakfast Club Video: ‘Gigabit and Ultra-High-Speed Networks: Where They Stand Now and How They Are Building the Future’

WASHINGTON, May 24, 2013 – Emphasizing the developing nature of broadband networks in the United States, speakers at the May 21 Broadband Breakfast Club event said that the recent achievement of ultra-high speed broadband networks has been a critical factor seeding transformative developments for organizations, individuals and communities. These developments, panelists said, were simply not possible before with slower speed networks.

Yet panelists at the event, “Becoming a Gigabit Nation: What Have We Learned About Ultra-High Speed Broadband?” also agreed that speed is not actually the most important factor in the maturing of these networks.

Event Highlights

Complete Program

Successful deployment of such networks requires concerted efforts and continual upgrades involving community leadership, assessment of consumer needs and desires, infrastructure development, application development and successful assessment of usage patterns. All of these factors affect the success of such gigabit and high-speed networks, panelists said.

In other words, high-speed networks need to be developed in concert with proposed applications, which are in turn developed in the context of their communities or customer base.

As gigabit cities consultant David Sandel said, the gigabit and smart-city transformation now being undertaken is 90 percent sociology and 10 percent infrastructure. Sandel, president of Sandel and Associates, works with St. Louis, Kansas City and other communities worldwide and runs the Gigabit City Summit, a global forum of community leaders who are engaged in discussion on new forms of leadership for managing such networks.

Sandel said that new gigabit leadership must break out of traditional silos and engage in greater information exchange and collaboration. Less hierarchy, more inclusion and more communication facilitate the success of gigabit services and applications, he said.

What’s Happening Now

Sandel and other panelists gave examples of how 100-plus megabit per second and gigabit-level connectivity is already providing considerable benefits to cities that have it – even where the majority of a city’s consumers do not yet have needs for those levels of service.

For example, Sandel described the success of a two-mile gigabit main street in St. Louis, Missouri. This project has attracted a number of innovative businesses to the area. He said that such projects carry several benefits to an entire city, such as enabling the use of cloud services, driving up real estate values, and creating high-value jobs. In addition, the current relatively higher costs of gigabit service in communities can be partially offset by institutional and industrial uses.

Similarly, Sheldon Grizzle, founder and co-director of the Chattanooga-based GIGTANK, a technology start-up accelerator, said that the implementation of gigabit broadband by the local utility EPB has been a boon to its electrical grid. Power outages in the area have decreased by 60 percent, he said.

Grizzle said that Chattanooga, a small city of 170,000, sees itself as a good test case for gigabit networks. Its network now provides speeds of 50 Mbps to 50,000 subscribers. It also offers 1 Gbps symmetrical service (i.e., 1 Gbps upload and 1 Gbps download) for $300 a month, although the number of subscribers has been small. He attributed the relatively low demand for the gigabit offering to its high price point.

Grizzle said that GIGTANK has been recruiting application developers from around the world to build appropriate apps for the community, as Chattanooga’s gigabit network grows beyond its infancy.

Speed Issues

Notwithstanding high-profile gigabit build-outs in recent years, nationally broadband speeds have been steadily increasing by other methods over the last several years, said Kevin McElearney, senior vice president of network engineering and technical operations for Comcast Cable.

McElearney said that, for example, Comcast has innovated on next-generation technologies every year, increasing network speeds 11 times over the last 11 years, and is now running terabit links over the backbone to allow capacity for new applications. He said that Comcast now provides up to 100 Mbps download capacity, with 70 percent of consumers electing 25 Mbps and 30 percent opting for higher-speed tiers.

McElearney said that Comcast sees the increasing use of multiple devices in households as the principal driver behind the demand for higher broadband speeds for consumers.

Application Development

William Wallace, Executive Director of U.S. Ignite, a developer of gigabit-ready digital experiences and applications, spoke of an “internet of immersive experience,” suggesting an internet experience completely different from prior experiences. Users will also be creating their own experiences, he said.

Wallace further noted that customization of network features around applications will help to build in the greatest efficiencies. For example, different applications will be characterized by different speeds, security features, cloud storage locations, latencies etc.

Scott Wallsten, vice president for research and senior fellow at the Technology Policy Institute, said that focus on ultra-high broadband speeds is misplaced. According to Wallsten, because internet speeds are already increasing consistently, policies focusing on speed are unnecessary. Instead, Wallsten said, greater attention should be paid to other metrics of broadband quality, such as latency and reliability.

Additionally, Wallsten stated that the government’s adoption programs should be focused on low-income inner-city non-adopters rather than rural high-speed development. He said that the Federal Communications Commission’s high cost fund portion of the Universal Service Fund has not been sufficient to pay for rural development. Instead, the best hope to help the most individuals get broadband is to focus on urban areas. Increased efficiencies in cities will offer a better chance for providers to lower costs and then expand network development in rural areas.

Sandel concluded that education is critical for successful gigabit network development and that there should be a three-pronged approach: education for leaders about the impacts and benefits of gigabit networks and applications across all sectors, development of clear economic development models that draw lines to revenue flows, and policies for inclusion of all populations so that everyone can participate.
