Better Broadband & Better Lives

Tag archive

Rob Atkinson

Broadband Prioritization Not Such A Terrible Idea, Georgetown University Panelists Say

in Broadband's Impact/FCC/National Broadband Plan by

WASHINGTON, June 17, 2014 – Paid prioritization may not be such a bad idea – in fact, the notion that the entire internet needs to be treated equally is misleading, said panelists at a June 10 symposium on internet regulation.

Internet users might have greater interest in seeing phone packets go through first than someone downloading an episode of “Game of Thrones,” to take a hypothetical example, said Rob Atkinson, president of the Information Technology and Innovation Foundation.

“I have a deep interest in my Skype conversation going reasonably well and I really am indifferent to whether my email gets delivered 27 milliseconds late, unless you’re always looking at your email…even then, 27 milliseconds isn’t bad,” said Atkinson.

Similarly, a highly competitive multiplayer video game might give a company good reasons to pay for prioritization, Atkinson said.

It’s a “game of gigs,” said Aspen Institute fellow Blair Levin in reference to the broadband landscape. And much like “Game of Thrones,” Levin said, “not everyone’s going to survive.”

“Network providers block many, many things. They block malware,” Atkinson said. “If they didn’t, all computers would be infected. The question is, what kind of blocking?”

Core values are needed, said AT&T’s vice president of public policy Brent Olson: reliability, consumer protection, competition, and public safety. While the panelists all shared this sentiment, none had a definite answer to what the proper regulatory framework should be at this time.

While ISPs should be prohibited from punishing certain consumers, Patrick Gilmore, chief technology officer of Markley Group, said that there was a big misconception about “fast lanes” and “slow lanes”: They don’t actually exist.

“I like to think of it more as a freeway ramp with too many cars, where you let some of them cut in line. There’s not two pipes. There’s actually just one pipe with a queue,” Gilmore said.
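Gilmore’s “one pipe with a queue” picture can be sketched in a few lines of code. The following is a minimal illustrative Python model, not anything the panelists presented: all packets share one queue, and paid prioritization simply lets some packets move ahead of best-effort traffic rather than travel in a second pipe.

```python
import heapq

class SinglePipe:
    """One shared queue; prioritized packets 'cut in line' rather than use a second pipe."""

    def __init__(self):
        self._queue = []
        self._counter = 0  # preserves arrival order within a priority class

    def enqueue(self, packet, prioritized=False):
        # Lower priority number is dequeued sooner; ties break by arrival order.
        priority = 0 if prioritized else 1
        heapq.heappush(self._queue, (priority, self._counter, packet))
        self._counter += 1

    def dequeue(self):
        return heapq.heappop(self._queue)[2]

pipe = SinglePipe()
pipe.enqueue("email")
pipe.enqueue("video-chat", prioritized=True)
pipe.enqueue("file-download")
order = [pipe.dequeue() for _ in range(3)]
print(order)  # the prioritized packet jumps the queue; the rest keep arrival order
```

This echoes Atkinson’s point as well: the email still arrives, just a moment later than the latency-sensitive traffic.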

ISPs have no real incentive to block traffic for some users and content providers, said Olson. The only provider that has tried that in the last decade was a small telecommunications company called Madison River, he said, and it’s now bankrupt.

Jeffrey Campbell, vice president of global government affairs for the Americas at Cisco Systems, expressed his fear that a major regulatory change, such as reclassifying internet service as a Title II public utility, would bring about rules that make no sense in the current technology marketplace.

If regulatory bodies are going to interfere with broadband at all, the best route is the one that causes the least harm, Campbell said. That route would be the much more modest approach of Section 706, which was added to the Communications Act in 1996.

“I’m a real believer in what I call Sword of Damocles regulation – it’s the sword that’s going to come down and get [people] if they do something wrong,” Campbell said. “You can write whatever you want in the rules, but at the end of the day, if a large national provider does something that just feels wrong, smells wrong, is wrong – whether it’s against the rules or not – it will stop.”

Controversial practices have always ended prematurely, Campbell said. They were halted without rules, he said, because the public didn’t like what was going on.

“As long as we have transparency and sunlight, any problems that arise will be cleansed no matter what regulatory regime we have,” said Campbell.

Gilmore disagreed. Although it is true that some bad actors change when exposed, there are also bad actors whose actions never see the light in the first place, he said.

“Comcast got caught…making sure that unless you paid them for peering, your traffic would never get through, [at least] not reliably,” he said. “Comcast in many ways is a monopoly. When I sit at home, I have a choice between 1 Megabit per second (Mbps) Verizon and 100 Mbps Comcast. Guess which one I use? Guess how much public outcry there was about Comcast saying, ‘oh, my transit is kind of congested.’ There was none.”

The reason it is not being corrected, Gilmore said, is that consumers have nowhere else to turn. Switching to the 1 Mbps Verizon network is not a viable option. Ultimately, even Gilmore said he is “scared about government regulation on the internet” because of the damage it could cause ISPs by requiring interconnection.

Campbell said Internet traffic is projected to triple by 2018. He said letting “suits” in Washington decide the fate of the web might be misguided.

Statement of Information Technology and Innovation Foundation on Julius Genachowski’s Departure

in FCC by

From Rob Atkinson, President of ITIF:

We commend Chairman Genachowski for his leading role in spurring tremendous advances in broadband innovation and the Internet economy, while assisting the United States in transitioning to an advanced wireless world. During his tenure, the U.S. has made major advances in the speed of our broadband networks and expanded the deployment of next generation broadband technology. As ITIF has recently shown, 82 percent of American homes are now passed by a broadband network with speeds of 100 Mbps or higher, over 96 percent of the U.S. now has access to wired broadband, and we lead the world in adoption of 4G LTE mobile broadband.

These accomplishments are due in part to the FCC’s efforts to develop the National Broadband Plan, which establishes a road map for expanding innovation and deployment of high speed technology while enhancing the use of advanced networking in education, healthcare, homeland security and numerous additional industries. In addition, Chairman Genachowski has been a leader in addressing the digital divide – the major factor holding back America’s digital progress – creating innovative educational partnerships such as Connect2Compete, which seeks to enhance broadband adoption and digital literacy.

Under the Chairman’s leadership the FCC has also been a leader in expanding spectrum access through the first of its kind incentive auctions, which have the potential to free up large amounts of valuable spectrum for wireless broadband. Chairman Genachowski has also successfully navigated the thorny issue of net neutrality, adopting a “third way” approach that addresses open access concerns without inhibiting continued Internet innovation.

We praise the Chairman for his efforts and hope that his successor takes the same reasoned and innovative approach to these important issues that have tremendous implications for the continued health of our growing Internet environment.

ITIF Panel Debates Government Intervention in Broadband Networks

in Broadband's Impact by

Updated, June 2, 3:55 p.m. EDT: This article has been corrected with respect to comments by Jim Baller.

WASHINGTON, June 2, 2011 – The Information Technology & Innovation Foundation assembled a panel Wednesday to debate the merits and minuses of government support of broadband networks.

The Oxford-style debate, “Governments Should Neither Subsidize nor Operate Broadband Networks to Compete with Commercial Ones,” included four panelists. Rob Atkinson, President of ITIF, and Jeff Eisenach, Managing Director and Principal of Navigant Economics, argued against government intervention; Jim Baller, President of Baller Herbst Law Group, and Chris Mitchell, Director of Telecommunications at the Institute for Local Self-Reliance, argued in favor.

The arguments against government intervention centered largely on the harms to private-sector competition and the duplicative nature of government-sponsored networks that compete with private ones. The pro-government-sponsorship side argued that there are significant market failures in the sector and that governments, especially local ones, are in a unique position to assess the overall needs of the community without deference to profit models.

Rob Atkinson, President of ITIF, steered the beginning of the debate to the problem of government overbuilding – that is, where a government broadband network is built where another network already exists.

“[Overbuilding] takes money out of the broadband ecosystem that could be used to bring prices down for the rest of us or improve [capital expenditures into other networks] for the rest of us,” said Atkinson.

Moreover, he said, governments that overbuild networks tend to impede competition rather than foster it. Because profit margins in the industry do not approach 50–60 percent, the addition of another competitor in the field will raise the cost per consumer to network providers, thereby eventually raising costs to the consumer.

Mitchell, however, asserted that fundamental flaws in the marketplace necessitate government intervention to supply needed infrastructure.

“Market competition has failed,” said Mitchell. “Americans pay too much for too little compared to peer nations.”

Mitchell also questioned whether markets are truly in competition with each other, given significant variations in service quality and speed.

“The question [of whether networks compete] is vague,” he said, “is fiber really duplicative of a DSL network?”

Following his fellow panelist’s point, Baller noted that in several instances, municipalities unsuccessfully attempted to convince network providers to lay networks before building their own.

“Community leaders have almost always gone to incumbents and pleaded with them to provide networks first,” he said, pointing to the success of Ft. Wayne, Indiana as a rare instance in which a municipality convinced a provider to lay a high-speed fiber network. In many other instances, said Baller, communities have been unable to convince established carriers like Verizon to connect them to fiber networks.

Though municipalities may have a unique eye for the needs of the community, government-supported networks still cost money and will always drag unwilling participants along with them, Eisenach pointed out.

“I want to challenge [the premise] that money is free,” said Eisenach. “When you levy a tax to support a broadband network, you are taking money from people, many of whom don’t want to give it.”

A video of the ITIF Debate, “Government Should Neither Subsidize Nor Operate Broadband Networks That Compete with Private Ones” can be found here.

House Subcommittee Examines Digital Goods Tax Bill

in Congress/House of Representatives/The Innovation Economy by

WASHINGTON, May 24, 2011 – The Subcommittee on Courts, Commercial and Administrative Law held a hearing Monday on a bill that would define limits on taxes for virtual goods and services on the Internet.

Rep. Lamar Smith (R-TX) sponsored the proposed legislation, known as the “Digital Goods and Services Tax Fairness Act of 2011.” The measure aims to “promote neutrality, simplicity, and fairness in the taxation of digital goods and digital services.” The legislation would restrict taxing authority to the jurisdiction of the customer’s tax address.

Witnesses Robert Atkinson, President of the D.C.-based Information Technology & Innovation Foundation (ITIF), and James Eads, Jr., Director of Public Affairs for Ryan, LLC, supported the proposed legislation.

“By creating a fairer and more consistent tax system for digital goods, this legislation will promote and sustain our growing digital economy,” said Atkinson during the hearing.

Eads emphasized the need for certainty regarding taxation and digital commerce among consumers, businesses, Internet service providers, and state and local taxing authorities.

“Complexities that arise and transcend state boundaries cry out for a solution,” said Mr. Eads.

Russ Brubaker, National Tax Policy Advisor with the Washington State Department of Revenue, opposed the bill, expressing concern that it would create the very unfairness it aimed to prevent and would limit local taxing activity.

“This Act prohibits or preempts perfectly legitimate state taxing authority,” said Mr. Brubaker.

Again on Leading Edge of Debate About National Broadband Plan

in Broadband Calendar/Broadband's Impact/National Broadband Plan/Wireless by

WASHINGTON, March 11, 2011 – The internet and intellectual property policy news and events service will hold its March 2011 Broadband Breakfast Club event, “The National Broadband Plan: A One-Year Update,” on Tuesday, March 15, 2011.

In hosting the event, the service continues its tradition of being on the leading edge of the broadband and intellectual property debates by identifying and zeroing in on the key issues of debate and dialogue in Washington.

The Broadband Breakfast Club debate will feature one of the key architects of the National Broadband Plan (John Erik Garr), together with one of America’s foremost thinkers on innovation economics, an expert on disruptive voice-over-internet-protocol technology, and a leading rural broadband stimulus evaluator.

Moderating and encouraging the discussion will be Drew Clark, the founder of – and now the Executive Director of the Partnership for a Connected Illinois, or, which is the State Broadband Data and Development designee. Drew will reflect on Illinois’ efforts to implement the National Broadband Plan on the state level.

The National Broadband Plan, released March 16, 2010, provided a comprehensive inventory of broadband resources, collected the most current broadband data, and offered ambitious — yet reachable — goals. And the plan did this without being fiscally imprudent. Now, one year later, what has transpired as a result and what opportunities does the plan provide for future action?

This special Broadband Breakfast Club event will focus on the plan’s three parts: (1) promoting investment and innovation, (2) including all Americans in the digital economy through availability and adoption, and (3) promoting “national purposes.”

Panelists for the event are the following:

Robert D. Atkinson, President, The Information Technology and Innovation Foundation (ITIF)

Dr. Robert D. Atkinson is one of the country’s foremost thinkers on innovation economics and is the founder and president of the Information Technology and Innovation Foundation (ITIF), a cutting-edge technology and economic policy think tank based in Washington, DC. With an extensive background in technology policy, he has conducted ground-breaking research projects on technology and innovation, is a valued adviser to state and national policy makers, and is a popular speaker on innovation policy nationally and internationally. He is the author of the State New Economy Index series and the book The Past and Future of America’s Economy: Long Waves of Innovation That Power Cycles of Growth (Edward Elgar, 2005). Ars Technica listed Atkinson as one of 2009’s tech policy People to Watch.

Daniel Berninger, Independent Communication Architect and Analyst

Daniel Berninger is an expert in technical and regulatory aspects of Internet-enabled disruptive communications and has been active in VoIP since 1995. Daniel’s work as a communication architect started with the original assessment of VoIP at Bell Laboratories and technical contributions to the founding of Free World Dialup, and continued with the first VoIP deployments at Verizon, HP, and NASA after he joined VocalTec Communications. He won a VON Pioneer Award as co-founder of the VON Coalition. Daniel led the founding teams, created the business models, and recruited the CEOs for ITXC (Tom Evslin) and Vonage (Jeffrey Citron).

John Erik Garr, Principal, Diamond Advisory Services (Diamond)

John Erik Garr is a Principal in Diamond Advisory Services (Diamond), a federal government consulting practice advising some of the world’s top companies and agencies on strategic and operational issues. His areas of expertise include strategic planning, econometrics and statistics, sourcing strategy, IT assessment and governance, organizational design, and market development. Erik led a mission review of a key public private partnership engaged in advocating for federal research funding on behalf of Illinois labs, universities, and businesses. Erik was named a 2004 Marshall Memorial Fellow, one of 55 emerging leaders selected from around the United States to participate in a high-level exchange program with European governments.

Keith Montgomery, Senior Program Director, Broadband, ICF International

Mr. Montgomery is the Senior Program Director of the ICF Broadband Group. Prior to ICF, he was Executive Officer for iTown Communications, CLEAR Communications in New Zealand and Concert Communications, and held senior management positions for MCI’s network construction and revenue finance teams. He developed the West Virginia First Advanced Broadband program to create broadband communities. He has completed broadband fiber-to-the-home and WiMAX network studies in over 15 states and several international ventures. In support of the USDA Broadband Initiatives Program, he managed the finance and engineering application review process.  Mr. Montgomery currently leads a team that evaluates broadband networks and helps groups leverage broadband technology to deliver services.

The event will take place at Clyde’s of Gallery Place, 707 7th St. NW, Washington, DC 20001, from 8 a.m. to 10 a.m. American and Continental breakfasts are included. The program begins shortly after 8:30 a.m.

Tickets to the event are $45.00 plus a small online fee. Registration is available at The Broadband Breakfast Club schedule can be viewed at

The Broadband Breakfast Club is sponsored by The National Cable & Telecommunications Association (NCTA), U.S. Telecom and the Telecommunications Industry Association (TIA).

The Tuesday morning event kicks off a more-than-a-week-long series of events in Washington marking the anniversary of the National Broadband Plan. On Wednesday, March 16, ITIF hosts a forum on the “national purposes” of the NBP. The Columbia Institute for Tele-Information (CITI) and Georgetown University’s Communication Culture and Technology Program host an event on the plan on Friday, March 18.

Earlier in March, at the Intellectual Property Breakfast Club, panelists commented on the patent reform legislation. The discussion occurred one day after the Senate voted to cut off debate and move toward a vote on changes to patent law, and the same day that the measure wound its way toward a final vote.

The Intellectual Property Breakfast Club meets on the second Tuesday of each month. Registration for the April event can be found at

The Intellectual Property Breakfast Club schedule can be viewed at

For More Information Contact:
Sylvia Syracuse
Director of Marketing and Events




Illinois and the National Broadband Map… Make That a Mashup!

in Broadband Data/Broadband Stimulus/Broadband's Impact/Expert Opinion/National Broadband Plan/NTIA by

SPRINGFIELD, February 21, 2011 – President Abraham Lincoln began his political career here with a passionate interest in infrastructure improvements.

America knows President Lincoln today because of his belief in equal opportunity. What connects what we know about Lincoln with what brought him into public life?

“Lincoln knew firsthand the deprivations, the marginal livelihood of the subsistence farmer unable to bring produce to market without dependable roads,” writes historian Doris Kearns Goodwin. “Primitive roads, clogged waterways, lack of rail connections, inadequate schools — such were not merely issues to Lincoln, but hurdles he had worked all his life to overcome in order to earn an ampler share of freedom.”

Today’s “internal improvements” aren’t about canals or railroads, but about an information superhighway — one that needs to run through all the towns, villages and boroughs of our united nation.

These improvements are, in a word, about broadband.

Under the American Recovery and Reinvestment Act of February 2009, our nation’s communications agencies were charged with investing in broadband infrastructure, devising a plan to encourage improved utilization — and mapping it. And last Thursday, February 17, marked a new day for broadband data collection with the release of the National Broadband Map (NBM), at

Commentary has only just begun to emerge about this significant map. It will take some time to digest the sheer volume of more than 25 million items of data.

Three of the most important aspects about the national broadband map are: carrier confidentiality; measuring speeds and prices; and matching data about supply with data about demand.

Putting Broadband on the Map

The most important single fact about the NBM is that it includes the identities of broadband providers, on a Census block-by-Census block basis. That’s pretty important. Imagine a list of airline flight reservations without knowing which company’s plane you’d be flying on.

Broadband carriers have resisted this disclosure for years. On the one hand, this resistance is mysterious, given that consumers know who sells them broadband. Plus, wireless companies are known for waving their colored coverage maps in television ads. On the other hand, carriers have an instinctive reaction against having their service areas directly compared against, and exposed to, their competitors.

But the time for these arguments has now passed. Consider how far we have come. In September 2006, while at the non-profit Center for Public Integrity, we filed a Freedom of Information Act lawsuit to obtain the Federal Communications Commission’s Form 477 database, which contains the names of carriers at the ZIP code level. An average of 7,750 people live in a single ZIP code. In other words, ZIP codes are not that granular. Census block information isn’t address-level information, but it’s a lot closer to addresses. An average of 38.75 people live within a single Census block. That gives a fairly decent representation of whether service is truly available to the consumer who inquires.
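The granularity comparison above can be checked with back-of-the-envelope arithmetic. The unit counts in this sketch are rounded assumptions (roughly 310 million people, about 40,000 ZIP codes, and about 8 million Census blocks), not figures stated in the commentary:

```python
# Back-of-the-envelope check of the granularity figures above.
# These counts are rounded assumptions, not official statistics.
US_POPULATION = 310_000_000
ZIP_CODES = 40_000
CENSUS_BLOCKS = 8_000_000

people_per_zip = US_POPULATION / ZIP_CODES        # average residents per ZIP code
people_per_block = US_POPULATION / CENSUS_BLOCKS  # average residents per Census block

print(f"Average people per ZIP code:     {people_per_zip:,.0f}")
print(f"Average people per Census block: {people_per_block:,.2f}")
print(f"Census blocks are ~{people_per_zip / people_per_block:,.0f}x more granular")
```

Under those assumptions, the averages come out to 7,750 and 38.75, matching the figures in the text: a roughly 200-fold improvement in granularity.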

We lost the lawsuit over ZIP code broadband data, but carrier disclosure ultimately was accepted by the National Telecommunications and Information Administration. Initially, the NTIA’s July 1, 2009 rules said that carrier information, even at the Census block level, would be confidential. Fortunately, it quickly changed its mind. On August 7, 2009, the NTIA declared in the Federal Register that “a service provider’s footprint will likewise no longer be included in the definition of confidential information.”

Entities funded by the NTIA must collect and submit this carrier information to NTIA. And Census block-level carrier data is now published on the National Broadband Map. It’s also available to State Broadband Data and Development entities – like Broadband Illinois – and is visible on our map at Perhaps more significantly, this data has been released via Application Programming Interfaces to software developers. It is public data, free for any conceivable use and reuse.

Speeds and Prices

The NBM has detailed data about which carriers offer service where. It allows the user to distinguish cable modem service and DSL service from wireless offerings. It renders displays with a degree of analytical capability previously lacking. There are also great visualizations of broadband based upon particular demographic information.

Two complaints are being made about NBM, Version 1.0. The first concerns the accuracy of the data. The second is the lack of price data and of actual speed test data.

With regard to the accuracy of the data, it’s important to think about what is now being seen, for the first time: carriers claim that they offer broadband in areas that they may not actually serve. Until we had the NBM, which identifies these carriers in their Census blocks, this fact was hidden. But now that carrier footprints are publicly available, a public verification process may begin. This is the work that all of us are engaged in.

Including actual speed test results and prices will also be vital. Again, knowing the carrier is the key to unlocking the usefulness of this data. I launched in January of 2008 in order to “crowdsource” broadband data across multiple dimensions. We called this the Broadband SPARC — for speeds, prices, availability, reliability and competition. These elements are necessary components for understanding, and ranking, the economic and broadband vitality of regions, counties, and Census sub-units.

Crowdsourcing is difficult to get started. But it will ultimately be more useful than carrier data. Think of it as the difference between a regular road map, for example, and the traffic maps that you can click on and off using Google Maps. The web site of Broadband Illinois includes a broadband speed test component that is collecting carriers’ actual speeds, with consumer ratings. We are also collecting and publishing consumers’ monthly prices. Having this information will aid everyone’s quest to understand the health of our broadband networks.
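Crowdsourced records like those described above can be summarized per carrier along SPARC-style dimensions. The sketch below is purely illustrative: the carrier names, speeds, and ratings are made up, and this is not code from the Broadband Illinois site.

```python
from collections import defaultdict
from statistics import mean, median

# Hypothetical crowd-sourced speed-test records: (carrier, measured Mbps, 1-5 rating).
# All values are invented for illustration.
reports = [
    ("Carrier A", 92.0, 4), ("Carrier A", 88.5, 5), ("Carrier A", 95.2, 4),
    ("Carrier B", 1.1, 2),  ("Carrier B", 0.9, 1),
]

# Group the raw reports by carrier.
by_carrier = defaultdict(lambda: {"speeds": [], "ratings": []})
for carrier, mbps, rating in reports:
    by_carrier[carrier]["speeds"].append(mbps)
    by_carrier[carrier]["ratings"].append(rating)

# Summarize each carrier: median measured speed, average consumer rating, sample count.
summary = {
    carrier: {
        "median_mbps": median(data["speeds"]),
        "avg_rating": round(mean(data["ratings"]), 2),
        "samples": len(data["speeds"]),
    }
    for carrier, data in by_carrier.items()
}

for carrier, stats in summary.items():
    print(carrier, stats)
```

Medians rather than means are used for speeds because crowdsourced measurements tend to include outliers (congested hours, Wi-Fi bottlenecks); a sample count per carrier also signals how trustworthy each summary is.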

Supply and Demand

The next step for the NBM is not as a map. Ultimately, the NBM needs to stand for the National Broadband Mashup. That means that it should provide data and functionality from multiple sources, creating new services in the process.

Two experiences illustrate the point. Before the opportunity arose for me to move to the Land of Lincoln and lead the Partnership for a Connected Illinois (also known as Broadband Illinois), Broadband Census created its own beta map of the state of broadband, including carriers, technologies and advertised speeds, in Columbia, South Carolina. We did this without any access to carrier-supplied data. In other words, there are lots of data sources that can be combined in a common-sense fashion and generate useful results.

The other experience concerns the U.S. Broadband Coalition, a large and diverse coalition whose formation presaged the National Broadband Plan. The group offered many recommendations that were included in the FCC’s plan. I had the opportunity to co-chair the Metrics Working Group with Rob Atkinson of the Information Technology and Innovation Foundation. Our group generated a surprising amount of consensus around a series of recommendations. One of these recommendations was the creation of a National Broadband Data Warehouse. The stated goal was to assemble as much raw data, from as many sources as possible, into a repository from which mashups and analyses can be performed.

One of the most vital of these sources of new broadband information will be about the ways and areas in which broadband could be effectively used. Basic carrier information is now available about the “supply” of broadband. This data will be refined, verified and checked through crowdsourcing, and through comparisons with other public sources. Now is time to ensure that information about broadband “demand” is also being collected and imported into national and state broadband warehouses.

All told, the United States spends more than $8 billion a year on various forms of statistics. Much of that goes to fund the U.S. Census Bureau and data collection about agricultural and labor markets, such as the monthly unemployment report. These are important. But remember that we are no longer fundamentally an agriculture- or even labor-based economy. Whether for rural or urban areas – and Illinois has both in abundance – the pathway to opportunity today runs through super-high-speed internet connectivity.

Today, that’s something that Abraham Lincoln would surely appreciate.

This Expert Opinion commentary originally appeared on Broadband Illinois at on Monday, February 21, 2011.

Mignon Clyburn Expects FCC Universal Service Fund Proposals by Year-End

in Broadband Data/FCC/Universal Service/Wireless by

ARLINGTON, Va., October 2, 2010 – Federal Communications Commissioner Mignon Clyburn emphasized the need for quality research in policy making, particularly with regard to reforming the universal service fund for telephone and internet connectivity.

Speaking at the Friday evening dinner session at the Telecommunications Policy Research Conference, a top telecom research conference here, Clyburn also said that she expected the FCC to propose changes to the USF system, and to propose funding for universal broadband, by the end of 2010.

Earlier on Friday at TPRC, the conference began with a panel examining broadband plans around the globe. The panel included officials from developed nations – the United States, Canada, Australia, Singapore and the European Union – and from the developing nations of India and Brazil.

The problem common to both groups was determining the value of broadband to the overall economy, panelists said.

They said it was simple to determine the direct value based on construction of broadband networks, but the longer-term value to the economy was difficult to monetize.

Developing nations face this analytical problem when contemplating whether to invest in broadband or other more traditional resources such as hospitals or schools. Panelists said that the developed nations, by contrast, are more concerned with maximizing the value of government investment.

The challenges of broadband deployment differ by population density, geography and the government’s proclivity to intervene in the marketplace. Even different types of capitalism change the willingness of private industry to invest, said Rob Atkinson, President of the Information Technology and Innovation Foundation, who compared Japan’s longer-term focus to U.S. firms’ greater focus on the short term.

The largest problem faced by developing nations was determining which type of broadband service to deploy (i.e. wired or wireless), and of finding the necessary funding. While most of the world accesses the internet via a computer, the penetration of mobile phones in India is so high that many regulators are beginning to pay greater attention to questions of mobile broadband.

The TPRC conference, currently in its 38th year, continues on Saturday and Sunday.

Panel Tackles Prickly Issue of FCC Regulation

in Broadband Stimulus/Broadband Updates/Broadband's Impact/FCC/National Broadband Plan/Net Neutrality by

WASHINGTON, March 9, 2010 – Public Knowledge, Silicon Flatirons and the Information Technology and Innovation Foundation last week sponsored a half-day conference to discuss the Federal Communications Commission and its efforts in reform, regulatory responsibility and standard setting.

The panel “Regulatory Reform: Standard Setting and Mediating Institutions,” moderated by ITIF President Rob Atkinson, took a philosophical approach to regulatory responsibility, how to frame certain problems, and where the FCC should regulate, co-regulate or self-regulate.

Examining the Internet Ecosystem
Pierre DeVries, Silicon Flatirons’ senior adjunct fellow at the University of Colorado, started the panel by introducing the idea of an internet ecosystem. He questioned whether the two terms really fit together and why they are used so much. He compared the question of whether the internet is an ecosystem to that of whether a whale is an elephant. They clearly are not the same, but “if you want to know more about a whale, knowing a little about an elephant will help because they are both very large social mammals,” he explained.

There are two major common features between the internet and ecosystems, according to DeVries. First, in both cases there is a responsibility to manage and regulate, but because they are so large it is very difficult to do so. Second, both are examples of complex adaptive systems, built up of subsystems that interact and adapt. He offered the example of the immune system: people can make changes in lifestyle or diet that will help the immune system, but it does most of the work on its own.

DeVries asked what to do with these complex adaptive systems in order to regulate them. He asserted that it is important to rid ourselves of the “illusion of one right answer…it is not applicable to complex reality.” He believes that a physicist’s or economist’s approach to complex systems would look for an answer at the efficient minimum, which is unstable.

By using an ecosystem management approach, we look for the point of resistance where there will be crashes and booms but they will stay within certain bounds and return toward the middle. The principles behind ecosystem management involve being flexible, delegating responsibility and preserving diversity.

To apply this theory to the panel, DeVries said the challenge in organizing regulatory institutions is “how do you keep learning, how do you learn as you go…We are seeing that principles are more appropriate than rules.”

Is Self-Regulation a Dirty Word?
Rick Whitt, Google’s Washington telecom and media counsel, has written on new growth thinking among policy makers and on adaptive policymaking. He said a key problem for policy makers is determining which framework and tools to employ in different situations. Policy makers often overlook tools they can use to achieve their agenda, especially those that rest outside the agency, he said.

Whitt added that while self-regulation has a dirty connotation, co-regulation means having a government backstop while offloading complex technical issues to experts in engineering and other bodies. Without any standards there will be maximum uncertainty as players try to find out appropriate and inappropriate ways to behave. Ideal regulation would balance adaptability, accountability and some form of enforceability, he said.

Atkinson contrasted the co-regulation and adaptive systems idea to the earlier panel that touted rigid views on openness with numerous filings and notices of proposed rule making. He asked Kathryn Brown, Verizon senior vice president, public policy and corporate responsibility, if those views are reconcilable.

Brown cited privacy issues and wireless technology compatibility as examples of how standards have been set in two very important areas without any actual rule of law or government regulation.

These solutions required a technical dialogue and an understanding of compatibility. As the industry looks at how to govern this space through the rule of law, Verizon and Google have sat down to discuss principles for governance and how to determine what would be appropriate oversight, Brown said.

Kathleen Wallman, CEO of Wallman Strategic Consulting, asked whether protecting the public interest can be done outside of government regulation.

The standard-setting process is completely chaotic, she said.

“The public’s interest is things have to work…things have to be not too expensive…we want things to be dazzling and new on a regular and cyclical basis,” she said.

Wallman explained that the current standard setting process is moving toward “ad hoc-ism,” where companies come together opportunistically to set standards for what they need to do in the near future.

This ad hoc-ism works for the first two objectives, but how to protect the public’s interest in innovation is the real question.

She added: “Maybe there are places where we don’t need standard setting, just a platform to mix it up and figure it out” and hoped that the agency figures out many of its broadband issues that way.

Innovation in a Digital Age
Atkinson then asked Paul de Sa, chief of the FCC’s Office of Strategic Planning and Policy Analysis, about the future of innovation in the digital age.

De Sa began by noting that it is very difficult for the government to get valuable information from parties that are part of a proceeding. He continued that the agency has been careful to define the problems it wants to solve before it gathers data and burdens others. In order to set standards, the agency needs to examine the ecosystem and determine the problem, then ask who should be setting the standards and how they should be set.

De Sa added that standards are important because they give users confidence and provide more information for better choices. He mentioned that internet service standards ought to be set for downstream delivery at certain speeds.

If there are no standards for these speeds, it is hard for consumers to make choices, he said. Furthermore, if infrastructure standards change constantly, it will be impossible for application developers to create new products. Standards that are too tight will constrain innovation, while too many different standards will force innovators to customize their products for each one. That will impede the ability of small players and innovators to compete with those that have greater resources.

De Sa also noted the importance of facilitating interoperability. He added that it does not have to be done through the FCC but there needs to be a format and platform for setting standards.

“We don’t want to be asleep at the switch,” he said. “It would be a mistake to pretend that interests are always aligned, and that innovators and new entrants can always compete with incumbents in terms of resources and abilities to enter markets.”

Atkinson asked which kinds of problems are best suited to which kinds of approach.

DeVries brought up standards as a notion familiar to engineers: when engineers hear “standards,” there is a defined problem and they need to decide which solution is best. He said self-regulation should be used “when there is a fair degree of homogeneity in the culture of people working on the problem.” At that point, “strong norms will lead to enforcement and therefore it is unlikely that there will be bad actors.”

“The next step up in terms of what we mean by standards is standards of behavior, norms, how different players interact and how they are going to divide up the pie,” he said. DeVries noted that while regulators traditionally thought this was their role, in a fast-paced industry it makes sense for companies to decide how they want to divide it up.

The Net Neutrality Equation
DeVries used the net neutrality debate as an example. At the engineering level, the issue is among Verizon, Comcast and AT&T as to how they run their networks and what makes up reasonable network management.

DeVries continued that the discussion is then kicked up to what he considers the commissars, who discuss how to divide up the rents. This, he believes, is where co-regulation works.

“Normally there is a conflict and a threat of a stick in the background: if you do not solve the problem, we will solve it for you,” he said.

For this part of the process to work, content providers and participants must be clear about the type of stick they want held over their heads. Regulators will then play a backstop role: if there is an intractable problem with well-demonstrated harm to a stable industry, they will need to step in and write the rules.

Brown said that DeVries’ discussion lacked the central force of the users. Applications are changing in the face of consumers’ expectations, she said, which puts pressure on network providers.

She touted the partnership between Google and Verizon on the Droid smartphone: “There are no user manuals…the users will build their own experience.” Brown did not deny that there are issues at the engineering level, but she thinks that consumers play a much larger role now.

De Sa had some issues with co-regulation. He believes it is not obvious what the in-between is in an internet ecosystem. Businesses and innovators would like to avoid Washington, he said.

“If co-regulation means lots of meetings then that inherently favors the larger player and excludes many of those without the resources,” argued de Sa.

Atkinson asked the panelists to consider disclosure of bandwidth practices and how the process would work on a co-regulation basis: Who is included? Who organizes and manages it? How do you deal with conflicts and bad actors?

Whitt wanted to clarify the notion of regulation. He said the common law process – which is basically what the agency uses – is a good idea as long as it is transparent and expeditious. Co-regulation would be useful in defining network management and transparency.

He said either the agency can come up with a standard or rule that can then be developed through the complaint process, or a broad-based group of users, developers and industry can adopt standards of what transparency means to them.

Through the use of online tools, the sharing of ideas and discussion, this co-regulation would provide a much richer environment for standard setting. The FCC would either agree with the standards or determine they went too far, acting as a stick in case communications break down.

Whitt added that these co-regulation entities could be created through advisory groups for each issue, or there could be different groups setting acceptable standards from each of the players’ perspectives. Whitt believed that this is the tough question.

Wallman worried that in creating new advisory committees, “we would be creating another meeting to go to, a new barrier to overcome.”

Brown added that with the explosion of new technologies, regulation might become a whirlwind for anyone in this space. She did not want to see a government framework imposed on the ethos of freethinking and innovation.

DeVries countered that there are non-engineering problems that need to be addressed. “What does openness mean in practice? What is allowable price discrimination and what isn’t? When you give advance notice of terms, what is advanced, what is notice and what are terms?” These all need to be addressed through some form of regulation or standard setting, he said. DeVries is not sure that industry and competitors are solving these problems every day.

In response to DeVries, Brown stated that with regard to market definition, she does not believe that academia has caught up with reality as to who the real competitors are.

In response to Brown’s comment, Whitt clarified that Verizon and Google frame the issues surrounding the notion of the internet ecosystem very differently. Google does not believe that everyone in an internet ecosystem should be treated the same and, likewise, does not believe that the internet ecosystem is truly a self-regulating system.

Brown and Whitt agreed that the lowest-common-denominator approach is not what is needed. The metric has to be: can we live with it and develop to the next level? They agreed that the government should not force parties together and that the public plays a large part in the regulatory equation.

De Sa ended by stating that he is optimistic about the productivity that will be created through Washington.

ITIF Urges Government Involvement to Speed Mobile Payments in U.S.

in National Broadband Plan/Wireless by

WASHINGTON, November 19, 2009 – Experts at the Information Technology and Innovation Foundation said Tuesday they are hopeful that mobile payment can catch on in the United States, but admitted that responsibility will fall to governments to provide the catalyst.

“The market cannot do this on its own either in the short term or in the long-term,” said ITIF President Rob Atkinson.

Mobile payment is an expanding alternative to conventional payment methods such as cash or credit card. Popular in Japan and South Korea, the system allows consumers to use their mobile phones to pay for most goods and services.

An expanded form of this system is the mobile wallet, which would link personal identification, discount cards and other information into one device. Mobile payment requires the installation of specialized near-field communication terminals to access the phone’s information. These terminals are a main obstacle to adoption in the United States.

Mobile payment adoption in the American market presents a chicken-or-egg paradox. First, business owners are unwilling to pay for installation of expensive point-of-sale terminals unless they are sure to be used.

Second, phone companies are unwilling to develop and produce mobile phones that carry a new technology with potentially limited application in the U.S.

And third, consumers are unwilling to pay for mobile payment enabled phones without a guarantee that the payment method will be universally available in the U.S.

Without one of these parties bending, progress for mobile payment technologies in the U.S. seems unlikely, said ITIF.

“In summary, mobile payments represent a critical information technology system for the U.S. economy to realize,” wrote Stephen Ezell, senior analyst with ITIF, in the report “Explaining International IT Application Leadership: Contactless Mobile Payments.”

“It is not at all clear that market forces alone will get the United States there, or produce the completely open, multifunctional system that we need, certainly not anytime soon,” Ezell wrote. “Therefore, applying lessons from the leading countries, there appears to be a strategic role for the federal government to play in facilitating and accelerating the arrival of mobile payments in the United States.”

In particular, Ezell felt that governments should actively promote deployment by requiring public transit systems to deploy contactless fare systems throughout the country. He also advocated funding for pilot programs deploying near-field communication infrastructure.

This is necessary because “American companies are more cautious than the Japanese in these kind of situations. [Public-private partnerships] are more acceptable there,” said Mark MacCarthey, adjunct professor at Georgetown University.

In response to challenging questions about the role of market competition in the process, Atkinson commented that “if competition was the key factor, we’d have [mobile payment] here already.”

MacCarthey was more optimistic about the potential success of a competition-driven market, but when asked later responded, “[competition] might sort it out. Card readers did it right by themselves.” He suggested that government-supervised discussion between the different parties might be the solution.

The ITIF report recommends the organization of an inter-government working group as well as a private-sector advisory council to introduce by mid-2010 a plan for spurring deployment of an interoperable mobile wallet.

“If we can get 1 percent of the attention that’s being devoted to broadband, we’d get contactless payment within a year,” said Atkinson.

Balancing Broadband Supply and Demand in Quest to Stoke High-Speed Internet Adoption

in Broadband's Impact/FCC Workshops/National Broadband Plan by

Christina Kirchner, Reporter-Researcher,

WASHINGTON, November 5, 2009 – Panelists at the Information Technology and Innovation Foundation on Friday agreed that price and digital literacy have created a barrier to broadband demand that can affect more than just broadband adoption.

The event was based on a report written by Robert Atkinson, president of ITIF, “Policies to Increase Broadband Adoption at Home.” The report said that while 92 to 94 percent of Americans have the opportunity to subscribe to broadband, only 65 percent have chosen to do so. The broadband penetration number comes from the widely regarded random-digit-dial surveys of the Pew Internet & American Life Project.

James Prieger, associate professor of public policy at Pepperdine University’s school of public policy, cited another barrier to adoption: the price of broadband service is just too high.

Creating subsidization programs for broadband, or lowering taxes that pertain to broadband, might be additional possibilities, he said. Prieger said that Canada had used tax credits to subsidize broadband, which could be a possibility for the United States, too.

But Prieger cautioned, “Just because you have a plan, doesn’t mean that it is going to work.”

According to panelists, another problem for broadband adoption is that consumers may not recognize that all of the pieces of technology connected to broadband are not – for now – all going to use a single platform.

According to Atkinson, “People want a general purpose device, not a single purpose device.”

For that, he said, there must be a level of digital and technological literacy among citizens.

“Digital literacy means different things to different areas,” said Laura Taylor, chief policy officer at Connected Nation. “We need to address the issues, then tackle the issues at the same time.”

Prieger interjected that digital literacy among the younger generation is not as big a barrier as it is for the older generation. Children are gaining “general digital literacy,” in part, from the presence of technology in public schools.

As for that older generation, panelists said that many do not see the importance of being connected with broadband.

One way to make consumers more aware of the impact that broadband will have on their lives is to point out how companies – and local governments – are putting job applications online and nowhere else. Online is the only way that many jobs will receive applications.

“We need to help stimulate demand,” said John Horrigan, consumer researcher for the National Broadband Taskforce with the Federal Communications Commission. “We don’t necessarily want to award ISPs [for the number of consumers they have], but award the consumers. This way seeds the adoption among users.”

Before joining the FCC’s National Broadband Taskforce, Horrigan directed the Pew Internet & American Life Project’s research into broadband.

Horrigan said that by spurring adoption, demand is stimulated, affecting supply by making broadband more affordable for consumers.

