Better Broadband & Better Lives

Tag archive


Broadband Roundup: Senate Announces Hearings on Open Internet, While House Democrats Urge FCC to Regulate Broadband, and Popular Web Sites Protest ‘Slow Lanes’

in Net Neutrality by

WASHINGTON, September 10, 2014 – The Senate Judiciary Committee announced that it has scheduled a hearing next Wednesday on the best means of protecting an open internet. Committee Chairman Patrick Leahy, D-Vt., said he saw the hearing as an opportunity to hear testimony on the importance of a free and open internet.

Leahy and Rep. Doris Matsui, D-Calif., each sponsored legislation dubbed the “Online Competition and Consumer Choice Act” in their respective chambers, S. 2476 and H.R. 4880. Their bills would direct the Federal Communications Commission to ban “certain preferential treatment or prioritization of internet traffic.”

“Open Internet rules are the Bill of Rights for the online world,” Leahy said in a statement. “It is crucial that rules are put in place to protect consumers, online innovators, and free speech. Next week’s hearing will build on the discussion the committee started in Vermont. I look forward to hearing from a wide range of stakeholders who can speak firsthand about the impact the FCC’s decision will have on the Internet landscape.”

Significantly, S. 2476 aims to promote open internet approaches without requiring public utility regulation under Title II of the Communications Act.

Popular Websites Stage Online Protest 

On Wednesday, many internet users will come across spinning-wheel icons on their favorite websites.

Organized by the activist groups Demand Progress and Fight for the Future, websites such as Reddit, Kickstarter, Vimeo, Foursquare and WordPress are attempting to simulate for their visitors a consequence they believe will follow unless net neutrality rules stricter than those proposed by FCC Chairman Tom Wheeler in May are put in place.

The spinning site-loading icon is only symbolic, as the websites won’t actually slow their load times. Instead, sites such as BitTorrent, Etsy, Digg, Urban Dictionary and Netflix will urge their visitors to contact U.S. policymakers in support of strong net neutrality rules, according to Techhive.

IDG News Service chronicles how this slow lane protest came to be supported by advocacy groups such as the ACLU, the EFF, Engine Advocacy, the Free Press Action Fund, and Common Cause. It comes less than a week prior to the deadline for second-round comments in the FCC’s net neutrality proceedings.

Pelosi Wants Broadband Reclassified 

House Minority Leader Nancy Pelosi, D-Calif., wrote a letter to Wheeler on Tuesday describing her concern that the FCC’s current position may lead to discrimination and prioritization of certain online content.

Pelosi referenced January’s D.C. Circuit Court of Appeals decision in Verizon v. FCC. Although the ruling upheld the FCC’s authority to use Section 706 of the Telecommunications Act of 1996 as a basis for Wheeler’s current approach, Pelosi said that “the FCC should follow the court’s guidance and reclassify broadband as a Telecommunications Service under Title II of the Communications Act.”

Wheeler’s Remarks at CTIA 

Wheeler spoke to the wireless industry association CTIA on Tuesday at the group’s conference in Las Vegas. Wheeler cited the FCC’s blocking of AT&T’s acquisition of T-Mobile, as well as his opposition to Sprint’s recent attempt to acquire T-Mobile, as examples of the agency’s street credibility with broadband voters.

Motion Picture Association Tries To “Unfriend” Google In Court Fight Over Digital Copyright Law

in Copyright by

SAN FRANCISCO, February 21, 2011 — Movie industry lawyers told an appeals court late last week that it should ignore an attempt by Google to get involved in an appeal of a case that they won — even if Google is ostensibly on their side.

The Motion Picture Association of America is fighting off an appeal of a lower court decision last year that enables the movie studios to stem the flow of online movie piracy by barring the BitTorrent search engine isoHunt from hosting, indexing or linking to any Torrents of movies specified by the studios. The decision also mandates that isoHunt filter for keywords that the studios say are associated with copyright infringement.

IsoHunt filed an appeal in December in the Ninth Circuit Court of Appeals. Google swooped in early in February asking the court for permission to file a friend-of-the-court brief explaining why isoHunt should in fact be liable — but not for the reasons given by federal district court judge Stephen V. Wilson.

“Although Google asserts that the district court result should be affirmed on alternative grounds … the bulk of Google’s brief is devoted to attacking the grounds on which the plaintiffs prevailed below and plaintiffs’ legal arguments in support of the district court’s holdings,” responded the movie industry’s lawyers in a brief filed late last week with the court.

Because Google’s brief is in reality a brief opposing their position, the company has missed the filing deadline and the friend-of-the-court brief shouldn’t be allowed, argue the MPAA’s lawyers, in one of several technical arguments they raise.

For its part, Google argues that the federal district court should not have tied the issue of inducing copyright infringement to the availability of the DMCA’s safe harbor provision.

Instead, the court should have based its conclusion on basic facts showing that isoHunt failed to fulfill basic requirements of the safe harbor section of the DMCA, Google argued.

Google sees the lower court decision as a dangerous development that gives content-owners a powerful new weapon.

“Under the district court’s approach, opportunistic litigants would doubtless see such claims as an easy way to force online service providers into expensive litigation,” wrote Google’s lawyers. “But a service that demonstrates full DMCA eligibility in response to such charges should not have its safe-harbor protection left in limbo until it goes to the considerable burden and expense of defending against an inducement claim.”

The movie industry’s lawyers argue that the concern is “implausible,” and “unsubstantiated.”

A History of Network Neutrality

in Broadband's Impact/FCC/Net Neutrality by

WASHINGTON, October 11, 2010 – Network neutrality has become an increasingly contentious issue around the globe. In the United States, it grew more pressing with the rise of cable and DSL service.

With decreasing competition among internet service providers and an increasing number of network neutrality violations, the issue has garnered growing importance. Congress has attempted to protect consumers, but it has failed. Private industry firms have already begun to adopt rules that they claim will protect consumers but that avoid the critical issues. The Federal Communications Commission has voiced support but has yet to formally codify any protections.

To understand the importance of network neutrality, one must first understand its principles. While there are many variants of the definition, they all agree on some basic points: users should be able to connect to any device they wish; they should be able to run any legal application they want; and they should not have their service degraded based upon usage.

Columbia University Professor Tim Wu, who wrote about the subject in 2003, has said, “Network neutrality is best defined as a network design principle. The idea is that a maximally useful public information network aspires to treat all content, sites, and platforms equally.” For consumers, this means that they are able to use their internet connection for any purpose they see fit.

There have been two major violations of these principles. In 2004, Madison River Communications blocked the voice-over-IP (voice over internet protocol, or VoIP) service Vonage over its DSL connections. In 2007, Comcast was accused of slowing down the cable connections of customers who used BitTorrent, a file-sharing application.

The Madison River violation was resolved when the FCC intervened, and Madison River agreed to pay a fine and stop blocking access. The FCC consent decree states: “On February 11, 2005, the bureau issued a Letter of Inquiry (LOI) to Madison River, initiating an investigation. Specifically, the bureau inquired about allegations that Madison River was blocking ports used for VoIP applications, thereby affecting customers’ ability to use VoIP through one or more VoIP service providers.” To avoid future litigation costs, Madison River settled and paid the fine.

The Comcast case went to the courts when the ISP claimed that the FCC did not have the authority to stop it from blocking BitTorrent. In April 2010, the case went before the D.C. Circuit Court of Appeals, which held that the FCC had exceeded its authority in pursuing Comcast.

Shortly after this ruling, the FCC issued a notice of inquiry in May that proposed changing the classification of broadband from a Title I service to a more heavily regulated Title II service. It also included a new proposal, which the FCC chairman called the Third Way: a hybrid of Title I and Title II regulations. The Third Way gained support from a wide range of stakeholders, including some ISPs and consumer protection advocates. Democratic Sens. John Kerry of Massachusetts, Maria Cantwell of Washington, Ron Wyden of Oregon and Tom Udall of New Mexico all sent letters in support.

In June, a group of ISPs and technology companies announced the creation of the Broadband Internet Technical Advisory Group (BITAG or TAG), which would advise on and help mitigate network neutrality problems. The organization had a wide membership, including AT&T, Cisco Systems, Comcast, DISH Network, EchoStar, Google, Intel, Level 3 Communications, Microsoft, Time Warner Cable and Verizon. While many saw this as a positive step, the organization has yet to propose any solutions.

While responses to the notice of inquiry were coming in, news that the FCC had begun holding closed-door meetings with major stakeholders on network neutrality created an uproar. Many consumer protection advocates opposed the secret meetings, and they were stopped.

In August, Google and Verizon announced a joint policy statement in which they outlined their own network neutrality principles. Their statement said in part: “Users should choose what content, applications, or devices they use, since openness has been central to the explosive innovation that has made the internet a transformative medium.” However, they specifically left out wireless, saying the market was too new to require consumer protection. The omission sparked an outcry from consumer groups and others, who said consumers’ increasing use of wireless devices showed that wireless is the wave of the future and should be watched closely for consumers’ benefit.

During this period, many called upon Congress to act, claiming that the FCC did not have the necessary authority to reclassify broadband even if it wanted to. The FCC’s chairman supported this view and stated numerous times that he was willing to work with Congress to find a suitable solution. Rep. Henry Waxman, chair of the House Energy and Commerce Committee, proposed legislation that would codify many of the FCC’s principles and would also require ISPs to disclose accurate speed and pricing information. The Waxman bill covered wireless along with wireline, going further than the FCC’s original plans. Waxman was unable to get the bill through his committee due to Republican opposition.

The future of network neutrality remains unclear. However, whatever direction it takes in the policy, business or consumer arenas will affect the growth of the internet for years to come.

Appeals Court Deals Network Neutrality Blow to FCC

in FCC/National Broadband Plan/Net Neutrality by

WASHINGTON, April 6, 2010 – A federal appeals court ruled Tuesday that the Federal Communications Commission does not have the power to mandate that broadband provider Comcast must give equal treatment to Internet traffic streaming through its networks.

The ruling by the District of Columbia’s U.S. Court of Appeals is a huge victory for Comcast. The nation’s largest cable firm had challenged the FCC’s authority to impose network neutrality requirements on broadband companies.

The ruling also calls into question the FCC’s ability to implement parts of its recently released National Broadband Plan.

Comcast had challenged an FCC decision in 2008 forbidding the company from blocking its broadband subscribers from using BitTorrent, an online file-sharing technology.

In reaction to the court’s decision, Federal Communications Commission spokeswoman Jen Howard said the agency “is firmly committed to promoting an open Internet and to policies that will bring the enormous benefits of broadband to all Americans,” adding that the decision “invalidated the prior commission’s approach to preserving an open Internet. But the court in no way disagreed with the importance of preserving a free and open Internet; nor did it close the door to other methods for achieving this important end.”

Proponents of network neutrality were quick to slam the court’s decision.

Parul Desai, vice president of the Media Access Project, said: “I am disappointed in the court’s finding that the commission did not make the case for its authority to take action against Comcast’s blocking of BitTorrent….The commission must have the authority to protect all Internet users against harmful and anticompetitive conduct by Internet service providers.”

Executive Director Markham Erickson of the Open Internet Coalition agreed: “Today’s D.C. Circuit decision in Comcast creates a dangerous situation, one where the health and openness of the Internet is being held hostage by the behavior of the major telco and cable providers.”

Gigi Sohn, president and co-founder of Public Knowledge, said the court decision means there are “no protections in the law for consumers’ broadband services.”

S. Derek Turner, the research director for Free Press, said the decision forces the FCC into an “existential crisis, leaving the agency unable to protect consumers in the broadband marketplace, and unable to implement the National Broadband Plan.”

But Barbara Esbin, a senior fellow at The Progress & Freedom Foundation, applauded the decision, saying the FCC’s action against Comcast’s Internet network management practices was unlawful because Congress has not delegated to the FCC regulatory authority over the provision of Internet services.

Free State Foundation President Randolph May saw the ruling as a possible impetus for  Congress to begin a rewrite of the Communications Act “which ties the commission’s regulatory activity over broadband explicitly to evidentiary showings of abuse of substantial market power and demonstrable consumer harm.”

A comment from Comcast was not available by press time.

The Cable Pipeline Opinion: Net Neutrality’s Conundrum

in Expert Opinion/Net Neutrality by

Continued research into the net neutrality debate yields distinct realizations for regulators, consumers, and network providers alike as they ponder the heated question of whether regulation, or a hands-off approach, is sufficient to allow unfettered and equal access, and clear competition, on the broadband pipelines.

First, there continues to be something of a hysteria, a possibly preordained fear, albeit without serious incidents on record, that network providers have throttled and will continue to throttle speeds and limit their customers’ access to the copious content becoming available through the internet. Perhaps the hysteria has unfolded as a result of one BitTorrent case, or from fear of other industry debacles, as seen with banks, insurance companies, investment management companies, and Wall Street, driving the public to government as its interventionist in reining in these industries. But how realistic are these fears under the current internet model?

Regulation can hamper Broadband Access and Adoption

Increased regulation of a burgeoning internet, on the verge of delivering just the recipe the FCC is mandating, could backfire: it could hinder startup companies from materializing and growing while slowing the proliferation of new infrastructure and network upgrades. Without the freedom to invest and seek sufficient returns on investment, network providers will cut costs rather than invest for the future. This could stunt job creation, a byproduct of innovation and free-flowing investment, in an industry with broad potential to produce applications and services for the internet.

Network Management Policies will continue to improve and evolve to handle varying Traffic Needs

It is in the best interest of private network providers to offer the best network management policies for all users as they continue to build their consumer and business base. This is basic business management. A company that cannot offer the best experience for its customers, whether an internet provider or a Wal-Mart, cannot survive over the long term.

Use of Antitrust Statutes to Curtail Bad Actors

Absent a serious history of abuses within the internet pipelines, the FCC should concentrate on reining in bad actors with antitrust statutes, not regulation. Such companies would receive stiff penalties, and would certainly be brought to the forefront by customers and competitors who had been abused, disenfranchised, and denied fair treatment.

Incentives rather than Regulation

Broadband Stimulus Plan funds should be used to incent companies to build new infrastructure and upgrade their networks, realizing the adoption and access vision the FCC has been mandated to accomplish. First, detailed maps must be created to determine where infrastructure is located and where it is not. Then current telephone, cable, or wireless providers can be incented to build and upgrade their networks in rural areas to provide needed internet services. Monies will be better spent on incentives tied to quantifiable results than on regulation and mandates of an existing industry.

The Cable Pipeline has written about both sides of the net neutrality issue. It is without question a passionate and personal debate, with far-reaching implications for the lives of individuals, businesses, and public sectors alike. The FCC has been prudent in seeking comment from all stakeholders, which will hopefully produce the right results for all concerned. When the dust settles, my preference would lean toward less regulation and more incentives, thereby spurring economic growth and job creation.


Google, Verizon Laud FCC Principles, But See No Role for Agency in Internet’s Future

in Net Neutrality Comments by

WASHINGTON, January 15, 2010 – Cozying up by declaring “our businesses rely on each other,” Internet service giant Verizon and content and search company Google submitted a joint filing on Thursday to the Federal Communications Commission responding to the agency’s proposed open internet rules.

Each company filed separate comments but the duo used the rare joint filing to highlight areas where their interests intersect: “We believe that we need a policy that will ensure openness and preserve the essential character of the Internet as a global, interconnected network of networks and users that is thriving based on a common set of core values.”

The Internet must be treated as a “unique, worldwide network of networks” based on “overarching values that are embraced by all players in the eco-system to support continuing innovation and investment,” they recommended. The companies acknowledged the network’s roots in open research and cooperation between stakeholders, but cautioned these successes took place “in an environment of minimal regulation.” The Internet has become successful because of cooperation and non-interference, the filing said.

Concerns over regulations aside, the filing boldly declared: “It is essential that the Internet remains an unrestricted and open platform, where people can access the lawful content, services and applications of their choice.”

The language of the joint filing appears to be an explicit endorsement of the FCC’s Internet policy statement, which some broadband watchers have criticized for lacking any force of law – an issue currently before the courts as cable firm Comcast fights the FCC censure it received for interfering with BitTorrent traffic on its network.

An open Internet must ensure “innovation without permission,” the companies wrote. Such innovation has been a characteristic of the Internet from the beginning and strong infrastructure is a key part of preserving the Internet, they declared.

“We strongly believe that open, robust and advanced broadband networks are essential to the future development of the Internet,” they reiterated, adding that public policies should be formulated to encourage investment in new infrastructure and new technologies to achieve the nation’s broadband potential.

Users must have control of their information, and as many choices as possible, the companies wrote. That control should cover “all aspects of [consumer] Internet experience, from the networks and software they use, to the hardware they plug in…and the services they choose.” No single entity should be able to dictate consumer choice, the filing boldly declares, calling for the commission to implement policies to protect consumers’ rights.

But the FCC has no place in the future of the Internet, the companies later note: “…[C]ommunications laws and regulations should not apply to Internet applications, content, or services.” The FCC has no jurisdiction over the Internet and there is “no sound reason” to impose communications laws on the medium, they argue – instead declaring consumer disputes are best resolved by the jurisdiction of the Federal Trade Commission.

Editor’s Note: Don’t miss the Intellectual Property Breakfast Club event, “Net Neutrality, Copyright Protection and the National Broadband Plan,” on Tuesday, January 19, 2010, from 8 a.m. to 11 a.m. Register here.

Comcast vs. FCC: Implications in throttling BitTorrent

in Expert Opinion/Net Neutrality by

Comcast is appealing before a three-judge appeals court panel the FCC’s 2008 sanctions against the operator for what has become known throughout the media as its past throttling of BitTorrent, arguing the agency lacks jurisdiction under current net neutrality rules to impose them. (See FCC formally rules Comcast’s throttling of BitTorrent was illegal). This could be an important decision for ISP industry operators, who have many irons in the fire when it comes to a business model that depends on both residential and business internet customers to help pay for a broadband pipeline created with private investment.

It also has implications for consumers, who are increasingly using file-sharing applications to watch video content over their internet service provider connections, and for internet giants like Google (Nasdaq: GOOG), which depend on free access for their information-sharing business models. While Comcast (Nasdaq: CMCSA, CMCSK) has indicated its internet management practices have since changed as a result of the issue and that it no longer throttles customers, what remains is a court challenge this past week in which the court grilled the FCC on its authority to regulate ISPs under current net neutrality rules without a legislative mandate. (See Comcast Scores Against FCC in Court Battle over Net Neutrality).

The wider ramification is whether the ruling will apply to business applications, which require special and unique service agreements for much larger file sharing and higher speeds. In essence, ISPs need the flexibility to charge differing rates depending on the requirements of certain applications, which in turn allows for infrastructure investments to accommodate those needs. This is their bread and butter of profitability.

On the one hand, the FCC is under a mandate from the current administration to keep the internet free-flowing, with consumers and file-sharing applications having unfettered access; on the other, the private investors who have created the pipeline are mandated by economics to make a profit on differing consumer and business needs. If the FCC loses this current battle in court, then future challenges will likely arise against any new net neutrality rules that are adopted.

It seems from opening arguments before the courts that the FCC may have overstepped its boundaries in taking Comcast to task over BitTorrent, and may have to back up and ask Congress for a legislative mandate in regulating broadband as an information service.

FCC Commissioner Robert McDowell on C-SPAN’s ‘Communicators’

in National Broadband Plan/Net Neutrality/Wireless by

WASHINGTON, January 8, 2010 –  With about a month left until the Federal Communications Commission delivers its National Broadband Plan to Congress, Commissioner Robert McDowell spoke about the impending plan – as well as spectrum politics, Net neutrality and competition in the video media landscape – on C-SPAN’s “The Communicators.”

Commissioner McDowell began by commending the quality work of the broadband team and the numerous updates and outlines that have been presented over the past year.

The interview, with Washington Post reporter Cecilia Kang, was conducted before FCC Chairman Julius Genachowski requested a month-long extension of the February 17, 2010, deadline. McDowell said that the agency’s five commissioners would have their first look at the broadband plan on February 11. He also said that there was no requirement that the agency vote formally on the proposal.

Asked whether there would be any Republican dissent to the plan, McDowell countered that the Republicans and Democrats on the commission can dissent on any number of issues, but that will not affect the outcome of the presentation to Congress.

McDowell hopes Congress will analyze and consider all of the issues and take appropriate action to spur adoption through tax incentives or other means. Once the plan is in the hands of Congress, the commission will be able to focus on other essential spin-off issues, such as changes to the Universal Service Fund subsidy program.

When asked about the true purpose of the plan for 2010, McDowell said that while some studies claim that there is already a 95 percent penetration rate for broadband, the true question is whether “those speeds are actually fast enough and whether there is enough bandwidth for cutting edge technologies?”

He followed by explaining that cable might pass 92 percent of the country, and with an upgrade to DOCSIS 3.0, that 92 percent might actually be wired at 100 megabits per second.

McDowell said his main focus since coming to the agency has been on the construction of new delivery platforms such as fiber, wireless and satellite, which he called the only real way to address the broadband supply gap.

Kang mentioned that the Commissioner’s comments echo some of the most recent correspondence from the administration calling for the need for more competition.

She asked McDowell for his views on the competitive landscape, specifically the wireless sector, and how, given the need for competition, one can reconcile the fact that the biggest wireless providers, AT&T and Verizon, are also the biggest providers of fixed-line service.

McDowell responded that “you cannot have enough competition…since I have been at the Commission, I have looked for ways to create new competition and that obviates regulation on many levels.”

He added, “I would like to see more spectrum audits as long as we manage our expectations ahead of time.” He said it was very difficult to pinpoint, at a given time and place, who is using what spectrum for what purpose.

McDowell said he believed that we are in “The Golden Age of Wireless.” He quoted Marty Cooper, one of the most influential people in the development of the cell phone, in saying that “spectral efficiency doubles every 2.5 years and since the development of radio we are 2 trillion times more spectrally efficient.”

He also added that with the increased use of smartphones there might be a current efficiency gap, but this tension creates more incentives to use the airwaves even more efficiently.

The spectrum efficiency discussion led to the question of the use of white spaces. McDowell again commended the agency for its November 2008 decision to approve the use of unlicensed devices in unused spectrum. Kang then turned the discussion toward Google’s role in administering the white-spaces database and documenting the gaps between the users and the unused spectrum.

She followed up by asking whether such a task, supposed to be performed by a neutral third party, can be accomplished by a company like Google, with its own communications interests.

While traditionally such a role would have been handled by a truly neutral party, McDowell believes that Google could handle the task as long as their interests are examined.

Next, Kang asked McDowell why he chose to agree to a net neutrality rulemaking proceeding when he felt there was no need for a new policy, whether there is a new rule or policy he would be comfortable with, and how white spaces can solve some of the problems with net neutrality policy.

With comments in the net neutrality proceeding due January 14, McDowell said the most important feedback would be hard evidence on whether, without these rules, there will be a systemic market failure.

He understands that his opponents might fear anticompetitive discrimination by the operators, but he explained that the few instances of discrimination have been handled through the appropriate FCC procedures.

McDowell said he believed that “the cure for anticompetitive conduct is more competition.” He continued, “we have not yet seen the fruits of the newly auctioned 700 MHz spectrum…WiMAX technologies and white spaces.” With these technologies in place, McDowell believes that consumers will be fully protected.

McDowell said he was worried about the unforeseen consequences of new regulation. “The new regulation would essentially be a tax, and when you tax you tend to get less out of the service.”

Quoting Ronald Reagan, he said, “There are those that see something moving and they want to tax it; if it keeps moving, they regulate it; and if it stops moving, they subsidize it.” McDowell said he did not want that to happen with the internet.

On a related note, McDowell was asked whether the FCC has jurisdiction to regulate the internet and more specifically the question of search neutrality. He stated that the proposed rules place all regulation on the network operators and not the application providers; however, he welcomes comment from the public as to whether they believe the FCC has such jurisdiction over the search providers.

McDowell also agreed with the dissent in the 2008 Comcast-BitTorrent ruling because he questions whether Congress has given the FCC enough authority to regulate information services in such a way.

While McDowell avoided a question on the Comcast-NBC merger, he did admit that the transition of video from fixed cable and satellite to the internet is an exciting area to watch. McDowell believes that these issues are in their adolescent phases and there are issues between subscription and advertising that the market has yet to figure out.

Therefore, he believes government should allow as much room as possible for free experimentation, so long as no anticompetitive actions are taken.

Finally, McDowell ended the interview by welcoming legislation from Sen. Olympia Snowe, R-Maine, and Sen. Mark Warner, D-Va., to bring more expertise to the FCC, as well as the commission’s decision to hire its first scholar in residence to serve as a liaison to the academic community.

Panelists Consider Pros and Cons of Alternatives to Internet's Transport Protocol

in Broadband Data/Net Neutrality by

Editor’s Note: This is one of a series of articles summarizing panels at the Telecommunications Policy Research Conference, September 25-27, at George Mason University School of Law in Arlington, Va.

ARLINGTON, Va., September 26, 2009 – Whether internet service providers will accelerate early efforts to prioritize bandwidth, and what impact such measures might have upon the open internet, were actively discussed by panelists at the Telecommunications Policy Research Conference here.

Traditionally, internet traffic has been managed by the Transmission Control Protocol (TCP), the engineering standard for almost all internet transmissions. When there is greater demand for internet content than the network can carry at any given point in time, “each flow of the network gets a roughly equal share of the bottleneck capacity,” according to Steve Bauer, a professor of computer science at MIT.

Bauer was presenting a paper on “The Evolution of Internet Congestion,” with Professor David Clark and William Lehr, also of MIT.
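The “roughly equal share” behavior Bauer describes corresponds to what network engineers call max-min fairness, which TCP only approximates in practice. A minimal sketch, with illustrative flow names and numbers:

```python
def max_min_fair(capacity, demands):
    """Allocate bottleneck capacity so every flow gets an equal share,
    with the unused portion from small flows redistributed to larger ones."""
    allocation = {}
    remaining = dict(demands)
    cap = capacity
    while remaining:
        share = cap / len(remaining)
        # Flows demanding less than the equal share are fully satisfied.
        satisfied = {f: d for f, d in remaining.items() if d <= share}
        if not satisfied:
            # Every remaining flow is bottlenecked: split what is left equally.
            for f in remaining:
                allocation[f] = share
            break
        for f, d in satisfied.items():
            allocation[f] = d
            cap -= d
            del remaining[f]
    return allocation

# Four hypothetical flows share a 10 Mbps bottleneck.
print(max_min_fair(10.0, {"web": 1.0, "video": 4.0, "p2p": 8.0, "voip": 0.5}))
```

Note how the light web and voice flows get everything they ask for, while the two heavy flows split the remaining capacity evenly, which is the sense in which TCP-style sharing is “roughly equal.”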

Such a standard for routing internet traffic has been dubbed “TCP Fair,” and this approach remains the standard for dealing with congestion. However, a variety of internet providers, including Comcast – which was punished by the FCC for blocking traffic from the BitTorrent application – have been experimenting with alternatives.

Bauer said that Comcast and other broadband providers have been experimenting with changes to the “TCP Fair” approach because of changing expectations of end users, the changing composition of internet traffic, and new ideas – ideas challenging the traditional notion of the “end-to-end” internet – emerging in the technical community.

Alternatives or additions to the “TCP Fair” approach include re-ECN, or re-feedback of Explicit Congestion Notification; LEDBAT; and P4P, an approach to peer-to-peer (P2P) communications that allows DSL providers to maximize the effectiveness of their networks.

Considering the validity of different approaches is particularly significant in light of Federal Communications Commission Chairman Julius Genachowski’s announcement, this past Monday, that the agency will begin implementing net neutrality requirements.

Bauer recommended that the academic community “obtain more data about traffic patterns, congestion and usage [while also] ensuring that transparency requirements don’t discourage experimentation with new congestion management techniques.”

Also speaking on the panel was Nicholas Weaver, a software expert at the International Computer Science Institute at the University of California at Berkeley. Weaver highlighted the unusual economics of P2P communications: content providers save enormous amounts on distribution, and he said CNN has saved up to 30 percent of its bandwidth costs by aggressively using P2P.

“But the internet service provider sees a magnification of costs,” said Weaver. The economics can be changed, however, by the introduction of peer-to-peer “edge caches” that are offered free of charge.

Panelists for this session included:

  • Marius Schwartz, Georgetown University (Moderator)
  • Steve Bauer, David Clark, William Lehr: Massachusetts Institute of Technology
  • Guenter Knieps, Albert-Ludwigs-Universität Freiburg
  • Nicholas Weaver, ICSI

Google Enters Free Speed Test Marketplace with Academic Collaboration

in Broadband Data by

WASHINGTON, January 27, 2009 – Search giant Google is preparing to enter the market for free broadband speed tests, through a collaboration with the university research consortium PlanetLab and the New America Foundation.

Google is set to announce the collaboration on Wednesday at an event at the New America Foundation in Washington, keynoted by Vint Cerf, vice president and chief internet evangelist at Google.

Google follows BroadbandCensus.com, which launched in January 2008, in providing a free internet speed test to consumers. BroadbandCensus.com’s speed test allows internet users to test their actual speeds and compare them to the speeds that are promised by their internet providers.

Google and the other participants in the research consortium will be using the same speed test – the Network Diagnostic Tool of Internet2 – that was deployed by BroadbandCensus.com beginning in February 2008.

As with BroadbandCensus.com, Google apparently seeks to make the data publicly available, as a means of providing transparency into the operations of internet providers.

“Transparency has always been an essential component of the Internet’s success,” reads the press release announcing Wednesday’s event. “To remedy today’s information gap, researchers need resources to develop new analytical tools.”

“At this event, speakers will discuss the importance of advancing research in network measurement tools and introduce new developments that will benefit end-users, innovators, and policymakers,” reads the release.

The organizational framework for the speed tests and other network tools is to be called the Measurement Lab, and is expected to be hosted through PlanetLab at Princeton University.

Individuals also scheduled to speak at the event include Larry Peterson, chair of the Department of Computer Science at Princeton, and Princeton Professor Ed Felten, director of the Center for Information Technology Policy.

In addition to the NDT speed test, the Measurement Lab will allow internet users to use two additional tests: “Glasnost,” developed by the Max Planck Institute for Software Systems, in Kaiserslautern and Saarbrücken, Germany, and the NPAD diagnostic service, Pathdiag, developed by the Pittsburgh Supercomputing Center.

According to the Max Planck Institute web site, Glasnost “creates a BitTorrent-like transfer between your machine and our server, and determines whether or not your [internet service provider] is limiting such traffic. This is a first step towards making traffic manipulation by ISPs more transparent to their customers.”
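In rough terms, a test of this kind compares the throughput of BitTorrent-flavored traffic against a control transfer on the same path and flags a large gap. The sketch below is a simplified illustration of that idea, not the actual tool’s logic; the function name, numbers, and 0.8 threshold are hypothetical:

```python
def looks_throttled(control_mbps, bittorrent_mbps, threshold=0.8):
    """Heuristic in the spirit of Glasnost: compare measured throughput of a
    BitTorrent-like transfer against a control transfer of neutral bytes on
    the same path. A large gap suggests protocol-specific traffic shaping."""
    if control_mbps <= 0:
        return False  # no usable baseline measurement
    return bittorrent_mbps / control_mbps < threshold

# Illustrative measurements: control traffic runs at 5.0 Mbps, while
# BitTorrent-flavored traffic manages only 1.2 Mbps on the same link.
print(looks_throttled(5.0, 1.2))
```

A single run like this is only suggestive; the real tool repeats transfers in both directions and on multiple ports to rule out ordinary congestion before attributing the gap to shaping.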

In fall 2007, through tests conducted by the Electronic Frontier Foundation, Comcast was found to have been interfering with the packet transfers of users of BitTorrent, a peer-to-peer software system. After a complaint, the FCC punished Comcast in August 2008.

Comcast’s system of network management – which the cable operator says it has discontinued – became Exhibit A in the battle over network neutrality, or the procedures by which broadband carriers can prioritize internet traffic.

Over the past several years, Google has opposed attempts by carriers to circumvent Net neutrality.

According to the Pittsburgh Supercomputing Center web site, NPAD’s Pathdiag “is designed to easily and accurately diagnose problems in the last-mile network and end-systems that are the most common causes of all severe performance degradation over long end-to-end paths.”

“Our goal is to make the test procedures easy enough and the report it generates clear enough to be suitable for end-users who are not networking experts,” the PSC web site continues.

Google, PlanetLab, New America Foundation and the software engineers that designed each of the three tools are involved in the new venture.

“We are listed as an advisory board” to the project, said Rich Carlson, a network engineer at Internet2. “Google is providing some rackspace. Google is providing the funding to purchase the hardware, and the network connectivity to connect [the tests] to the commercial internet.”

BroadbandCensus.com’s goal in allowing internet users to test their speeds is to provide a publicly available repository of data on speeds, prices, availability, reliability and competition in local broadband.

In Taking the Broadband Census, individuals answer a brief questionnaire about their location, their carriers and the quality of service. They are also invited to comment on their carrier.

Information about all speed tests conducted on BroadbandCensus.com is made publicly available, both by carrier and by ZIP code, as soon as the tests are concluded. All the content on BroadbandCensus.com is available under a Creative Commons Attribution Noncommercial License, allowing it to be republished and reused for free by academics and by local government agencies.

BroadbandCensus.com reported on its experience using Internet2’s NDT speed test, and made a presentation about its findings at an Internet2/Joint Techs Conference in Lincoln, Neb., in July 2008.

Carlson said he believes that Google will also make its data publicly available. “My intention is to make that data available, as soon as possible.”

Carlson said that he and Internet2 believed it was important to “get the data collection started, and see what kind of community resources can be put to bear, to do some analysis” about internet traffic.

Other academic organizations, including Virginia Tech’s eCorridors Program, have also used the NDT speed test, which is open source software. Speed test data from eCorridors is also publicly available.

Google announced its interest in the speed test marketplace at the Supernova conference in June 2008, and the collaboration apparently took root after an invitation-only conference Google organized in Mountain View, Calif., in the summer of 2008.

More details are expected to be made available at the Wednesday New America Foundation event.

Google CEO Eric Schmidt is chairman of the New America Foundation, and Schmidt personally has made significant financial contributions to the think tank.

The Foundation has taken stances congruent with positions that Google has been pushing. For example, the think tank strongly advocated for the FCC to make vacant television channels available for unlicensed use by internet devices, a position endorsed by Google.

Editor’s Note

Internet2 provided technical direction about deploying a speed test to BroadbandCensus.com, and the eCorridors Program at Virginia Tech has provided encouragement and technical advice in taking the Broadband Census to a national audience. See supporters.
