Broadband Updates

Access Bandwidth Bottleneck Drives Innovation for Online Video Delivery

LONDON, October 1, 2010 – With online video now the main cause of internet bottlenecks and consumer frustration over poor performance, new ways are emerging to provide better picture quality within limited bandwidth under the banner of Adaptive Bit Rate Streaming.

Various versions of this technology have already been deployed for mobile TV transmission, where bandwidth is even more limited: Apple uses HTTP Live Streaming on the iPhone, and Google has built comparable support into its Android mobile operating system. Microsoft's Windows Phone 7, scheduled for an October launch, also incorporates a form of Adaptive Bit Rate Streaming.

The idea is simple and old: encode video at a variety of bit rates to suit different quality requirements and consumer display capabilities, then break the bit streams into small chunks that can be delivered more efficiently. What is new is making this work well for high bit rate video and, above all, doing it via HTTP (Hypertext Transfer Protocol), the linchpin of communications over the World Wide Web. Adaptive Bit Rate Streaming adds the key quality component for video, enabling HTTP to cope with the limited bandwidth available at various points of the internet delivery chain.
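Apple's HTTP Live Streaming, one of the implementations mentioned above, illustrates the pattern: the same content is encoded at several bit rates, and a small text "master playlist" tells the player where each version can be found, so ordinary HTTP servers can deliver everything. The playlist below is a minimal, hypothetical example; the bit rates, resolutions and file names are invented purely for illustration.

    #EXTM3U
    #EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
    video_360p.m3u8
    #EXT-X-STREAM-INF:BANDWIDTH=1400000,RESOLUTION=960x540
    video_540p.m3u8
    #EXT-X-STREAM-INF:BANDWIDTH=2800000,RESOLUTION=1280x720
    video_720p.m3u8

Each per-rendition playlist referenced here then lists the short chunks, typically a few seconds long, into which that bit stream has been broken.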

The delivery of video in small chunks also allows adjustment in real-time to deliver the best pictures that the bandwidth will allow at any given moment. By contrast, the preceding technique of progressive downloading assumes the worst and delivers video only at the quality supported by the minimum bandwidth available, making use of any temporary boost in capacity to get ahead and deliver video that the viewer has not yet watched.

This trades off quality for reliability, but Adaptive Bit Rate Streaming achieves both by varying the quality to suit conditions, always delivering pictures even if the quality drops. This enables the internet to work more like a broadcast TV service, so that people can start playing a video almost instantly without having first to wait for buffers to fill.
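The selection logic behind that behaviour is straightforward: the player fetches one chunk at a time, measures how quickly the previous chunk arrived, and requests the next chunk at the highest bit rate that comfortably fits within that estimate. The Python sketch below illustrates the decision loop in simplified form; it is not any particular vendor's implementation, and the bit rate ladder and safety margin are assumed values.

    # Illustrative sketch of adaptive bit rate selection (not a real player).
    RENDITIONS_BPS = [800_000, 1_400_000, 2_800_000]  # assumed rendition ladder
    SAFETY_MARGIN = 0.8  # use only 80 percent of measured throughput (assumed)

    def estimate_throughput(chunk_size_bytes, download_seconds):
        """Bits per second observed while fetching the previous chunk."""
        return (chunk_size_bytes * 8) / download_seconds

    def pick_rendition(measured_throughput_bps):
        """Highest bit rate that fits within the throughput budget."""
        budget = measured_throughput_bps * SAFETY_MARGIN
        candidates = [r for r in RENDITIONS_BPS if r <= budget]
        return max(candidates) if candidates else min(RENDITIONS_BPS)

    # Example: the last 1.0 MB chunk took 3.2 seconds to arrive (2.5 Mbps).
    throughput = estimate_throughput(1_000_000, 3.2)
    print(pick_rendition(throughput))  # selects the 1,400,000 bps rendition

Because each chunk is requested independently, the player can drop to a lower bit rate when congestion appears and climb back up when it clears, which is what keeps pictures flowing rather than stalling to refill a buffer.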

As a result, Adaptive Bit Rate Streaming is now being adopted for delivery of video to PCs and other fixed internet-connected devices as well as mobiles. It optimizes quality and fits well with the changes in internet structure being driven by video, and in particular with the needs of live broadcast, where many people want to watch the same content, according to Mike Nann, director of marketing at Digital Rapids, a vendor of video transmission technology.

“Adaptive Bit Rate Streaming is already associated with moving content closer to the consumption point, as it leverages existing web infrastructure and the vast network of HTTP caches already deployed for serving Web pages worldwide,” said Nann. Adding video is then largely a matter of scaling the capacity, while implementing the appropriate Adaptive Bit Rate Streaming software.
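Part of what makes this possible is that each chunk is fetched with a plain HTTP GET, so the same caches already deployed for web pages can hold video chunks close to viewers without modification. A hypothetical exchange for a single chunk might look like the following; the host name, path and sizes are illustrative only.

    GET /video/720p/segment00042.ts HTTP/1.1
    Host: cdn.example.com

    HTTP/1.1 200 OK
    Content-Type: video/MP2T
    Content-Length: 1048576
    Cache-Control: public, max-age=86400

Because the response carries ordinary caching headers, any intermediate HTTP cache along the path may keep a copy and serve the next viewer locally.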

The main emerging Adaptive Bit Rate Streaming techniques using HTTP include Microsoft IIS Smooth Streaming and Adobe’s HTTP Dynamic Streaming. But as Nann added, “while these two technologies get much of the attention for Web delivery, they aren’t the only technologies that are using HTTP to adaptively reach viewers – there are numerous others, as well as adaptive streaming capabilities incorporated within broader service offerings.”

Although Adaptive Bit Rate Streaming is the key to effective distribution of popular video content including mass live streaming events like major concerts, it is not the only show in town, being less efficient for niche material with small audiences. “It isn’t going to be used all the time for the small-audience and lower-value use cases that are the majority of online streaming,” said Nann.

This is partly because Adaptive Bit Rate Streaming is costly to deploy and so only worth it when the audience and revenue are relatively large. “While deploying ABRS offers cost savings for scaling the delivery infrastructure, it is more expensive at various points in the chain,” said Nann. In particular there are increased processing requirements to create the multiple streams at different bit rates, coupled with additional storage and management requirements for handling on-demand content, which will have to be held at different locations for delivery over a wide area.
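A rough back-of-the-envelope calculation shows why the storage bill grows for on-demand libraries: the total held on disk is the sum of all renditions rather than the size of a single encode. The figures below assume a hypothetical ladder of five renditions for one two-hour title.

    # Hypothetical storage for one two-hour title across five renditions.
    DURATION_S = 2 * 60 * 60  # 7,200 seconds
    LADDER_BPS = [400_000, 800_000, 1_400_000, 2_800_000, 5_000_000]  # assumed

    total_gb = sum(rate * DURATION_S for rate in LADDER_BPS) / 8 / 1e9
    single_gb = max(LADDER_BPS) * DURATION_S / 8 / 1e9
    print(round(total_gb, 1), round(single_gb, 1))  # about 9.4 vs 4.5 gigabytes

Multiply that across a large catalogue, and across every edge location where copies must be held, and the extra cost Nann describes becomes clear, quite apart from the additional encoding passes needed to produce each rendition in the first place.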

Currently only 1 percent of internet traffic is distributed via Adaptive Bit Rate Streaming, but this is likely to increase rapidly to at least 30 percent and possibly 50 percent or more over the next two years.

Philip Hunter is a London-based technology reporter specialising in broadband platforms and their use to access high-speed services and digital entertainment. He has written extensively for European publications about emerging broadband services and the issues surrounding deployment and access for over 10 years, and has a technical background in ICT systems development and testing.


Broadband Data

U.S. Broadband Deployment and Speeds are Beating Europe’s, Says Scholar Touting ‘Facilities-based Competition’

WASHINGTON, June 10, 2014 – In spite of press reports to the contrary, U.S. broadband coverage is not falling behind European levels of service, academic Christopher Yoo said on Wednesday at the National Press Club.

“It seems like every other week there’s a new infographic or news story that talks about how the U.S. is falling behind in broadband speeds, we don’t have fiber to the home, and telecom companies are rolling in the profits while consumer prices soar,” said Doug Brake, telecommunications policy analyst with The Information Technology and Innovation Foundation, setting up the topic tackled by Yoo in his presentation.

On the contrary, said Yoo, the founding director of the Center for Technology, Innovation and Competition, the U.S. led in many broadband metrics in 2011 and 2012. And, he said, it is precisely the absence of a “one size fits all” regulatory structure that has been driving technological innovation forward in the marketplace.

In other words, according to Yoo, the American approach to facilities-based competition – where cable companies and telephone companies compete through rival communications networks – has succeeded.

While the findings may be “surprising” to some, Yoo said they proved the importance of examining the best approach to broadband regulation based on “real world data.”

The notion that “fiber is the only answer” to affordable high-speed broadband is a misconception, he said. Countries emphasizing fiber over rival technologies – including Sweden and France – were among the worst broadband performers.

In the U.S., 82 percent of households received broadband at speeds of at least 25 Megabits per second (Mbps), versus 54 percent in Europe. In rural areas, the difference was even greater: 48 percent in the U.S., versus 12 percent in Europe. The five countries that did beat U.S. coverage of greater than 25 Mbps (including Denmark and the Netherlands) are compact, urbanized regions with greater population densities.

Additionally, even looking at fiber-based technologies, the U.S. is outperforming Europe, he said. Fiber coverage in the U.S. went from 17 percent in 2011 to 23 percent in 2012. In Europe, fiber coverage went from 10 percent in 2011 to 12 percent in 2012.

And, based on the measurement of telecommunications investment per household, the U.S. number is more than double that of Europe: $562 versus $244 in the old world.

And, he said, American users consumed 50 percent more bandwidth than Europeans in 2011 and 2012.

“The best measure of how much a network is really worth is how much you use it,” Yoo said. “It’s great to have a very fast car, but unless you use it, it’s not really doing very much for you.”

One area where the U.S. could see improvement is broadband adoption, Brake said. That shows a continued need to demonstrate the value of broadband to consumers.

Yoo agreed: “Availability is only a part of the question. There are plenty of people who have broadband available to them who are choosing not to adopt.”

Moderator Gerry Faulhaber added: “As regulators, we can mandate coverage, we can mandate buildout. What we can’t do is mandate people to use it.”

Keeping a series of tiered rates for broadband service is exactly what America’s broadband rollout needs, said Brake. That not only encourages consumers to purchase internet access at lower introductory rates, it also efficiently places the burden on those who wish to pay more for higher-speed service. This helps networks recoup their costs.

“Is it better to provide 75 to 100 Mbps to 80 to 90 percent of the population, or one Gigabit per second to 10 to 20 percent of the population?”

Blair Levin, former director of the FCC’s National Broadband Plan, and now a communications and society fellow at the Aspen Institute, said that comparisons with Europe don’t change America’s objective to build deeper fiber, use broadband to improve the delivery of goods and services, and connect more users.

“Which activity is more productive – looking at oneself in the mirror and asking, ‘do these jeans make me look fat?’ or going to the gym? Focusing on actions that improve one’s condition is better than wondering about how one should appear relative to others,” said Levin.

Broadband Updates

Discussion of Broadband Breakfast Club Virtual Event on High-Capacity Applications and Gigabit Connectivity

WASHINGTON, September 24, 2013 – The Broadband Breakfast Club released the first video of its Broadband Breakfast Club Virtual Event, on “How High-Capacity Applications Are Driving Gigabit Connectivity.”

The dialogue featured Dr. Glenn Ricart, Chief Technology Officer, US IGNITE; Sheldon Grizzle of GigTank in Chattanooga, Tennessee; Todd Marriott, Executive Director of UTOPIA, the Utah Telecommunications Open Infrastructure Agency; and Drew Clark, Chairman and Publisher, BroadbandBreakfast.com.

To register for the next Broadband Breakfast Club Virtual Event, “How Will FirstNet Improve Public Safety Communications?,” on Tuesday, October 15, 2013, at 11 a.m. ET/10 a.m. CT, please visit http://gowoa.me/i/XV8

#broadbandlive

Breakfast Club Video: ‘Gigabit and Ultra-High-Speed Networks: Where They Stand Now and How They Are Building the Future’

WASHINGTON, May 24, 2013 – Emphasizing the developing nature of broadband networks in the United States, speakers at the May 21 Broadband Breakfast Club event said that the recent achievement of ultra-high speed broadband networks has been a critical factor seeding transformative developments for organizations, individuals and communities. These developments, panelists said, were simply not possible before with slower speed networks.

Yet panelists at the event, “Becoming a Gigabit Nation: What Have We Learned About Ultra-High Speed Broadband?” also agreed that speed is not actually the most important factor in the maturing of these networks.

Successful deployment of such networks requires concerted efforts and continual upgrades involving community leadership, assessment of consumer needs and desires, infrastructure development, application development and successful assessment of usage patterns. All of these factors affect the success of such gigabit and high-speed networks, panelists said.

In other words, high-speed networks need to be developed in concert with proposed applications, which are in turn developed in the context of their communities or customer base.

As gigabit cities consultant David Sandel said, the gigabit and smart city transformation now being undertaken is 90 percent sociology and 10 percent infrastructure. Sandel, president of Sandel and Associates, works with St. Louis, Kansas City and other communities worldwide and runs the Gigabit City Summit, a global forum of community leaders engaged in discussion of new forms of leadership for managing such networks.

Sandel said that new gigabit leadership must break out of traditional silos and engage in greater information exchange and collaboration. Less hierarchy, more inclusion and more communication facilitate the success of gigabit services and applications, he said.

What’s Happening Now

Sandel and other panelists gave examples of how 100-plus megabit per second and gigabit-level connectivity is already providing considerable benefits to cities that have it – even where the majority of a city’s consumers do not yet have needs for those levels of service.

For example, Sandel described the success of a two-mile gigabit main street in St. Louis, Missouri. This project has attracted a number of innovative businesses to the area. He said that such projects carry several benefits to an entire city, such as enabling the use of cloud services, driving up real estate values, and creating high-value jobs. In addition, the current relatively higher costs of gigabit service in communities can be partially offset by institutional and industrial uses.

Similarly, Sheldon Grizzle, founder and co-director of the Chattanooga-based GIGTANK, a technology start-up accelerator, said that the implementation of gigabit broadband by the local utility EPB has been a boon to its electrical grid. Power outages in the area have decreased by 60 percent, he said.

Grizzle said that Chattanooga, as a small city of 170,000, sees itself as a good test case for gigabit networks. Its network now provides speeds of 50 Mbps to 50,000 subscribers. It also offers 1 Gbps symmetrical service (i.e. 1 Gbps upload and 1 Gbps download) for $300 a month, although the number of subscribers has been smaller. He attributed the relatively low demand for the gigabit offering to its high price point.

Grizzle said that GIGTANK has been recruiting application developers from around the world to build appropriate apps for the community, as Chattanooga’s gigabit network grows beyond its infancy.

Speed Issues

Notwithstanding high-profile gigabit build-outs in recent years, nationally broadband speeds have been steadily increasing by other methods over the last several years, said Kevin McElearney, senior vice president of network engineering and technical operations for Comcast Cable.

McElearney said that, for example, Comcast has innovated on next-generation technologies every year, increasing network speeds 11 times over the last 11 years, and is now running terabit links over its backbone to allow capacity for new applications. He said that Comcast now provides up to 100 Mbps download capacity, with 70 percent of consumers electing for 25 Mbps and 30 percent for higher-speed tiers.

McElearney said that Comcast sees the increasing use of multiple devices in households as the principal driver behind the demand for higher broadband speeds for consumers.

Application Development

William Wallace, Executive Director of U.S. Ignite, a developer of gigabit-ready digital experiences and applications, spoke of an “internet of immersive experience,” suggesting an internet experience completely different from prior experiences. Users will also be creating their own experiences, he said.

Wallace further noted that customizing network features around applications will help to build in the greatest efficiencies. For example, different applications will call for different speeds, security features, cloud storage locations and latencies.

Scott Wallsten, vice president for research and senior fellow at the Technology Policy Institute, said that focus on ultra-high broadband speeds is misplaced. According to Wallsten, because internet speeds are already increasing consistently, policies focusing on speed are unnecessary. Instead, Wallsten said, greater attention should be paid to other metrics of broadband quality, such as latency and reliability.

Additionally, Wallsten stated that the government’s adoption programs should be focused on low-income inner-city non-adopters rather than rural high-speed development. He said that the Federal Communications Commission’s high cost fund portion of the Universal Service Fund has not been sufficient to pay for rural development. Instead, the best hope to help the most individuals get broadband is to focus on urban areas. Increased efficiencies in cities will offer a better chance for providers to lower costs and then expand network development in rural areas.

Sandel concluded that education is critical for successful gigabit network development and called for a three-pronged approach: educating leaders on the impacts and benefits of gigabit networks and applications across all sectors, developing clear economic development models that draw lines to revenue flows, and adopting policies for inclusion of all populations so that everyone can participate.
