Broadband's Impact

Blockbuster Movies May Boost Emerging 3-D Services

The runaway success of digitally dazzling 3-D movies has boosted pay-TV hopes for emerging 3-D services. However, more 3-D means greater use of already scarce broadband.

LONDON, March 31, 2010 – The runaway success of the digitally dazzling movies Avatar and Alice in Wonderland has boosted hopes among satellite and cable television operators that their emerging 3-D services will gain significant traction despite the high cost for the TV set and goggles.

Those films showed that 3-D technology can add depth and power to the visual experience, which is an important step toward adoption of 3-D TV, at least when the price comes down, according to Richard Broughton, senior analyst at the London-based TV specialist research firm Screen Digest.

“Pay-TV companies are keen to use this publicity to their advantage,” said Broughton. “Without such major budget films really highlighting the state of 3-D technology, it is unlikely that 3-D would have reached such a high point in consumer mindsets.”

From the TV perspective, the films’ release timed nicely with the impending soccer World Cup, which will be used to showcase 3-D, particularly in Europe, which has a number of countries participating in the event.

“Sports were a critical factor for [high-definition] uptake,” Broughton said. Another major European sports championship two years ago prompted uptake of HD TV by many households. It’s no coincidence that many of the first HD channels to be launched by platforms are sports channels.

Pictures in three dimensions are likely to follow the same trend, according to Broughton. Sky TV is rolling out a range of 3-D screens to public venues in time for the World Cup and is launching its first 3-D channel in the United Kingdom in April, with plans for a more robust range of 3-D programming later this year.

However, there are still doubts whether 3-D TV really is ripe for widespread consumer acceptance — even if the prices do come down — because it still relies on goggles.

While people are happy to wear goggles to watch a movie occasionally, it is doubtful whether many will be willing to don them on a regular basis.

Goggles currently are needed to simulate the way the human brain creates a 3-D image, by presenting each eye with a slightly offset view. By contrast, traditional 2-D TV, including HD, is just a fast-changing sequence of 2-D still images, relying only on perspective within each frame to convey depth.

The major TV makers have been working on “goggle-less” technology, but have so far failed to translate this into a screen that works without creating headaches or visual problems.

“There are a number of concerns regarding perceived image quality and comfort of viewing for auto stereoscopic (goggle-less) technologies which means that for the near future, 3D televisions requiring glasses are likely to take front stage,” said Broughton.

The problem lies in the way the human visual cortex has evolved to process offset binocular images, which is hard to reproduce accurately in an artificial flat-screen system. The result is that some viewers suffer the same sort of problems caused by, for example, wearing someone else’s glasses.

Such problems emerged during the testing of Philips’ 3-D TV, due for launch in summer 2010. Originally the set was going to be goggle-less, but after the technology caused visual discomfort among many testers, it will now be introduced with viewing glasses.

Whichever viewing technology is adopted, 3-D TV is going to soak up even more bandwidth than HD, causing further problems for broadband service providers.

It is no surprise that satellite and cable operators are coming out first with 3-D services, which generally consume twice as much bandwidth per channel as 1080p HD, the highest-resolution HD category. The 1080p format already generates twice as much data as 1080i or 720p, the formats used by many existing HD services.

In effect, each eye needs its own channel for 3-D. Given that the whole point of 3-D is to deliver the highest-quality viewing experience possible, there is little point in having anything less than the best HD for each eye. This may be transmitted at around 16 megabits per second with the latest H.264 compression, although developers and manufacturers are concerned about sacrificing too much quality in the compression process.

Broadband operators would have to upgrade their digital subscriber line networks to VDSL2 to deliver 3-D. Even then they may run out of headroom when they deliver multiple channels, while their cable and satellite competitors are better placed with more broadcast spectrum.
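To make the headroom problem concrete, the short Python sketch below runs the bandwidth arithmetic described above. Only the doubling ratios and the rough 16 megabits per second figure come from the article; the base bitrate, the reserved share for other household traffic and the ADSL2+/VDSL2 line rates are illustrative assumptions.

```python
# Back-of-the-envelope sketch of the 3-D bandwidth arithmetic described above.
# Only the doubling ratios come from the article; the base bitrate, reserved
# overhead and line rates are illustrative assumptions.

HD_720P_MBPS = 4.0                    # assumed H.264 rate for 720p/1080i HD
HD_1080P_MBPS = 2 * HD_720P_MBPS      # 1080p carries roughly twice the data
CHANNEL_3D_MBPS = 2 * HD_1080P_MBPS   # 3-D: one full-HD view per eye (~16 Mbps)

def simultaneous_3d_channels(downstream_mbps: float,
                             per_channel_mbps: float = CHANNEL_3D_MBPS,
                             overhead_fraction: float = 0.2) -> int:
    """Number of 3-D channels that fit in a downstream budget after reserving
    a fraction of capacity for other household traffic (assumed overhead)."""
    usable = downstream_mbps * (1.0 - overhead_fraction)
    return int(usable // per_channel_mbps)

if __name__ == "__main__":
    # Illustrative ADSL2+ and VDSL2 downstream rates, in Mbps.
    for line_rate in (16, 40, 80):
        count = simultaneous_3d_channels(line_rate)
        print(f"{line_rate} Mbps line -> {count} simultaneous 3-D channel(s)")
```

Even under these generous assumptions, a single copper line carries only a handful of simultaneous 3-D streams, which is the headroom problem satellite and cable broadcasters largely avoid.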

Over time, further improvements in the management of the electromagnetic spectrum over copper, combined with greater fiber penetration, will enable broadband operators to deliver multichannel 3-D. Meanwhile, some players in the 3-D community may be hoping that mass acceptance will be delayed by the continuing problems getting goggle-less technology to work.

It may be crystal clear if this is the case when the World Cup is over.

Philip Hunter is a London-based technology reporter specialising in broadband platforms and their use to access high-speed services and digital entertainment. He has written extensively for European publications about emerging broadband services and the issues surrounding deployment and access for over 10 years, and he has a technical background in ICT systems development and testing.

Digital Inclusion

Lack of Public Broadband Pricing Information a Cause of Digital Divide, Say Advocates

Panelists argued that lack of equitable digital access is deadly and driven by lack of competition.

September 24, 2021 – Affordability, language and lack of competition are among the factors that continue to perpetuate the digital divide and related inequities, according to panelists at a Thursday event on race and broadband.

One of the panelists faulted the lack of public broadband pricing information as a root cause.

In poorer communities there’s “fewer ISPs. There’s less competition. There’s less investment in fiber,” said Hernan Galperin, associate professor at the University of Southern California. “It is about income. It is about race, but what really matters is the combination of poverty and communities of color. That’s where we find the largest deficits of broadband infrastructure.”

While acknowledging that “there is an ongoing effort at the [Federal Communications Commission] to significantly improve the type of data and the granularity of the data that the ISPs will be required to report,” Galperin said that the lack of a push to make ISP pricing public will doom that effort to fail.

He also questioned why ISPs do not, and are not required to, report service coverage maps that reveal areas with little or no service. “Affordability is perhaps the biggest factor in preventing low-income folks from connecting,” Galperin said.

“It’s plain bang for their buck,” said Traci Morris, executive director of the American Indian Policy Institute at Arizona State University, referring to broadband providers’ reluctance to serve rural and remote areas. “It costs more money to go to [tribal lands].”

Furthermore, the COVID-19 pandemic has only made that digital divide clearer and more deadly. “There was no access to information for telehealth,” said Morris. “No access to information on how the virus spread.”

Galperin also raised the impact of digital gaps in access upon homeless and low-income populations. As people come in and out of homelessness, they have trouble connecting to the internet at crucial times, because – for example – a library might be closed.

Low-income populations also face “systemic” digital access issues, at times struggling to pay their bills and having to shut their internet off for months at a time.

Another issue facing the digital divide is linguistic. Rebecca Kauma, economic and digital inclusion program manager for the city of Long Beach, California, said that residents often speak a language other than English. But ISPs may not offer interpretation services for them to be able to communicate in their language.

Funding, though not a quick fix-all, often brings about positive change in the right hands. Long Beach received more than $1 million from the U.S. CARES Act, passed early in the pandemic last year. “One of the programs that we designed was to administer free hotspots and computing devices to those that qualify,” she said.

Some “band-aid solutions” to “systemic problems” exist but aren’t receiving the attention or initiative they deserve, said Galperin. “What advocacy organizations are doing but we need a lot more effort is helping people sign up for existing low-cost offers.” The problem, he says, is that “ISPs are not particularly eager to promote” low-cost offers.

The event, “Race and Digital Inequity: The Impact on Poor Communities of Color,” was hosted by the Michelson 20MM Foundation and its partners, the California Community Foundation, Silicon Valley Community Foundation and Southern California Grantmakers.

Broadband's Impact

USC, CETF Collaborate on Research for Broadband Affordability

Advisory panel includes leaders in broadband and a chief economist at the FCC.

Hernan Galperin of USC's Annenberg School

WASHINGTON, September 22, 2021 – Researchers from the University of Southern California’s Annenberg School and the California Emerging Technology Fund are partnering to recommend strategies for bringing affordable broadband to all Americans.

According to a press release on Tuesday, the university’s school of communications and journalism and the CETF will be guided by an expert advisory panel, “whose members include highly respected leaders in government, academia, foundations and non-profit and consumer-focused organizations.”

Members of the advisory panel include a chief economist at the Federal Communications Commission, digital inclusion experts, broadband advisors to governors, professors and deans, and other public interest organizations.

“With the federal government and states committing billions to broadband in the near term, there is a unique window of opportunity to connect millions of low-income Americans to the infrastructure they need to thrive in the 21st century,” Hernan Galperin, a professor at the school, said in the release.

“However, we need to make sure public funds are used effectively, and that subsidies are distributed in an equitable and sustainable manner,” he added. “This research program will contribute to achieve these goals by providing evidence-based recommendations about the most cost-effective ways to make these historic investments in broadband work for all.”

The CETF and USC have collaborated before on surveys about broadband adoption. In a recent series of those surveys, the organizations found disparities along income lines, as lower-income families reported lower levels of technology adoption, despite improvement over the course of the pandemic.

The surveys also showed that access to connected devices was growing, but that racial minorities were still disproportionately impacted by the digital divide.

The collaboration comes before the House is expected to vote on a massive infrastructure package that includes $65 billion for broadband. Observers and experts have noted the package’s vision for flexibility, but some are concerned about the details of how that money will be spent going forward.

Broadband's Impact

Technology Policy Institute Introduces Data Index to Help Identify Connectivity-Deprived Areas

The Broadband Connectivity Index uses multiple datasets to try to get a better understanding of well- and under-connected areas in the U.S.

Scott Wallsten is president and senior fellow at the Technology Policy Institute

WASHINGTON, September 16, 2021 – The Technology Policy Institute introduced Thursday a broadband data index that it said could help policymakers study areas across the country with inadequate connectivity.

The TPI said the Broadband Connectivity Index uses multiple broadband datasets to compare overall connectivity “objectively and consistently across any geographic areas.” It said the index will soon be added to its TPI Broadband Map.

The BCI uses a “machine learning principal components analysis” that takes into account the share of households with access to fixed speeds at the federal standard of 25 Megabits per second (Mbps) download and 3 Mbps upload, as well as at 100/25 Mbps – both calculated from the Federal Communications Commission’s Form 477 data combined with the American Community Survey – while also using download speed data from Ookla, Microsoft data on the share of households connecting at 25/3 Mbps, and the share of households with a broadband subscription, which comes from the American Community Survey.

The BCI has a range of zero to 10, where zero is the worst connected and 10 is the best. It found that Falls Church, Virginia was the county with the highest score, with the following characteristics: 99 percent of households have access to at least 100/25 Mbps, 100 percent of households connect to Microsoft services at 25/3 Mbps, the average fixed download speed measured by Ookla in the second quarter of this year is 243 Mbps, and 94 percent of households have a fixed internet connection.

Meanwhile, the worst-connected county is Echols County in Georgia: none of the population has access to a fixed connection of 25/3 Mbps (excluding satellite connectivity), three percent connect to Microsoft’s servers at 25/3 Mbps, the average download speed is 7 Mbps, and only 47 percent of households have an internet connection. TPI notes that service providers won $3.6 million out of the $9.2-billion Rural Digital Opportunity Fund to provide service in the county.
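For readers curious how such a composite index might be assembled, the Python sketch below applies a principal components analysis to the kinds of inputs the article lists and rescales the result to the zero-to-10 range. The column names and the third county are hypothetical, the figures are illustrative, and this shows only the general technique rather than TPI’s actual model.

```python
# Minimal sketch of a PCA-based connectivity index over the kinds of inputs
# the article lists. Column names and figures are illustrative only; this is
# not TPI's actual methodology.
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical county-level metrics (shares are fractions, speeds in Mbps).
counties = pd.DataFrame(
    {
        "share_100_25": [0.99, 0.00, 0.60],           # FCC Form 477 + ACS
        "share_25_3": [1.00, 0.00, 0.85],             # FCC Form 477 + ACS
        "ookla_download_mbps": [243.0, 7.0, 95.0],    # Ookla speed tests
        "microsoft_share_25_3": [1.00, 0.03, 0.55],   # Microsoft usage data
        "acs_subscription_share": [0.94, 0.47, 0.78], # ACS subscriptions
    },
    index=["Falls Church, VA", "Echols County, GA", "Example County"],
)

# Standardize the metrics and take the first principal component as the index.
scaled = StandardScaler().fit_transform(counties)
component = PCA(n_components=1).fit_transform(scaled).ravel()

# Orient the component so that higher values mean better connectivity.
if np.corrcoef(component, counties["ookla_download_mbps"])[0, 1] < 0:
    component = -component

# Rescale to 0-10: 0 is the worst-connected county, 10 the best.
bci = 10 * (component - component.min()) / (component.max() - component.min())
print(pd.Series(bci, index=counties.index).round(2))
```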

“Policymakers could use this index to identify areas that require a closer look. Perhaps any county below, say, the fifth percentile, for example, would be places to spend effort trying to understand,” the TPI said.

“We don’t claim that this index is the perfect indicator of connectivity, or even the best one we can create,” TPI added. “In some cases, it might magnify errors, particularly if multiple datasets include errors in the same area.

“We’re still fine-tuning it to reduce error to the extent possible and ensure the index truly captures useful information. Still, this preliminary exercise shows that it is possible to obtain new information on connectivity with existing datasets rather than relying only on future, extremely expensive data.”
