Congress, FCC See DTV Transition Progress; Low Power Broadcasters Say Left Behind

WASHINGTON, March 26, 2009 – The transition to digital television since the passage of the DTV Delay Act has been a “major accomplishment,” Rep. Rick Boucher, D-Va., said Thursday at a hearing on the state of the DTV transition.

Boucher, chairman of the House Energy and Commerce Subcommittee on Technology, Communications and the Internet, said that while he was pleased to see “clear results” and positive progress, “much remains to be done.”

Ranking member Cliff Stearns, R-Fla., agreed that “the glass is 95 percent full” on the country’s readiness. But he lamented the amount of money set aside for coupons, and suggested significant savings could be had by confining the program to households without cable or satellite television.

“Shepherding the transition” has been “priority number 1” since taking over the FCC, Acting Chairman Michael Copps said.

Even before his elevation from the position of commissioner, Copps said he believed “it was clear the country was not ready…for the February 17 cutoff.” Besides “rampant consumer confusion,” Copps said a major problem had been a lack of coordination between public and private stakeholders.

Copps thanked Congress for the Delay Act, but was careful to warn members that “we are nowhere out of the woods yet.” The transition may not be “seamless,” he said. “But there is time to make a real difference.”

The FCC, Commerce Department’s National Telecommunications and Information Administration, and private sector actors are focusing “day and night” on education and outreach, he said. Starting in April, public service announcements will begin to mention antenna and converter box rescanning issues, as well as publicizing walk-in help centers.

Cable and broadcast television providers have “really stepped up to the plate” in helping build a unified DTV help call center, said Copps.

In addition, the FCC is working with AmeriCorps and other groups to put “boots on the ground” to help people get set up who might not otherwise be able to install equipment.

Consumer Electronics Association CEO Gary Shapiro said the education program had been one of the most effective consumer campaigns he had ever seen. “I bet more people know about the transition…than could identify the Vice-President of the United States,” he said.

“This great nation of ours can ill afford to delay the transition again,” Shapiro said in a statement released after the hearing. “To do so would put at risk the many benefits that will accrue from the switch to digital: phenomenal amount of beachfront-quality spectrum for new licensed and unlicensed services, including sorely needed improvement to Internet access; better communications platforms for law enforcement and public safety; and almost $20 billion in auction revenues for the U.S. Treasury.”

But Copps lamented that his calls for increased awareness of the analog “cliff” effect had gone ignored for the past two years.

However, the FCC has recently launched an online “map” that allows consumers to determine if they will be able to receive a signal post-transition. When Rep. Anthony Weiner, D-N.Y., asked Copps if urban landscapes would pose reception problems post-transition, Copps said that there was “no question” in his mind that such problems exist. “We’ll have to deal with them…and we would be remiss if we did not study them further.”

The DTV coupon backlog was cleared five days ago, said acting Assistant Commerce Secretary Anna Gomez, currently the top-ranking official at the NTIA.

Mark Lloyd, vice president of the Leadership Conference on Civil Rights, said many consumers have found the FCC’s new online map very useful.

But Lloyd noted his group has found DTV reception to be inconsistent not only within the same community, but within the same apartment building. Most important to a smooth transition, he said, is “the importance of being able to go in the homes” of populations in need and help them with rescanning and other issues.

Idaho Public Television general manager Peter Morrill said his organization has identified six areas, primarily located in rugged terrain, that will be affected by the “cliff effect.”

Morrill acknowledged the FCC’s efforts to address this need for digital television “fill-in” service, but said many stations lack the financial means to license and build these systems in time for the June 12, 2009 shutdown.

“Legislative encouragement and additional funding can help us ensure the smoothest transition possible,” said Morrill. But Weiner suggested a change in education was necessary to enable consumers to understand the importance of finding a solution: “bad service means no service, in this case.”

Robert Prather, CEO of Gray Television, summed up how to solve the cliff issue: “it all comes down to funding.”

Boucher was also troubled by numbers from CEA and NTIA that “did not match up” with regard to supply and demand for converter boxes. But Wal-Mart senior vice president Gary Severnson said there was close coordination between manufacturers, retailers, and NTIA to coordinate supply, and that he had “no concern” about a shortage. His biggest fear was that he would be left with a surplus of boxes post-transition.

Shapiro said the data flow from NTIA was very good, and at least four manufacturers were producing more than enough boxes to meet demand. But in the event of a shortage, Shapiro suggested that as a “safety valve,” coupons could be used to subsidize consumer access to basic cable or allow consumers to receive stripped-down DTV sets.

Rep. John Shimkus, R-Ill., voiced concern over the impact an additional delay would have on the wireless and electronics industries. He introduced into the record a letter from Qualcomm, a manufacturer of wireless broadband equipment, highlighting the impact further delay would have on the company. Boucher said Shimkus’ fears were unfounded, as both he and Chairman Henry Waxman were in complete agreement: “We’re not going to postpone this transition again – we need to get it right.”

Also read into the record was a letter from Community Broadcasters Association president Kyle Reeves expressing anger over Congress’ failure to include Class A and Low Power television stations in the transition. Despite having its entire industry threatened by the transition, the CBA was explicitly denied the opportunity to testify at Thursday’s hearing, a spokesman for the association said.

If something is not done about the Low Power and Class A station problem, Reeves predicted a disaster: “Diversity of voices and career opportunities will suffer a real setback,” he said, citing a CBA-commissioned survey showing 43 percent of Class A and Low Power TV stations have significant minority ownership. Most are small businesses, and 62 percent are owner-operated, the survey said.

And 34 percent of CBA member stations broadcast in foreign languages, the survey noted. Without some kind of action, “a critical source of emergency information for foreign language speakers will be lost,” Reeves warned. “Foreign language speakers will end up watching imported cable and satellite channels that carry only foreign-produced programming, pay no taxes, employ no U.S. citizens, and provide no local content and no American perspective.”

The CBA previously testified before the subcommittee under then-chairman Rep. Ed Markey, D-Mass., who told its president, “Help is on the way.” But the CBA has not yet seen any real help, Reeves said.

Reeves suggested the FCC take an “active hand” in promoting a DTV transition for LPTV stations after [the FCC] “spent over a decade on full power TV, including devoting enormous resources to finding channels so that as many stations as possible could operate parallel analog and digital stations for several years.” Congress could fund a program to transition all stations to digital, or allow them to “leapfrog” broadcasting and move onto broadband, mobile television, or other emerging technologies.

In the alternative, Congress could allow LPTV stations to auction their spectrum and share in the proceeds for a “soft landing” as they shut down, Reeves said. “If that opportunity were offered, some would take it today, sadly perhaps; but it would be a lot better than losing everything: hopes, dreams, and investments.”

But in the current economy, Reeves suggested there was no valid reason to ignore his industry any longer: “The combined impacts of the decline of over-the-air viewing, the digital transition, and the recession have created the perfect storm. There is no more time to think about it.”

David Flower: 5G and Hyper-Personalization: Too Much of a Good Thing?

5G, IoT and edge computing are giving companies the opportunity to make hyper-personalization even more ‘hyper’.

The author of this Expert Opinion is David Flower, CEO of Volt Active Data.

It’s very easy for personalization to backfire and subtract value instead of add it.

Consider the troubling fact that we may be arriving at a moment in hyper-personalization’s journey where the most hyper-personalized offer is no offer at all. Nobody likes to be constantly bombarded by content, personalized or not.

And that’s the paradox of hyper-personalization: if everyone’s doing it, then, in a sense, nobody is.

5G and related technologies such as IoT and edge computing are giving companies the opportunity to make hyper-personalization even more “hyper” via broader bandwidths and the faster processing of higher volumes of data.

This means we’re at a very interesting inflection point: where do we stop? If the promise of 5G is more data, better data, and faster data, and the result is knowing our customers even better to bug them even more, albeit in a “personal” way, when, where, and why do we say, “hold on—maybe this is going too far”?

How do you do hyper-personalization well in a world where everyone else is doing it and where customers are becoming increasingly jaded about it and worried about how companies are using their data?

Let’s first look at what’s going wrong.

Hyper-personalization and bad data

Hyper-personalization is very easy to mess up, and when you do mess it up, it has the exact opposite of its intended effect: it drives customers away instead of keeping them.

Consider an online ad for a product that pops up for you on a website a couple of days after you already bought the thing being advertised. This is what I call “noise”. It’s simply a nuisance, and the company placing that ad—or rather, the data platform they’re using to generate the algorithms for the ads—should know that the person has already bought this item and hence present not a “repeat offer” but an upsell or cross-sell offer.

This sounds rudimentary in the year 2022 but it’s still all too common, and you’re probably nodding your head right now because you’ve experienced this issue.

Noise usually comes from what’s known as bad data, or dirty data. Whatever you want to call it—it pretty much ruins the customer experience.
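
To make the fix concrete, here is a minimal sketch, in Python, of the check described above: consult the customer’s purchase history before serving an ad, and swap a repeat offer for a cross-sell. The product names and the cross-sell table are hypothetical stand-ins, not the behavior of any particular ad platform.

```python
# Hypothetical cross-sell table: product -> related items worth offering
# to someone who already owns it. Purely illustrative data.
RELATED_PRODUCTS = {
    "espresso-machine": ["descaler", "milk-frother"],
}

def choose_offer(candidate, purchased):
    """Return an offer for this customer, never re-advertising a purchase."""
    if candidate not in purchased:
        return candidate  # customer hasn't bought it, so the ad is fine
    # Repeat offer detected: try a cross-sell the customer doesn't own yet.
    for related in RELATED_PRODUCTS.get(candidate, []):
        if related not in purchased:
            return related
    return None  # nothing sensible to show; silence beats noise

print(choose_offer("espresso-machine", {"espresso-machine"}))  # -> descaler
print(choose_offer("espresso-machine", set()))  # -> espresso-machine
```

The point is not the few lines of logic but where the purchase data lives: the check only works if the history is fresh and queryable at ad-serving time.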

Hyper-personalization and slow data

The second major issue is slow data: any data being used too slowly to be valuable, which usually includes data that has to make the trip to the data warehouse before it can be incorporated into any decisions.

Slow data is one of the main reasons edge computing was invented: to be able to process data as close to where it’s ingested as possible, in order to use it before it loses any value.

Slow data produces not-so-fun customer experiences such as walking half a mile to your departure gate at the airport, only to find that the gate has been changed, and then, after you’ve walked the half mile back to where you came from, getting a text message on your phone from the airline saying your gate has been changed.

Again, whatever you want to call it—latency, slow data, annoying—the end result is a bad customer experience.
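
In the spirit of the airport example, here is a toy Python contrast between a decision made from a periodically synced warehouse snapshot and one made on the live event at ingest. The flight numbers, gates, and sync cadence are invented for illustration.

```python
import time

class WarehousePath:
    """Answers from the last batch load, however stale it is."""
    def __init__(self):
        self.snapshot = {"flight-123": "B12"}  # loaded by the last batch job
        self.last_sync = time.time() - 3600    # that was an hour ago

    def current_gate(self, flight):
        return self.snapshot[flight]            # may be long out of date

class EdgePath:
    """Applies each event to local state the moment it arrives."""
    def __init__(self):
        self.state = {"flight-123": "B12"}

    def on_event(self, flight, gate):
        self.state[flight] = gate               # no round trip to a warehouse
        return f"notify now: {flight} moved to {gate}"

edge = EdgePath()
print(edge.on_event("flight-123", "C3"))           # passenger told immediately
print(WarehousePath().current_gate("flight-123"))  # still answers "B12"
```

The slow path isn’t wrong, exactly; it’s just answering yesterday’s question.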

How to fix the hyper-personalization paradox

I have no doubt that the people who invented hyper-personalization had great intentions: make things as personal as possible so that your customers pay attention, stay happy, and stay loyal.

And for a lot of companies, for a long time, it worked. Then came the data deluge. And the regulations. And the jaded customers. We’re now at a stage where we need to rethink how we do personalization because the old ways are no longer effective.

It’s easy—and correct—to blame legacy technology for all of this. But the solution goes deeper than just ripping and replacing. Companies need to think holistically about all sides of their tech stacks to figure out the simplest way to get as much data as possible from A to B.

The faster you can process your data the better. But it’s not all just about speed. You also need to be able to provide quick contextual intelligence to your data so that every packet is informed by all of the packets that came before it. In this sense, your tech stack should be a little like a great storyteller: someone who knows what the customer needs and is feeling at any given moment, because it knows what’s happened up to this point and how it will affect customer decisions moving forward.
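
As a minimal sketch of that storyteller idea, assuming invented event names and an arbitrary window size, you can keep a small rolling context per customer in the stream itself, so each new event is interpreted in light of the ones before it.

```python
from collections import defaultdict, deque

WINDOW = 5  # arbitrary: how much recent history each decision may consult
context = defaultdict(lambda: deque(maxlen=WINDOW))

def on_event(customer_id, event):
    history = context[customer_id]
    # Decide using the story so far, then append the new chapter.
    if event == "cart-abandoned" and "price-checked" in history:
        decision = "offer-discount"      # hesitation pattern detected
    elif event == "support-ticket-opened":
        decision = "suppress-marketing"  # the wrong moment for an upsell
    else:
        decision = "no-action"
    history.append(event)
    return decision

for e in ["price-checked", "cart-abandoned"]:
    print(on_event("cust-42", e))  # -> no-action, then offer-discount
```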

Let’s start thinking of our customer experiences as stories and our tech stacks as the storytellers—or maybe, story generators. Maybe then our personalization efforts will become truly ‘hyper-personal’— i.e., relevant, in-the-moment experiences that are a source of delight instead of annoyance.

David Flower brings more than 28 years of experience within the IT industry to the role of CEO of Volt Active Data. Flower has a track record of building significant shareholder value across multiple software sectors on a global scale through the development and execution of focused strategic plans, organizational development and product leadership. This piece is exclusive to Broadband Breakfast.

Broadband Breakfast accepts commentary from informed observers of the broadband scene. Please send pieces to commentary@breakfast.media. The views expressed in Expert Opinion pieces do not necessarily reflect the views of Broadband Breakfast and Breakfast Media LLC.

Dave Wright: Shared Relocation Fund Will Make More of Finite Spectrum Resource

‘Wireless connectivity is one of the most vital aspects of our digital infrastructure.’

The author of this Expert Opinion is Dave Wright, president of the OnGo Alliance and head of global wireless policy at Hewlett Packard Enterprise.

In order to close the gaps in broadband connectivity that persist throughout the country, we must take a more comprehensive view of the necessity of all available spectrum – whether shared, licensed or unlicensed – understanding that these allocations are complementary and independently important to our nation’s future.

As we figure out how we will meet the needs of an increasingly wireless world, it is critical that we think collaboratively about how we can free up and share spectrum, working closely and cooperatively with the federal agencies responsible for our nation’s spectrum resources, the Federal Communications Commission and the National Telecommunications and Information Administration.

With recent confirmed leadership appointments in the NTIA and FCC, and renewed focus on collaboration and collegiality between these organizations, there is hope for renewed effectiveness in America’s overall management of our spectrum resources.

From a policy perspective, the OnGo Alliance is working to shed light on the incentives that inherently exist around the way spectrum is made available today. For terrestrial uses, there are two long established methods for making spectrum available – via a licensing process including an auction of the frequencies, or via an unlicensed allocation where spectrum is made available on a license-exempt basis.

Licensed bands have given rise to our cellular connectivity, while unlicensed spectrum has enabled innovations like the Wi-Fi and Bluetooth solutions that we know and depend upon today. The near ubiquitous presence of these technologies speaks to the efficacy of these approaches. The US 3.5 GHz Citizens Broadband Radio Service is the first spectrum access framework that combines aspects of licensed (protected access) and unlicensed (opportunistic access) spectrum within a single, dynamically managed access paradigm.
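
As a toy illustration of that combined paradigm (emphatically not the real Spectrum Access System algorithm, which also weighs geography, interference modeling, and incumbent sensing), the three CBRS tiers can be thought of as a simple priority check:

```python
# CBRS tiers, highest priority first: federal incumbents, Priority Access
# Licensees (protected, auctioned), General Authorized Access (opportunistic).
TIER_PRIORITY = {"incumbent": 0, "PAL": 1, "GAA": 2}

def may_transmit(requesting_tier, active_tiers):
    """Grant the channel only if no higher-priority tier occupies it."""
    rank = TIER_PRIORITY[requesting_tier]
    return all(TIER_PRIORITY[t] >= rank for t in active_tiers)

print(may_transmit("GAA", set()))    # True: opportunistic use of idle spectrum
print(may_transmit("GAA", {"PAL"}))  # False: a protected licensee is present
print(may_transmit("PAL", {"GAA"}))  # True: GAA users must yield to PAL
```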

Congress has increasingly been looking to licensed spectrum auctions as a source of revenue to cover the funding requirements for new programs. And federal users who occupy spectrum and then make it available for auction can take advantage of monies made available through the Spectrum Relocation Fund to cover the costs associated with transitioning their systems.

The SRF is in turn funded from the resulting auction revenues. These are examples of the current incentives in the system, which are either directly or indirectly tied to auction revenues from licensed spectrum. These incentives inherently bias the policymaking process toward licensed spectrum, at the expense of unlicensed and/or opportunistic spectrum like we have in the CBRS General Authorized Access tier.

This bias is not helpful in our quest to provide accessible broadband throughout the nation as unlicensed and GAA are key components in most solutions, from Wi-Fi as the “last meter” connection to a fixed broadband network to GAA’s prominent role in rural fixed wireless offerings.

CBRS is an optimal framework for putting mid-band spectrum to intensive use across a wide variety of applications. In only two years since the FCC approved CBRS commercial operations, over 225,000 CBRS base stations have been installed nationwide.

Collaboration between cloud players, system integrators, radio vendors and operators has reached critical mass, building a vibrant, self-sustaining ecosystem. CBRS has allowed enterprises and rural farms alike the opportunity to install private 4G and 5G networks that are connecting IoT devices – from factory robots to autonomous farm equipment. School districts, airports, military bases and logistics facilities, factories, hospitals, office buildings, and public libraries are but a few of the countless facilities where connectivity has been enabled by CBRS spectrum.

Wireless connectivity is one of the most vital aspects of our digital infrastructure, and we must use all of the available resources in order to make broadband as ubiquitous as any other utility. Our policymaking, and the incentives around it, must account for the fact that all types of spectrum are important – whether licensed, unlicensed or shared – and that it is vital to ensure that there are proper allocations of each type to meet the relentless demand. We must work together to make the most of what we have.

Dave Wright played an instrumental role in the formation of the OnGo Alliance (originally known as the CBRS Alliance), collaborating with other founding members to create a robust multi-stakeholder organization focused on the optimization of LTE and 5G services in the CBRS band. He served as the Alliance’s first Secretary from its launch in August 2016 and was elected as the President of the Alliance in February 2018. He advocates for unlicensed, licensed, and dynamic sharing frameworks – recognizing the vital role that all spectrum management regimes play in our increasingly wireless world. This piece is exclusive to Broadband Breakfast.

Broadband Breakfast accepts commentary from informed observers of the broadband scene. Please send pieces to commentary@breakfast.media. The views expressed in Expert Opinion pieces do not necessarily reflect the views of Broadband Breakfast and Breakfast Media LLC.

Optional Security Features for 5G Technology Pose Risks

The next generation wireless technology is being touted as the most secure yet.

Photo of Dan Elmore of the Idaho National Laboratory

WASHINGTON, July 28, 2022 – 5G technology can still present security concerns despite being touted as the most secure of the cellular generations, said Dan Elmore of the Idaho National Laboratory at a 5G Future event Thursday.

In response to the emerging challenge of validating 5G security protocols and data protection technologies, the Idaho National Laboratory established its Wireless Security Institute in 2019 to coordinate government, academic, and private industry research efforts to foster more secure and reliable 5G technology.

While the 5G standards offer a “rich suite” of security features, most of them are optional for manufacturers and developers to implement in their systems or devices, said Elmore, who is the director for critical infrastructure security at the INL. This poses a significant challenge for 5G, particularly for critical infrastructure applications, as consumers may not know how standards are implemented, Elmore said.

Elmore urged consumers, especially federal agencies, to ask the hard questions and consider “what vulnerabilities might be present in how they [manufacturers and developers] employ those standards that could be exploited.”

5G is designed to allow cellular devices to connect at higher speeds and with lower latency (the delay in loading requests) than previous generations. Already, wireless carriers are incorporating it into devices and working on national 5G networks.

Because of its facilitation of real-time monitoring, 5G technology is expected to help tackle critical issues like climate change and environmental sustainability.
