
Expert Opinion

Shouldn't FCC Rules Over Indecency Just Grow Up? Reflections on Free Speech and Converging Media



Commentary

Editor’s Note: This article of mine, “TV Has Grown Up. Shouldn’t FCC Rules?” first appeared in the Washington Post Outlook section on Sunday, May 16, 2004, nearly four and a half years ago. It remains more relevant today than ever: the Supreme Court is today considering Federal Communications Commission v. Fox Television Stations, a case about whether the FCC acted properly in sanctioning Fox over the use of the words “fuck” and “shit” on broadcast television. (November 4, 2008)

We Americans have always been on intimate terms with our televisions. They sit in our living rooms. They keep us company. They show us family values, from “I Love Lucy” to “All in the Family” to “The Cosby Show.” So it seems only natural that if our TV friends misbehaved by speaking foul language or showing too much skin, they would be in trouble — perhaps even grounded — very quickly.

Television and radio have always occupied a unique space in the nation’s public conversation, and politicians going back to at least Franklin Roosevelt and his “fireside chats” have understood the power of the electronic soapbox. Part of its influence came from an inherent limitation: The finite number of broadcast frequencies. That led the government to create the Federal Communications Commission, which regulated who could and couldn’t use the airwaves. The FCC also developed rules on what broadcasters couldn’t say.

But now our televisions and radios have grown up, and they have gotten married to all sorts of other electronic devices and technologies. These marriages are producing multimedia offspring that bear little or no resemblance to the bulky boxes of yesterday. This “convergence” of various technologies, as this trend is known in the industry, renders obsolete many of the rules that have governed broadcasting for decades.

It no longer makes any sense to impose one set of rules on the “over-the-air” networks while cable, Internet, satellite and music providers can send — almost unimpeded — all sorts of programming directly to your living room, car, laptop and even your cell phone.

Consider these three scenarios:

• A couple in Los Angeles — I’ll call them the TechnoYuppies — bought a 50-inch wide-screen plasma Gateway Media Center in March, just in time to watch the blood-soaked fifth season premiere of HBO’s “The Sopranos” in high-definition color and surround sound. With their $6,999 television-computer video player hooked up to Time Warner’s digital cable system, the family can also order many cable programs on demand — something their 3-year-old daughter likes because she doesn’t need to wait for the next “Scooby-Doo” on Cartoon Network. Mr. TechnoYuppie is particularly fond of the Media Center because it will allow him to access the Internet with a remote control and download episodes of “Fawlty Towers” (soon to be available on BBC’s Web site), which he will then be able to watch over the high-speed cable modem.

• Surfer Dude, a college student, used to tune into shock jock Howard Stern on a local radio station owned by Clear Channel Communications Inc., but when the FCC went after Stern for “indecency” a few months ago Clear Channel dropped Stern from its broadcasting stations. Surfer Dude hopes Stern will syndicate his show to satellite radio, where Stern can shock to his heart’s content. (In anticipation of such a deal, Surfer Dude recently installed a $260 Kenwood digital radio in his car and subscribed to the new Sirius satellite service.) Meanwhile, he spends drive time listening to Eminem’s uncut rap tracks on his Apple iPod, which he plays over his car radio with a $69 wireless transmitter.

• Mr. and Mrs. Protective Parents try to keep the influence of the media from their four children, ages 5 to 13. Unlike most Americans, this high-tech couple knows how to use the “V-chip” now included in all new televisions, and blocks all programs rated TV-14 or TV-MA. The family decided to enter the digital age in April, buying the RCA digital versatile disc player from Wal-Mart with parental controls by ClearPlay. The software is smart enough to skip over scenes of nudity or profanity in box-office hits such as “Terminator 3.”

Readers who think that these situations seem futuristic should realize that nothing is made up here except for the people. This plethora of viewing and listening choices demonstrates that the current debate over broadcast indecency standards is woefully out of touch with the realities of the digital world as we now know it — not to mention the world that is just over the horizon.

The TechnoYuppies are like the 88 percent of Americans who currently get cable or direct broadcast satellite service for more channels — and better reception — than they would get from a broadcast tower. But unlike the traditional television and radio signals that travel over designated broadcast frequencies, satellite and wireless are “free speech” airwaves — they aren’t subject to the indecency standards that the FCC cited in going after Stern.

This glaring inconsistency has some legislators in Washington scratching their heads and wondering: Why doesn’t everyone live by the same rules?

Some say that cable and satellite are different because consumers have to pay for them. But broadcast and pay channels sit side by side in the electronic programming guide. “The average consumer doesn’t distinguish over-the-air television from cable or satellite,” says Texas Republican Joe Barton, the new chairman of the House Energy and Commerce Committee. His vision? “If I can see it in my living room, and my grandson can click channels, the same rules of indecency apply.”

With the Senate about to debate a bill that would allow the FCC to boost its fines from $27,500 to $500,000, the answer to this question is vital and urgent. Whether you agree with Barton that all television and radio should be barred from transmitting what the government deems “indecent,” or whether you believe that all media should be free from such censorship, as I do, it seems clear that the current model has become unsustainable. That’s why the brief flash of Janet Jackson’s breast may be remembered decades from now not just as a silly show of bad taste, but as a defining moment in the country’s ongoing debate about free speech.

Congress began regulating broadcasters in 1927 on the grounds of scarcity. In return for free and exclusive use of a given wavelength, broadcasters agreed to serve the “public interest, convenience, and necessity” — or at least to do what Congress and the FCC ordered. One element of this agreement was a ban on obscene, indecent and profane language.

This scarcity theory has always lacked substance. Nobel Prize-winning economist Ronald Coase’s reputation is based, in part, on a notable paper he wrote in 1959 that criticized the rationale behind the FCC’s command and control regime of licensing broadcasters. “It is a commonplace of economics that almost all resources in the economic system (and not simply radio and television frequencies) are limited in amount and scarce, in that people would like to use more than exists,” Coase argued in his seminal essay.

But now technology has created new usable electromagnetic spectrum. Higher frequencies than those used by traditional radio and television systems have been pressed into service for digital cellular telephones, wireless data connections, and satellite television and radio. The XM and Sirius satellite radio companies each offer hundreds of channels with less spectrum than all FM radio broadcasters combined. And cellular carriers now pack thousands of conversations onto a channel that once served a single voice conversation.

Nonetheless, “scarcity” remains the foundation of a bifurcated jurisprudence. Newspapers, magazines, books and the Internet enjoy expansive First Amendment protections. Radio and broadcast television, defined as “public” properties, do not.

The Supreme Court accepted the scarcity theory in a 1943 case, when it upheld the FCC’s power to grant or deny privileges to electronic speakers. In 1969, the court went further, ruling in Red Lion v. FCC that scarcity required a Pennsylvania radio station to give free reply time to an author whose book was criticized over the air. Thus, the “fairness doctrine” was affirmed.

Then came the famous “seven dirty words” — comedian George Carlin’s 1973 satiric monologue about the seven words, as he put it, that “you couldn’t say on the public, ah, airwaves, um, the ones you definitely wouldn’t say, ever.” Except that the defiant and mischievous Carlin did say them on the radio — over and over and over again.

A father who heard the monologue in his car — with his young son along for the ride — complained to the FCC, which sanctioned the Pacifica station that carried Carlin’s monologue. In 1978, the Supreme Court said the monologue wasn’t obscene, but that it was “patently offensive.” The court ruled in FCC v. Pacifica that the pervasiveness of broadcasting, and its easy accessibility to children, justified the FCC’s authority to impose indecency limitations.

I don’t want my 4-year-old son to see crude or provocative shows when he turns on our television. I also don’t want him to see such material when he turns on our Internet-connected computer. Yet it would be impractical, as well as unconstitutional, for the government to set itself up as the censor of cable, satellite and Internet content. It makes much more sense for consumers to determine what comes into their homes.

The technology exists for us to be masters of convergence — whether it’s a V-chip, or Internet and movie filters, or a blocking device that keeps out cable and satellite channels that we don’t want to see. And, of course, it doesn’t require technology to talk with our kids about viewing standards.

Within the next decade it will be impossible to distinguish between televisions and computers. More cable, satellite and high-speed broadband means that it is only a matter of time before all Americans get television over the Internet — wirelessly or through a pipe of fiber or copper. It’s time to recognize that Congress and the FCC can no longer be the nation’s “public interest” nanny. Instead of trying to preserve rules from a world that no longer exists, they would do better to encourage the development of tools that will let us regulate ourselves.


Broadband Mapping & Data

Jeremy Jurick and Paul Schneid: Preparing Data for the FCC’s Broadband Filing

The new FCC requirements in the broadband data collection program are important to meet the nation’s connectivity goals.


The authors of this Expert Opinion are Jeremy Jurick (left) and Paul Schneid of Michael Baker International.

The recent emphasis on the expansion of broadband access across the country, coupled with the requirements of the Infrastructure Investment and Jobs Act and the Broadband Equity, Access and Deployment program, has prompted the Federal Communications Commission to review and update its data collection. Accurate data pinpointing where broadband service is – and is not – available is critically important. Broadband maps are used by Internet Service Providers and governments to identify locations that need service, as well as to decide how to fund broadband expansion.

The FCC recently established an important initiative, the Broadband Data Collection program, to ensure the collection of accurate, vital broadband availability data by implementing new requirements. Among other requirements of the BDC, ISPs must submit their serviceable location data and align that data with the FCC’s serviceable location fabric, which will require new methodologies from ISPs and mean additional hours and resources devoted to the task.

At Michael Baker International, our team is at the forefront of data collection and broadband expansion services. This article provides details on the requirements and filing process for ISPs.

Recognizing the challenges

The BDC filing process may be unfamiliar and challenging to some service providers because the program is new and its list of requirements is long. ISPs may also be delayed in processing and submitting their data, whether because they lack the resources or bandwidth to support these new tasks and responsibilities, or the experience to tackle this complex data collection and submittal process immediately and effectively. Given the extent of the data to be collected and submitted, and the technical elements and resources involved, proceeding may seem daunting. Sifting through newly published materials takes valuable time, and issues can arise before or after submittal from incomplete data or an inability to process the data into the standards the FCC recently specified for fabric comparison.

Getting started according to the timeline

To begin the BDC Filing process, ISPs should first become familiar with the timeline, federal regulations and data requirements surrounding the submission period.

Due for the first time on September 1, 2022, and semi-annually thereafter, specific data must be submitted by all facilities-based providers of fixed and mobile broadband internet access that had one or more end-user connections in service on June 30, 2022. Each filing follows the same schedule as the Form 477 filings (data as of June 30 due September 1, and data as of December 31 due March 1).

Fulfilling the prerequisites and the data requirements

As a prerequisite to filing data in the BDC portal, the FCC requires ISPs or government entities to first complete the registration process in the FCC’s Commission Registration System (CORES). Users will be assigned a 10-digit FCC Registration Number that the FCC will use for verification purposes. Filers are also required to show proof that they are an organization responsible for tracking broadband coverage: each filer must provide documentation from the highest-ranking executive within the company confirming that the organization tracks broadband data.

Each BDC filing must include detailed information about the filer, broadband availability data (including supporting data) and Form 477 broadband subscription data. In addition, specific requirements are mandated for various ISPs:

  • Fixed wireline and satellite broadband service providers: Submit either polygon shapefiles or a list of locations constituting the provider’s service area.
  • Fixed wireless broadband service providers: Submit either propagation maps and propagation model details or a list of locations constituting the provider’s service area.
  • Mobile wireless broadband service providers: Submit propagation maps and propagation model details for each network technology, as well as for both outdoor stationary and in-vehicle mobile network coverage. Additionally, these ISPs must submit data for their signal strength heat map.
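To make the data-preparation burden concrete, the sketch below serializes per-location availability records into CSV and rejects incomplete records before they reach a filing. The field names and technology code are hypothetical placeholders chosen for illustration; the actual BDC specification defines its own required columns and codes.

```python
import csv
import io

# Hypothetical field names for illustration only; the real BDC
# specification defines the required columns and technology codes.
FIELDS = ["location_id", "technology", "max_advertised_download_mbps",
          "max_advertised_upload_mbps", "business_residential_code"]

def build_availability_csv(records):
    """Serialize per-location availability records into a CSV string,
    raising ValueError for any record missing a required field."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=FIELDS)
    writer.writeheader()
    for rec in records:
        missing = [f for f in FIELDS if f not in rec]
        if missing:
            raise ValueError(f"record missing fields: {missing}")
        writer.writerow(rec)
    return buf.getvalue()

sample = [{
    "location_id": "1234567890",
    "technology": "50",  # placeholder code, not an official BDC value
    "max_advertised_download_mbps": "940",
    "max_advertised_upload_mbps": "880",
    "business_residential_code": "R",
}]
print(build_availability_csv(sample))
```

Validating records before submission is exactly the kind of step that catches incomplete data early, rather than after a rejected filing.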

Finalizing for submission

Finally, ISPs must gain access to the serviceable location fabric, format their data to the requirements for accurate comparison against the fabric, and identify the addresses that meet the requirements for serviceable areas. When the necessary data has been compiled and reviewed, the filing entity navigates to the BDC system and submits its data to the FCC. The FCC gives filers the option to submit data as a web-based file upload or through an Application Programming Interface.
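A simplified sketch of the address-matching step, assuming the fabric can be represented as a mapping from normalized address strings to location IDs. The normalization rules and abbreviation table here are illustrative assumptions; the real fabric format and matching methodology are defined by the FCC and its fabric contractor.

```python
import re

# Hypothetical abbreviation table for illustration only.
ABBREVIATIONS = {"STREET": "ST", "AVENUE": "AVE", "ROAD": "RD"}

def normalize(addr):
    """Uppercase, collapse whitespace, and apply common abbreviations
    so trivially different spellings of an address compare equal."""
    tokens = re.sub(r"\s+", " ", addr.strip().upper()).split(" ")
    return " ".join(ABBREVIATIONS.get(t, t) for t in tokens)

def match_to_fabric(provider_addresses, fabric):
    """Split a provider's address list into fabric-matched locations
    (address -> location_id) and addresses needing manual review."""
    matched, unmatched = {}, []
    for addr in provider_addresses:
        key = normalize(addr)
        if key in fabric:
            matched[addr] = fabric[key]
        else:
            unmatched.append(addr)
    return matched, unmatched

fabric = {"101 MAIN ST": "LOC-0001", "22 OAK AVE": "LOC-0002"}
matched, unmatched = match_to_fabric(["101 Main Street", "5 Elm Rd"], fabric)
```

In practice the unmatched list is where most of the labor goes: each leftover address must be researched, corrected, or challenged against the fabric.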

Partnering with a broadband expert

ISPs looking to save time and ensure accuracy throughout the submission process are well advised to partner with broadband experts who will ensure that all BDC requirements are met before any data is submitted. Michael Baker International has thoroughly researched the BDC requirements and created a streamlined solution: ISPs simply provide the initial information, and our team determines the appropriate data to be submitted and translates that data into the proper format. Once ISPs receive the prepared data, they need only create a login and upload the submission.

Today, there is increased focus on an existing but growing need to close gaps in the digital divide. The new FCC requirements in the BDC program are an important part of ensuring the nation’s connectivity goals are met by collecting accurate data that will be necessary to provide services where they are most needed.

Jeremy Jurick is Michael Baker’s National Broadband Services Director and oversees Michael Baker International’s broadband planning, mapping and program management initiatives. His broadband experience includes roadmap development, planning, data collection and analysis, stakeholder engagement, broadband provider engagement, branding, multimedia design, GIS services and software design. He has provided testimony during several government hearings to inform policymakers on broadband policy and expansion, including advocating for higher speed thresholds in the definition of broadband and for allowing government entities to be eligible subgrantees for broadband funding.

Paul Schneid is a program manager at Michael Baker with nearly a decade of experience in broadband wireless equipment operation, customer service, and process improvement. Most recently, Schneid interfaced with vendors and clients to manage all implementation project phases from inception to completion across a citywide wireless broadband expansion in New York City. This piece is exclusive to Broadband Breakfast.

Broadband Breakfast accepts commentary from informed observers of the broadband scene. Please send pieces to commentary@breakfast.media. The views expressed in Expert Opinion pieces do not necessarily reflect the views of Broadband Breakfast and Breakfast Media LLC.


Expert Opinion

Bryan Darr: An Order of Fiber, Please, with Wireless on the Side

Wireless is essential because for truly remote properties, a physical connection may never be practical.


The author of this Expert Opinion is Bryan Darr, vice president of Smart Communities at Ookla.

Over the next five to ten years we will see an explosion of projects bringing high-speed connectivity to underserved communities in the United States. Although fiber infrastructure rightly gets most of the attention and funding, wireless networks should also be part of this planning process. Wireless networks can deploy faster, serve remote locations more economically, and provide some capabilities that fixed networks can’t. Failure to consider the comprehensive needs of the mobile broadband environment will hobble efforts in the next phase of this technology revolution.

How we got here

As federal broadband infrastructure funding is ramping up, state broadband offices are preparing to prove their need for a larger slice of the pie. This is detailed in the $42.5 billion Broadband Equity, Access and Deployment Program, which is a part of the infrastructure bill (the Infrastructure Investment and Jobs Act) passed into law in the fall of 2021. Although every state is guaranteed $100 million, that leaves about $37 billion yet to be divided up.

Assuredly, this pie won’t be sliced into equal portions across states, tribal areas, and U.S. territories. Differences in population, geographic area, household density, and income levels will impact the funding eligibility of individual jurisdictions. Preparedness to verify underserved areas will ensure that state and local governments can maximize their chances of securing adequate funding. The first step is to identify these communities and estimate the cost of covering each household. With a desire to help as many people as possible, there will be a tendency to prioritize areas with the lowest cost per connection.

State governments have been focused primarily on fiber access. However, as big a pot of money as the IIJA may be, it won’t be big enough to connect every household to fiber. Continued supply chain issues, inflation, and labor shortages (particularly of needed expertise) will drive up the cost of projects in the coming years.

The race to compete for these billions of dollars has had a very uneven start. Some state broadband offices are fully staffed, have hired consultants, have obtained and collected network performance data, and have already launched mapping projects. Other states are just now funding their broadband offices and beginning to hire their first employees. States that cannot successfully challenge the mapping fabric (think: the number of serviceable addresses) and confidently identify unserved households will be disappointed with the size of their slice.

The recipe may require adjustment

Recently, Federal Communications Commission Chairwoman Jessica Rosenworcel called for the commission to reset its definition of broadband from 25 Mbps download speed and 3 Mbps upload speed to 100 down and 20 up. Many would agree that a reset is long overdue. The IIJA already requires that new infrastructure builds meet this standard. We should all recognize that this metric reset could make millions of additional households eligible for funding. Some policy organizations, including the Fiber Broadband Association, argue that those numbers are already dated and that the new target will not be enough for future needs such as the much-anticipated metaverse.
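The effect of the threshold reset can be made concrete with a minimal classifier based on the BEAD definitions, under which a location below 25/3 Mbps is unserved and one below 100/20 Mbps is underserved. Households between the old and new thresholds move from "served" to "underserved," and therefore into funding eligibility.

```python
def classify(down_mbps, up_mbps):
    """Classify a location per the BEAD speed thresholds:
    below 25/3 Mbps is unserved; below 100/20 Mbps is underserved."""
    if down_mbps < 25 or up_mbps < 3:
        return "unserved"
    if down_mbps < 100 or up_mbps < 20:
        return "underserved"
    return "served"

# A 50/5 connection met the old 25/3 definition but falls short of
# 100/20, so it now counts as underserved and becomes funding-eligible.
print(classify(50, 5))     # underserved
print(classify(10, 1))     # unserved
print(classify(200, 100))  # served
```

The middle case is precisely why the reset could add millions of households to the eligible pool.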

The specific benefits of wireless

Wireless connectivity can be broken down into three basic types of last-mile providers:

  1. Cellular service providers, offering traditional mobile and new fixed wireless access services
  2. Wireless internet service providers (WISPs), offering fixed point-to-point service
  3. Satellite companies (more on them later)

Wi-Fi is also wireless, but provides a final hop for only the last few feet of a network connection.

Wireless is essential because there is broad recognition that for truly remote properties, a physical connection may never be practical. As subsidies flow, that fact may be applicable to fewer locations, but there is certainly a point of diminishing return. As state and federal officials plan their networks to connect as many communities as they can, they should be factoring in where the wireless networks need bolstering as well. This is applicable for both mobile and WISP infrastructure.

Additional wireless investment could serve multiple needs. Poor wireless coverage is a common complaint even in densely populated areas. If you spend any significant time in rural areas, you know that there are locations where service is so spotty that the local population knows when to not risk initiating a call. Even if you get a signal, throughput can vary greatly. Just because you can receive a text in a particular location doesn’t mean you can download a video. These rural areas have weak wireless signals for the same reason that they lack good terrestrial broadband — the population density does not provide enough return on the investment.

Fiber is still a necessary ingredient

Today’s higher data usage demands the capacity that fiber provides. Mobile service providers are not going to build a new 5G tower without access to fiber backhaul. Sites that require long, dedicated fiber deployments can cost far more and put an unreasonable dent in the CapEx budget.

As new middle-mile networks are being designed, network planners should consider where wireless networks are weak and new towers are needed for improvement. Strategically adding splice points in poor service areas can significantly lower the barrier to attracting new wireless infrastructure. A lower cost of deployment will be a big incentive to wireless networks to bring improved service to rural communities.

We all depend on wireless services

Mobile connectivity has moved beyond a luxury and has become an expectation. Even if we could practically and affordably connect every house with fiber, there are many reasons to include wireless in your overall design plans.

  • Public safety – If you have ever had a flat tire or an overheated radiator, you know how important wireless coverage can be. Just try calling a rescue service with no bars. FirstNet wants to improve coverage as well, and incentivizing new towers can provide a big assist.
  • Precision agriculture – Fiber-to-the-home can connect the farm house, the barn, and even the chicken houses, but it won’t connect the tractor or the combine. Livestock now wear devices that monitor animal well-being. Wireless is the only way to keep the whole farm connected and competitive in a global marketplace.
  • Healthcare – Devices to monitor blood pressure, heart rate, glucose levels, and more are revolutionizing patient care. Many can now automatically notify a care facility when a patient is in distress. Mobile networks keep these devices connected if the patient’s residence lacks fixed broadband and when they are away from the home.
  • Economic development – Picking the best location for a new factory, business park, or neighborhood is about more than adequate roads and water resources. Good connectivity for both wireless and fixed telecom services has become a standard amenity for site selection.
  • 5G, part 1 – These new networks are quickly overlaying the 4G footprint. You don’t have to experience the lightning speeds of inner city millimeter wave service to see huge improvements in network performance. Wireless carriers are now introducing Fixed Wireless Access (FWA) to directly compete with traditional fixed providers. Competition means pressure in the market to keep services more affordable.
  • 5G, part 2 – Just over the horizon is the Rural 5G Fund, established by the FCC in October 2020. Over $9 billion will be made available to improve 5G coverage. However, the Competitive Carriers Association, which represents many rural mobile service providers, estimates the need at well over $30 billion. Without some advance planning and dialogue with the wireless providers in your state, you may see very little of those investments.

WISPs have brought first-time service to millions 

According to WISPA (the Wireless Internet Service Providers Association), over 2,800 WISPs are now serving more than seven million customers in portions of all 50 states, bringing internet to many rural households that had previously relied on aging satellite services. Although some subscribers are seeing median speeds below the current 25/3 broadband definition, new technologies are improving user experiences as equipment is modernized. Of course, better access to fiber is also needed to increase capacity and link to internet backbones.

All radio signals degrade with distance. Some of the largest WISPs cover sparsely populated regions, often with rugged terrain, making physical household connections particularly expensive to build. Commonly, customers who experience slower than advertised speeds are living at the practical edge of these coverage areas. Providing fiber to just a handful of locations can attract new towers that could substantially expand network services. This would also save much of the cost compared to direct-to-home routes and reduce the time needed for these subscribers to see significant improvements.
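The distance effect can be quantified with the standard free-space path-loss formula, a best-case lower bound that ignores the terrain and foliage losses that make real WISP links in rugged country fare worse. The 5.8 GHz figure below is simply one common unlicensed band used for fixed wireless, chosen for illustration.

```python
import math

def fspl_db(distance_km, freq_mhz):
    """Free-space path loss in dB:
    FSPL = 20*log10(d_km) + 20*log10(f_MHz) + 32.44"""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

# Every doubling of distance adds about 6 dB of loss, before any
# terrain or foliage effects are counted.
loss_1km = fspl_db(1, 5800)  # a common unlicensed fixed-wireless band
loss_8km = fspl_db(8, 5800)  # three doublings: roughly 18 dB worse
```

Subscribers at the practical edge of a coverage area are living on the wrong end of this curve, which is why a new tower closer to them, enabled by a nearby fiber splice point, can improve service so dramatically.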

The IIJA is written to be technology-neutral, but some broadband officials seem to be paying little attention to proven solutions that could have immediate impact. Even if the eventual goal is to offer direct-to-home fiber for everyone, we may go well beyond this decade without realizing that dream.

Aren’t satellites wireless, too?

Modern and improved satellite services are already fulfilling broadband needs for some households and businesses. Availability is limited to certain geographies but is expanding, and new competitors plan to enter the mix soon.

Throughput speeds and latency have improved dramatically, but waitlists are long, and initial equipment costs of more than $500 (that’s for do-it-yourself) and subscription fees of $100 or more per month will make this a difficult purchase decision for low-income households. There’s also limited capacity for any given geographic area, so even if there is satellite service available in your location, it may be that your neighbors have already maxed out the service and you will be waiting for additional capacity to be made available.

Without wireless, a broadband plan is just half-baked

We are many years away from realizing the full impact of the IIJA and the other recent funding sources that will deliver new fiber connections across the country. The FCC’s map is already delayed. There are early grumblings about uncertain challenge processes and many states are just now getting their planning efforts underway. The federal government has promised millions of Americans better broadband and they are expecting action soon, not in five to ten years.

Regulators and policymakers will ultimately be held accountable by voters and Congress for how the BEAD funds are spent. Two key metrics will matter most: the number of households gaining a new or improved connection and how quickly this progress is being made. Monitoring compliance will become more important as projects hit milestones and contractors get paid.

For some rural communities, wireless may be the best option right now and, perhaps, for the foreseeable future. Some households can already experience better service from their wireless provider than from DSL or satellite options. Reports are surfacing of DSL providers refusing to reconnect service to households where an interruption of service has occurred — whether for late payment or change of ownership — leaving families cut off from the digital economy.

Because satellite service is expensive and hard to acquire, wireless services are the only logical solution to get some rural households (particularly those in low-income brackets) connected before these communities wither past the point of no return. WISPs and mobile providers can fill some of this gap today and, if given the opportunity, will provide competitive options for families unhappy with their service. FWA from the traditional mobile operators is gaining public acceptance quickly in select markets and where signal levels are strong.

Think of anticipating wireless needs while planning fixed networks like an extension of a “dig once” policy. You don’t want to look back years from now and ask why wireless wasn’t considered in your planning process. Across the country, economic and community development departments spend millions of dollars every year to attract new citizens and businesses. Reliable mobile coverage is an amenity everyone – and every thing – wants.

Data from Ookla can highlight areas of need for both fixed and wireless networks. Leveraging coverage data to spotlight deficiencies can serve as an additional assessment in your middle-mile fiber planning, which can ultimately improve public safety, agricultural competitiveness, and overall quality of life.

Prepare for all the broadband needs ahead of you. It’s smart business. It’s smart government.

Bryan Darr is the Vice President of Smart Communities at Ookla. He coordinates Ookla’s outreach to local, state and federal governments and serves on CTIA’s Smart Cities Business & Technology Working Group. This piece is exclusive to Broadband Breakfast.

Broadband Breakfast accepts commentary from informed observers of the broadband scene. Please send pieces to commentary@breakfast.media. The views expressed in Expert Opinion pieces do not necessarily reflect the views of Broadband Breakfast and Breakfast Media LLC.


David Flower: 5G and Hyper-Personalization: Too Much of a Good Thing?

5G, IoT and edge computing are giving companies the opportunity to make hyper-personalization even more ‘hyper’.


The author of this Expert Opinion is David Flower, CEO of Volt Active Data

It’s very easy for personalization to backfire and subtract value instead of adding it.

Consider the troubling fact that we may be arriving at a moment in hyper-personalization’s journey where the most hyper-personalized offer is no offer at all. Nobody likes to be constantly bombarded by content, personalized or not.

And that’s the paradox of hyper-personalization: if everyone’s doing it, then, in a sense, nobody is.

5G and related technologies such as IoT and edge computing are giving companies the opportunity to make hyper-personalization even more “hyper” via broader bandwidths and the faster processing of higher volumes of data.

This means we’re at a very interesting inflection point: where do we stop? If the promise of 5G is more data, better data, and faster data, and the result is knowing our customers even better so we can bug them even more, albeit in a “personal” way, then when, where, and why do we say, “Hold on, maybe this is going too far”?

How do you do hyper-personalization well in a world where everyone else is doing it and where customers are becoming increasingly jaded about it and worried about how companies are using their data?

Let’s first look at what’s going wrong.

Hyper-personalization and bad data

Hyper-personalization is very easy to mess up, and when you do, it has the exact opposite of its intended effect: it drives customers away instead of keeping them.

Consider an online ad for a product that pops up on a website a couple of days after you already bought the thing being advertised. This is what I call “noise”. It’s simply a nuisance, and the company placing that ad, or rather, the data platform it’s using to generate the algorithms for the ads, should already know that the person has bought this item and hence present not a “repeat offer” but an upsell or cross-sell offer.

This sounds rudimentary in the year 2022 but it’s still all too common, and you’re probably nodding your head right now because you’ve experienced this issue.

Noise usually comes from what’s known as bad data, or dirty data. Whatever you want to call it—it pretty much ruins the customer experience.
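The fix implied above can be sketched as a simple check against purchase history before an offer is served. The function and data shapes below are illustrative only, not the API of any particular ad or data platform:

```python
# Sketch: suppress "repeat offers" by consulting purchase history first.
# All names and data structures here are hypothetical.

def choose_offer(user_purchases, product, upsells):
    """Return an offer to show, avoiding products the user already bought."""
    if product in user_purchases:
        # Already purchased: try a related upsell/cross-sell instead.
        related = [p for p in upsells.get(product, []) if p not in user_purchases]
        return related[0] if related else None  # no offer beats a repeat offer
    return product  # not yet purchased: the original ad is still relevant

purchases = {"espresso machine"}
upsells = {"espresso machine": ["milk frother", "coffee grinder"]}

print(choose_offer(purchases, "espresso machine", upsells))  # milk frother
print(choose_offer(purchases, "coffee grinder", upsells))    # coffee grinder
```

The point is less the lookup itself than where it runs: the purchase record has to be clean and available at decision time, or the platform is back to serving noise.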

Hyper-personalization and slow data

The second major issue is slow data: any data being used too slowly to be valuable, which usually includes data that has to make the trip to the data warehouse before it can be incorporated into any decisions.

Slow data is one of the main reasons edge computing was invented: to process data as close to where it’s ingested as possible, in order to use it before it loses any value.

Slow data produces not-so-fun customer experiences such as walking half a mile to your departure gate at the airport, only to find that the gate has been changed, and then, after you’ve walked the half mile back to where you came from, getting a text message on your phone from the airline saying your gate has been changed.

Again, whatever you want to call it—latency, slow data, annoying—the end result is a bad customer experience.

How to fix the hyper-personalization paradox

I have no doubt that the people who invented hyper-personalization had great intentions: make things as personal as possible so that your customers pay attention, stay happy, and stay loyal.

And for a lot of companies, for a long time, it worked. Then came the data deluge. And the regulations. And the jaded customers. We’re now at a stage where we need to rethink how we do personalization because the old ways are no longer effective.

It’s easy—and correct—to blame legacy technology for all of this. But the solution goes deeper than just ripping and replacing. Companies need to think holistically about all sides of their tech stacks to figure out the simplest way to get as much data as possible from A to B.

The faster you can process your data the better. But it’s not all just about speed. You also need to apply quick contextual intelligence to your data so that every packet is informed by all of the packets that came before it. In this sense, your tech stack should be a little like a great storyteller: it knows what the customer needs and is feeling at any given moment, because it knows what’s happened up to this point and how that will affect customer decisions moving forward.
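That “storyteller” quality is, in effect, stateful event processing: each new event is judged against the context accumulated from earlier ones, so stale messages (like the airport gate-change text above) get suppressed. A minimal sketch, with hypothetical class and event names that are not any vendor’s actual API:

```python
# Sketch of stateful, in-context event processing: each new event is
# evaluated against everything seen so far, not in isolation.

class CustomerContext:
    def __init__(self):
        self.events = []  # running history for one customer

    def decide(self, event):
        """Record the event and decide whether a notification is still useful."""
        self.events.append(event)
        if event["type"] == "gate_change":
            # Only alert if the passenger hasn't already reached the new gate.
            arrived = {"type": "arrived_at_gate", "gate": event["new_gate"]}
            if arrived in self.events:
                return None  # stale: suppress the redundant alert
            return f"Gate changed to {event['new_gate']}"
        return None  # other events just update context

ctx = CustomerContext()
ctx.decide({"type": "arrived_at_gate", "gate": "B12"})
print(ctx.decide({"type": "gate_change", "new_gate": "C4"}))  # Gate changed to C4
```

In production this state would live at the edge or in an in-memory store rather than a Python list, which is exactly the latency argument made above.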

Let’s start thinking of our customer experiences as stories and our tech stacks as the storytellers, or maybe, story generators. Maybe then our personalization efforts will become truly “hyper-personal”: relevant, in-the-moment experiences that are a source of delight instead of annoyance.

David Flower brings more than 28 years of experience within the IT industry to the role of CEO of Volt Active Data. Flower has a track record of building significant shareholder value across multiple software sectors on a global scale through the development and execution of focused strategic plans, organizational development and product leadership. This piece is exclusive to Broadband Breakfast.

