Expert Opinion

Yoni Mazor: Three Amazon and Supply Chain Predictions for 2022

The omicron variant could spell trouble for the supply chain in 2022.

The author of this Expert Opinion is Yoni Mazor, chief growth officer of GETIDA.

With a hectic 2021 over, it's a good opportunity to explore three Amazon and supply chain predictions for 2022.

Heading into 2022, the world is also entering its third year of recovering from COVID-19 and its effects. At the beginning of 2020, the world was shaken by the eruption of the pandemic and its spread from Asia across the globe. The pandemic's challenges for the global economy have been significant; here are three predictions for how things might look in 2022.

1. Global supply chain

The global economy will probably continue to struggle due to the challenges of constant interruptions in the global supply chain. The new omicron variant, which has proved highly contagious and is spreading at record-breaking speed, has already placed numerous countries around the world under travel, work and movement restrictions.

The limitations that omicron has imposed on these countries will add another layer of complexity to the interruptions in the global supply chain during the first quarter, compounded by the strain that omicron will place on the workforce itself.

The costs of shipping inventory and supplies around the world rose sharply during 2021 and are currently cooling off a bit from the surge. Until the appearance of the new omicron variant, costs were expected to continue cooling during the year at a moderate pace. However, such predictions are volatile, as omicron is causing the same types of interruptions and price spikes that brought the global supply chain to this point in the first place.

Some of the main strains on the global supply chain that are expected to continue into 2022 are semiconductor supply shortages, shortages in container shipping, and shortages in professional labor for transportation carriers and at seaports. The rising costs of transportation, labor, and energy are challenging the global supply chain while also impacting financial institutions and governments all over the world. The reason: rising costs are another way to describe the next point of our predictions, inflation.

2. Inflation

Most of the current generation in the United States is not familiar with the meaning and challenges of inflation. The last era of significant inflation was in the early 1980s, when Ronald Reagan was president. Many economists describe inflation as a wild beast that is very hard to tame, capture and return to its cage once it breaks loose. Another way to describe inflation is as a pendulum that swings and raises costs in one direction, then raises costs in another direction, unexpectedly and disruptively, and on and on it swings.

The Federal Reserve has kept a low-interest-rate environment for the past decade. Usually, during inflationary periods, as prices of everything rise, the Fed is expected to raise interest rates to help people earn more interest on their savings and protect the purchasing power of most households. Nevertheless, inflation during 2021 already crossed the 6% mark, roughly three times the 2% annual target the Fed usually aims for. Despite that, the Fed has kept interest rates low, and by doing so has yet to apply this key tool for combating inflation.

Many economists, and the Fed itself, face a challenge in distinguishing between real inflation in the economy and transitory inflation driven by the effects of the pandemic and the global supply chain challenges. This might explain why the Fed has focused on keeping a low-interest-rate environment: it is more concerned with battling the pandemic and global supply chain strains than with real inflation striking the economy.

It is not clear how long the Fed will be able to hold its current position if real inflation keeps its momentum and does not slow down. If the global supply chain challenges and their inflationary triggers appear to be cooling off while real inflation is still causing havoc, we can expect the Fed to begin raising interest rates. That might happen during the first quarter of the year, or it might stretch into the second or third quarter if omicron places further significant strain on the U.S. economy.

3. Amazon

When the global pandemic broke out in early 2020, it benefited the e-commerce industry and Amazon, the industry juggernaut. It accelerated U.S. consumers' adoption of online shopping by several years, as consumers stranded at home could shop for the products they needed only online. During 2021, Amazon's revenue continued to grow at about 18% year over year, though not as dramatically as the 37% rate of 2020.

As the largest online marketplace in the U.S., Amazon very much reflects the U.S. economy. It is likewise heavily affected by global supply chain disruptions and inflationary pressures. If such challenges continue to affect Amazon's marketplace and its stakeholders, 2022 might prove to be the most challenging year yet for Amazon. On top of that, it will be the company's first full year without its founder, Jeff Bezos, as CEO; Andy Jassy took over the role on July 5, 2021.

Amazon will be facing challenges in the upcoming years from a few main friction points. The first is the U.S. government cracking down on Amazon's perceived marketplace dominance. The U.S. government will continue to challenge Amazon to ensure that the company's power is neither abusive nor destructive to the economy.

The global supply chain interruptions have challenged Amazon's sourcing capabilities, as well as those of many of its third-party sellers, during 2021. They have all struggled to keep their products in stock on the platform. These supply constraints limit the depth and variety of products that most consumers associate with Amazon's platform. If this trend continues into 2022, it could drive consumers to look for alternatives in other marketplaces. One thing is clear about this prediction: third-party Amazon sellers will have to learn the art of Amazon business negotiation to keep their inventory levels in good shape and their cost structures in check.

Another friction point is how inflation is affecting the competitiveness of the products offered on Amazon. It is important to remember that about 60% of Amazon's marketplace revenue comes from third-party sellers. Most of these third-party sellers are neither familiar with inflation nor equipped to battle it. If they raise their prices on the platform during 2022 to adjust for cost inflation, and those prices become too expensive compared with traditional, established retailers, Amazon's ability to stay competitive and maintain its growth momentum will suffer.

Signs of weakness and volatility

The global economy is a marvelous and complex system that connects dots and lines in many unexpected ways. In the past few decades, this system has provided great prosperity to many countries. However, its complexity during a global pandemic is showing signs of weakness and volatility. By examining the status of the global supply chain, inflation and Amazon over the past year, we can see how they are all interconnected and affect one another in various ways.

This interconnectivity will determine much of where things are heading for us all during 2022. This is not an attempt to predict the future, but an attempt to examine past events and their effects, and to assess where it all might be going next.

Yoni Mazor is the chief growth officer and co-founder of GETIDA. He began developing GETIDA after successfully operating a $20 million yearly Amazon business, selling fashion brands internationally. GETIDA specializes in Amazon discrepancy analytics and consulting. By utilizing data visibility technology, GETIDA focuses on discovering and managing financial and inventory-related discrepancies with billions of dollars of transactions managed daily. He previously served in special Navy intelligence. This Expert Opinion is exclusive to Broadband Breakfast.

Broadband Breakfast accepts commentary from informed observers of the broadband scene. Please send pieces to commentary@breakfast.media. The views reflected in Expert Opinion pieces do not necessarily reflect the views of Broadband Breakfast and Breakfast Media LLC.

Broadband Mapping & Data

Jeremy Jurick and Paul Schneid: Preparing Data for the FCC’s Broadband Filing

The new FCC requirements in the broadband data collection program are important to meet the nation’s connectivity goals.

The authors of this Expert Opinion are Jeremy Jurick and Paul Schneid of Michael Baker International.

The recent emphasis on the expansion of broadband access across the country, coupled with the requirements of the Infrastructure Investment and Jobs Act and the Broadband Equity, Access and Deployment program, has prompted the Federal Communications Commission to review and update its collection of data. Accurate data pinpointing where broadband service is – and is not – available is critically important. Broadband maps are used by Internet Service Providers and governments to identify locations that need service, as well as how to fund broadband expansion.

The FCC recently established an important initiative, the Broadband Data Collection program, implementing new requirements to ensure the collection of accurate, vital broadband availability data. Among other requirements of the BDC, ISPs must submit their serviceable location data and align it with the FCC's serviceable location fabric. That will require new methodologies from ISPs, and with them additional hours and resources.

At Michael Baker International, our team is at the forefront of data collection and broadband expansion services. This article provides details on the requirements and filing process for ISPs.

Recognizing the challenges

The BDC filing process may be unfamiliar and challenging to some service providers due to the novelty of the program and the list of requirements it encompasses. Moreover, ISPs may be delayed in processing and submitting their data, whether because they lack the resources or bandwidth to support these new tasks or the experience to tackle this complex data collection and submittal process immediately and effectively. Given the extent of the data to be collected and submitted, and the technical elements and resources involved, proceeding may seem daunting. Sifting through newly published materials takes valuable time, and issues can arise before or after submittal from incomplete data or an inability to process the data into the appropriate standards the FCC recently specified for fabric comparison.

Getting started according to the timeline

To begin the BDC Filing process, ISPs should first become familiar with the timeline, federal regulations and data requirements surrounding the submission period.

Due to be submitted for the first time on September 1, 2022, and semi-annually going forward, specific data must be provided by all facilities-based providers of fixed and mobile broadband internet access who had one or more end user connections in service on June 30, 2022. Each filing will be based on the same schedule as the Form 477 filings (June 30th through September 1st and December 31st through March 1st).

Fulfilling the prerequisites and the data requirements

As a prerequisite to filing data in the BDC portal, the FCC requires ISPs or government entities to first complete the registration process within the FCC's Commission Registration System (CORES). Users will be assigned a 10-digit FCC Registration Number that will be used for verification purposes by the FCC. Filers are also required by the FCC to show proof that they are indeed an organization responsible for tracking broadband coverage: each filer must provide documentation from the highest-ranking executive within their company confirming that the organization tracks broadband data.

Each BDC filing must include detailed information about the filer, broadband availability data (including supporting data) and Form 477 broadband subscription data. In addition, specific requirements are mandated for various ISPs:

  • Fixed wireline and satellite broadband service providers: Submit either polygon shapefiles (see the sketch after this list) or a list of locations constituting the provider’s service area.
  • Fixed wireless broadband service providers: Submit either propagation maps and propagation model details or a list of locations constituting the provider’s service area.
  • Mobile wireless broadband service providers: Submit propagation maps and propagation model details for each network technology, as well as for both outdoor stationary and in-vehicle mobile network coverage. Additionally, these ISPs must submit data for their signal strength heat map.
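
For providers choosing the polygon route, preparing the submission largely comes down to filtering coverage geometries to the advertised service tier, dissolving them into a single service-area layer and exporting a shapefile. The sketch below is a minimal illustration of that workflow using the open-source geopandas library; the file names, column names and speed tier are invented placeholders, not part of the FCC's specification.

```python
# Minimal sketch: build a service-area shapefile for a BDC-style filing.
# File and column names are hypothetical placeholders, not FCC-specified.
import geopandas as gpd

# Load raw coverage polygons (one row per covered area per technology).
coverage = gpd.read_file("raw_coverage.geojson")

# Keep only areas meeting the advertised tier we intend to file.
served = coverage[(coverage["down_mbps"] >= 100) & (coverage["up_mbps"] >= 20)]

# Merge overlapping polygons into one geometry per technology code.
service_area = served.dissolve(by="tech_code").reset_index()

# Ensure a standard coordinate reference system before export.
service_area.to_crs(epsg=4326).to_file("service_area.shp")
```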

Finalizing for submission

Finally, ISPs must gain access to the serviceable location fabric, format their data to the requirements for accurate comparison against the fabric and identify the addresses that meet the requirements of serviceable areas. When the necessary data has been compiled and reviewed, the filing entity must navigate to the BDC system and submit its data to the FCC. The FCC gives filers the option to submit data as a web-based file upload or, alternatively, through an Application Programming Interface.
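
For filers that automate this step, the API route typically reduces to an authenticated HTTP upload of the prepared file. The snippet below is a hypothetical sketch of such a submission using Python's requests library; the endpoint URL, token and field names are placeholders, since the real interface is defined in the FCC's BDC documentation.

```python
# Hypothetical sketch of an automated availability-data upload.
# The URL, auth header and form fields are illustrative placeholders;
# consult the FCC's BDC API documentation for the actual interface.
import requests

API_URL = "https://example.invalid/bdc/availability"  # placeholder endpoint
API_TOKEN = "your-api-token"  # assumed to be issued to registered filers

with open("service_area.zip", "rb") as f:  # zipped shapefile from earlier
    response = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        files={"file": ("service_area.zip", f, "application/zip")},
        data={"frn": "0012345678", "filing_period": "2022-06-30"},
        timeout=120,
    )

response.raise_for_status()  # fail loudly on a rejected submission
```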

Partnering with a broadband expert

ISPs looking to save time and ensure accuracy throughout the submission process should consider partnering with broadband experts who will verify that all BDC requirements are met before any data is submitted. Michael Baker International has thoroughly researched the BDC requirements and created a streamlined solution. ISPs simply provide the initial information, and our team then determines the appropriate data to be submitted, along with our translation of that data into the proper format. Once ISPs receive the prepared data, they need only create a login and upload the submission.

Today, there is increased focus on the growing need to close the digital divide. The new FCC requirements in the BDC program are an important part of ensuring the nation's connectivity goals are met by collecting the accurate data necessary to provide services where they are most needed.

Jeremy Jurick is Michael Baker's National Broadband Services Director and oversees Michael Baker International's broadband planning, mapping and program management initiatives. His broadband experience includes roadmap development, planning, data collection and analysis, stakeholder engagement, broadband provider engagement, branding, multimedia design, GIS services, and software design. He has also provided testimony during several government hearings to inform policymakers on broadband policy and expansion, including advocating for high-speed thresholds for the definition of broadband and for allowing government entities to be eligible subgrantees for broadband funding.

Paul Schneid is a program manager at Michael Baker with nearly a decade of experience in broadband wireless equipment operation, customer service, and process improvement. Most recently, Schneid interfaced with vendors and clients to manage all implementation project phases from inception to completion across a citywide wireless broadband expansion in New York City. This piece is exclusive to Broadband Breakfast.

Broadband Breakfast accepts commentary from informed observers of the broadband scene. Please send pieces to commentary@breakfast.media. The views expressed in Expert Opinion pieces do not necessarily reflect the views of Broadband Breakfast and Breakfast Media LLC.

Expert Opinion

Bryan Darr: An Order of Fiber, Please, with Wireless on the Side

Wireless is essential because for truly remote properties, a physical connection may never be practical.

The author of this Expert Opinion is Bryan Darr, vice president of Smart Communities at Ookla.

Over the next five to ten years we will see an explosion of projects bringing high-speed connectivity to underserved communities in the United States. Although fiber infrastructure rightly gets most of the attention and funding, wireless networks should also be part of this planning process. Wireless networks can deploy faster, serve remote locations more economically, and provide some capabilities that fixed networks can’t. Failure to consider the comprehensive needs of the mobile broadband environment will hobble efforts in the next phase of this technology revolution.

How we got here

As federal broadband infrastructure funding is ramping up, state broadband offices are preparing to prove their need for a larger slice of the pie. This is detailed in the $42.5 billion Broadband Equity, Access and Deployment Program, which is a part of the infrastructure bill (the Infrastructure Investment and Jobs Act) passed into law in the fall of 2021. Although every state is guaranteed $100 million, that leaves about $37 billion yet to be divided up.

Assuredly, this pie won’t be sliced into equal portions across states, tribal areas, and U.S. territories. Differences in population, geographic area, household density, and income levels will impact the funding eligibility of individual jurisdictions. Preparedness to verify underserved areas will ensure that state and local governments can maximize their chances of securing adequate funding. The first step is to identify these communities and estimate the cost of covering each household. With a desire to help as many people as possible, there will be a tendency to prioritize areas with the lowest cost per connection.
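
That prioritization is simple arithmetic once per-area estimates exist: divide estimated build cost by the number of unserved households and sort. A toy sketch in Python (all figures invented):

```python
# Toy illustration: rank candidate areas by estimated cost per
# unserved household connection. All figures are invented examples.
areas = [
    {"name": "County A", "build_cost": 4_200_000, "unserved": 1_400},
    {"name": "County B", "build_cost": 9_000_000, "unserved": 1_200},
    {"name": "County C", "build_cost": 2_500_000, "unserved": 250},
]

for area in areas:
    area["cost_per_connection"] = area["build_cost"] / area["unserved"]

# Lowest cost per connection first -- the tendency described above.
for area in sorted(areas, key=lambda a: a["cost_per_connection"]):
    print(f"{area['name']}: ${area['cost_per_connection']:,.0f} per household")
```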

State governments have been focused primarily on fiber access. However, as big a pot of money as the IIJA may be, it won't be big enough to connect every household to fiber. Continued supply chain issues, inflation, and labor shortages (particularly of needed expertise) will drive up the cost of projects in the coming years.

The race to compete for these billions of dollars has had a very uneven start. Some state broadband offices are fully staffed, have hired consultants, have obtained and collected network performance data, and already have mapping projects launched. Other states are just now funding their broadband offices and beginning to hire their first employees. States that cannot both successfully challenge the mapping fabric (think number of service addresses) and confidently identify unserved households will be disappointed with the size of their slice.

The recipe may require adjustment

Recently, Federal Communications Commission Chairwoman Jessica Rosenworcel called for the commission to reset its definition of broadband from 25 Mbps download speed and 3 Mbps upload speed to 100 down and 20 up. Many would agree that a reset is long overdue. The IIJA legislation already requires that new infrastructure builds meet these criteria. We should all recognize that this metric reset could make millions of additional households eligible for funding. Some policy organizations, including the Fiber Broadband Association, are voicing their opinions that those numbers are already dated and that the new target will not be enough for future needs such as the much-anticipated metaverse.
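
The effect of that reset is easy to see in miniature: any household whose service falls between 25/3 and 100/20 flips from served to underserved the moment the definition changes. A quick sketch (speeds invented):

```python
# Hypothetical illustration of how raising the broadband definition
# from 25/3 to 100/20 Mbps reclassifies households. Speeds are invented.
households = [
    ("rural ranch", 12, 1),       # (label, down Mbps, up Mbps)
    ("edge-of-town DSL", 40, 5),  # served under 25/3, not under 100/20
    ("cable suburb", 300, 25),
]

def served(down, up, def_down, def_up):
    return down >= def_down and up >= def_up

for label, down, up in households:
    print(f"{label}: 25/3 -> {served(down, up, 25, 3)}, "
          f"100/20 -> {served(down, up, 100, 20)}")
```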

The specific benefits of wireless

Wireless connectivity can be broken down into three basic types of last-mile providers:

  1. Cellular service providers, offering traditional mobile and new fixed wireless access services
  2. Wireless internet service providers (WISPs), offering fixed point-to-point service
  3. Satellite companies (more on them later)

Wi-Fi is also wireless, but provides a final hop for only the last few feet of a network connection.

Wireless is essential because there is broad recognition that for truly remote properties, a physical connection may never be practical. As subsidies flow, that may hold for fewer locations, but there is certainly a point of diminishing returns. As state and federal officials plan their networks to connect as many communities as they can, they should also be factoring in where the wireless networks need bolstering. This applies to both mobile and WISP infrastructure.

Additional wireless investment could serve multiple needs. Poor wireless coverage is a common complaint even in densely populated areas. If you spend any significant time in rural areas, you know that there are locations where service is so spotty that the local population knows when to not risk initiating a call. Even if you get a signal, throughput can vary greatly. Just because you can receive a text in a particular location doesn’t mean you can download a video. These rural areas have weak wireless signals for the same reason that they lack good terrestrial broadband — the population density does not provide enough return on the investment.

Fiber is still a necessary ingredient

Today’s higher data usage demands the capacity that fiber provides. Mobile service providers are not going to build a new 5G tower without access to fiber backhaul. Sites that require long, dedicated fiber deployments can cost far more and lead to an unreasonable dent in the CapEx budget.

As new middle-mile networks are being designed, network planners should consider where wireless networks are weak and new towers are needed for improvement. Strategically adding splice points in poor service areas can significantly lower the barrier to attracting new wireless infrastructure. A lower cost of deployment will be a big incentive to wireless networks to bring improved service to rural communities.

We all depend on wireless services

Mobile connectivity has moved beyond a luxury and has become an expectation. Even if we could practically and affordably connect every house with fiber, there are many reasons to include wireless in your overall design plans.

  • Public safety – If you have ever had a flat tire or an overheated radiator, you know how important wireless coverage can be. Just try calling a rescue service with no bars. FirstNet wants to improve coverage as well, and incentivizing new towers can provide a big assist.
  • Precision agriculture – Fiber-to-the-home can connect the farm house, the barn, and even the chicken houses, but it won’t connect the tractor or the combine. Livestock now wear devices that monitor animal well-being. Wireless is the only way to keep the whole farm connected and competitive in a global marketplace.
  • Healthcare – Devices to monitor blood pressure, heart rate, glucose levels, and more are revolutionizing patient care. Many can now automatically notify a care facility when a patient is in distress. Mobile networks keep these devices connected if the patient’s residence lacks fixed broadband and when they are away from the home.
  • Economic development – Picking the best location for a new factory, business park, or neighborhood is about more than adequate roads and water resources. Good connectivity for both wireless and fixed telecom services has become a standard amenity for site selection.
  • 5G, part 1 – These new networks are quickly overlaying the 4G footprint. You don’t have to experience the lightning speeds of inner city millimeter wave service to see huge improvements in network performance. Wireless carriers are now introducing Fixed Wireless Access (FWA) to directly compete with traditional fixed providers. Competition means pressure in the market to keep services more affordable.
  • 5G, part 2 – Just over the horizon is the Rural 5G Fund, established by the FCC in October 2020. More than $9 billion will be made available to improve 5G coverage. However, the Competitive Carriers Association, which represents many rural mobile service providers, estimates the need at well over $30 billion. Without some advance planning and dialogue with the wireless providers in your state, you may see very little of those investments.

WISPs have brought first-time service to millions 

According to WISPA (the Wireless Internet Service Providers Association), over 2,800 WISPs are now serving more than seven million customers in portions of all 50 states, bringing internet to many rural households that had previously relied on aging satellite services. Although some subscribers are seeing median speeds below the current 25/3 broadband definition, new technologies are improving user experiences as equipment is modernized. Of course, better access to fiber is also needed to increase capacity and link to internet backbones.

All radio signals degrade with distance. Some of the largest WISPs cover sparsely populated regions, often with rugged terrain, making physical household connections particularly expensive to build. Commonly, customers who experience slower than advertised speeds are living at the practical edge of these coverage areas. Providing fiber to just a handful of locations can attract new towers that could substantially expand network services. This would also save much of the cost compared to direct-to-home routes and reduce the time needed for these subscribers to see significant improvements.
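
That degradation is quantifiable. Even in the best case of free space, path loss grows with the logarithm of both distance and frequency: FSPL(dB) = 20*log10(d_km) + 20*log10(f_MHz) + 32.44, and real terrain loses considerably more. A quick sketch of the textbook formula (the frequency and distances are arbitrary examples):

```python
# Free-space path loss: signals lose roughly 6 dB per doubling of
# distance even with nothing in the way. Figures are arbitrary examples.
from math import log10

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    return 20 * log10(distance_km) + 20 * log10(freq_mhz) + 32.44

for d in (1, 2, 5, 10, 20):  # kilometers from the tower
    print(f"{d:>2} km @ 3500 MHz: {fspl_db(d, 3500):.1f} dB of path loss")
```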

The IIJA is written to be technology-neutral, but some broadband officials seem to be paying little attention to proven solutions that could have immediate impact. Even if the eventual goal is to offer direct-to-home fiber for everyone, we may go well beyond this decade without realizing that dream.

Aren’t satellites wireless, too?

Modern and improved satellite services are already fulfilling broadband needs for some households and businesses. Availability is limited to certain geographies but is expanding, and new competitors plan to enter the mix soon.

Throughput speeds and latency have improved dramatically, but waitlists are long, and initial equipment costs of more than $500 (that’s for do-it-yourself) and subscription fees of $100 or more per month will make this a difficult purchase decision for low-income households. There’s also limited capacity for any given geographic area, so even if there is satellite service available in your location, it may be that your neighbors have already maxed out the service and you will be waiting for additional capacity to be made available.

Without wireless, a broadband plan is just half-baked

We are many years away from realizing the full impact of the IIJA and the other recent funding sources that will deliver new fiber connections across the country. The FCC’s map is already delayed. There are early grumblings about uncertain challenge processes and many states are just now getting their planning efforts underway. The federal government has promised millions of Americans better broadband and they are expecting action soon, not in five to ten years.

Regulators and policymakers will ultimately be held accountable by voters and Congress for how the BEAD funds are spent. Two key metrics will matter most: the number of households gaining a new or improved connection and how quickly this progress is being made. Monitoring compliance will become more important as projects hit milestones and contractors get paid.

For some rural communities, wireless may be the best option right now and, perhaps, for the foreseeable future. Some households can already experience better service from their wireless provider than from DSL or satellite options. Reports are surfacing of DSL providers refusing to reconnect service to households where an interruption of service has occurred — whether for late payment or change of ownership — leaving families cut off from the digital economy.

Because satellite service is expensive and hard to acquire, wireless services are the only logical solution to get some rural households (particularly those in low-income brackets) connected before these communities wither past the point of no return. WISPs and mobile providers can fill some of this gap today and, if given the opportunity, will provide competitive options for families unhappy with their service. FWA from the traditional mobile operators is gaining public acceptance quickly in select markets and where signal levels are strong.

Think of anticipating wireless needs while planning fixed networks as an extension of a "dig once" policy. You don't want to look back years from now and ask why wireless wasn't considered in your planning process. Across the country, economic and community development departments spend millions of dollars every year to attract new citizens and businesses. Reliable mobile coverage is an amenity everyone – and every thing – wants.

Data from Ookla can highlight areas of need for both fixed and wireless networks. Leveraging coverage data to spotlight deficiencies can serve as an additional assessment in your middle-mile fiber planning, which can ultimately improve public safety, agricultural competitiveness, and overall quality of life.

Prepare for all the broadband needs ahead of you. It’s smart business. It’s smart government.

Bryan Darr is the Vice President of Smart Communities at Ookla. He coordinates Ookla’s outreach to local, state and federal governments and serves on CTIA’s Smart Cities Business & Technology Working Group. This piece is exclusive to Broadband Breakfast.

Broadband Breakfast accepts commentary from informed observers of the broadband scene. Please send pieces to commentary@breakfast.media. The views expressed in Expert Opinion pieces do not necessarily reflect the views of Broadband Breakfast and Breakfast Media LLC.

5G

David Flower: 5G and Hyper-Personalization: Too Much of a Good Thing?

5G, IoT and edge computing are giving companies the opportunity to make hyper-personalization even more ‘hyper’.

The author of this Expert Opinion is David Flower, CEO of Volt Active Data.

It’s very easy for personalization to backfire and subtract value instead of adding it.

Consider the troubling fact that we may be arriving at a moment in hyper-personalization’s journey where the most hyper-personalized offer is no offer at all. Nobody likes to be constantly bombarded by content, personalized or not.

And that’s the paradox of hyper-personalization: if everyone’s doing it, then, in a sense, nobody is.

5G and related technologies such as IoT and edge computing are giving companies the opportunity to make hyper-personalization even more “hyper” via broader bandwidths and the faster processing of higher volumes of data.

This means we’re at a very interesting inflection point: where do we stop? If the promise of 5G is more data, better data, and faster data, and the result is knowing our customers even better so we can bug them even more, albeit in a “personal” way, then when, where, and why do we say, “hold on, maybe this is going too far”?

How do you do hyper-personalization well in a world where everyone else is doing it and where customers are becoming increasingly jaded about it and worried about how companies are using their data?

Let’s first look at what’s going wrong.

Hyper-personalization and bad data

Hyper-personalization is very easy to mess up, and when you do mess it up it has the exact opposite of its intended effect: it drives customers away instead of keeping them there.

Consider an online ad for a product that pops up for you on a website a couple of days after you already bought the thing being advertised. This is what I call “noise”. It’s simply a nuisance, and the company placing that ad—or rather, the data platform they’re using to generate the algorithms for the ads—should already know that the person bought this item and hence present not a “repeat offer” but an upsell or cross-sell offer.
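
The fix for this particular failure is conceptually simple, even if production systems are more involved: consult purchase history at decision time and swap a repeat offer for a cross-sell. A minimal sketch (the users, items and catalog are invented):

```python
# Minimal sketch of suppressing "repeat offers": check purchase history
# before serving an ad. Users, items and the catalog are invented.
purchases = {"alice": {"espresso_machine"}}          # user -> items bought
cross_sells = {"espresso_machine": "burr_grinder"}   # item -> follow-on offer

def pick_offer(user, candidate):
    """Return a repeat-safe offer for this user, or None."""
    if candidate in purchases.get(user, set()):
        # Already bought: replace the noise with a cross-sell.
        return cross_sells.get(candidate)
    return candidate

print(pick_offer("alice", "espresso_machine"))  # -> burr_grinder
print(pick_offer("bob", "espresso_machine"))    # -> espresso_machine
```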

This sounds rudimentary in the year 2022 but it’s still all too common, and you’re probably nodding your head right now because you’ve experienced this issue.

Noise usually comes from what’s known as bad data, or dirty data. Whatever you want to call it—it pretty much ruins the customer experience.

Hyper-personalization and slow data

The second major issue is slow data: data used too slowly to be valuable, which usually includes data that has to make the trip to the data warehouse before it can be incorporated into any decisions.

Slow data is one of the main reasons edge computing was invented: to process data as close to where it’s ingested as possible, in order to use it before it loses value.

Slow data produces not-so-fun customer experiences such as walking half a mile to your departure gate at the airport, only to find that the gate has been changed, and then, after you’ve walked the half mile back to where you came from, getting a text message on your phone from the airline saying your gate has been changed.

Again, whatever you want to call it—latency, slow data, annoying—the end result is a bad customer experience.

How to fix the hyper-personalization paradox

I have no doubt that the people who invented hyper-personalization had great intentions: make things as personal as possible so that your customers pay attention, stay happy, and stay loyal.

And for a lot of companies, for a long time, it worked. Then came the data deluge. And the regulations. And the jaded customers. We’re now at a stage where we need to rethink how we do personalization because the old ways are no longer effective.

It’s easy—and correct—to blame legacy technology for all of this. But the solution goes deeper than just ripping and replacing. Companies need to think holistically about all sides of their tech stacks to figure out the simplest way to get as much data as possible from A to B.

The faster you can process your data the better. But it’s not all just about speed. You also need to be able to provide quick contextual intelligence to your data so that every packet is informed by all of the packets that came before it. In this sense, your tech stack should be a little like a great storyteller: someone who knows what the customer needs and is feeling at any given moment, because it knows what’s happened up to this point and how it will affect customer decisions moving forward.
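
In code, that storyteller idea is just stateful, in-stream context: each event updates a running per-customer state that informs the next decision without a round trip to a warehouse. A toy sketch, using the airline example from earlier (event shapes and rules are invented):

```python
# Toy sketch of contextual event processing: each event is interpreted
# in light of everything seen before it. Event shapes are invented.
from collections import defaultdict

context = defaultdict(lambda: {"gate": None, "notified": set()})

def handle(event):
    state = context[event["passenger"]]
    if event["type"] == "gate_change":
        state["gate"] = event["gate"]
        state["notified"].discard(event["flight"])  # new info: re-notify
    elif event["type"] == "boarding_soon" and event["flight"] not in state["notified"]:
        state["notified"].add(event["flight"])
        print(f"notify {event['passenger']}: head to gate {state['gate']}")

handle({"type": "gate_change", "passenger": "p1", "flight": "F1", "gate": "B7"})
handle({"type": "boarding_soon", "passenger": "p1", "flight": "F1"})  # notifies
handle({"type": "boarding_soon", "passenger": "p1", "flight": "F1"})  # suppressed
```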

Let’s start thinking of our customer experiences as stories and our tech stacks as the storytellers—or maybe, story generators. Maybe then our personalization efforts will become truly ‘hyper-personal’— i.e., relevant, in-the-moment experiences that are a source of delight instead of annoyance.

David Flower brings more than 28 years of experience within the IT industry to the role of CEO of Volt Active Data. Flower has a track record of building significant shareholder value across multiple software sectors on a global scale through the development and execution of focused strategic plans, organizational development and product leadership. This piece is exclusive to Broadband Breakfast.

Broadband Breakfast accepts commentary from informed observers of the broadband scene. Please send pieces to commentary@breakfast.media. The views expressed in Expert Opinion pieces do not necessarily reflect the views of Broadband Breakfast and Breakfast Media LLC.
