Wireless

Ookla Names T-Mobile Fastest, Most Consistent Mobile Service Provider

68.5 percent of T-Mobile customers spent a majority of their time on 5G networks during the quarter, Ookla said.

Photo of T-Mobile Headquarters in Bellevue, WA

WASHINGTON, July 18, 2022 – A market report released Monday by performance metrics company Ookla named T-Mobile as the fastest and most consistent mobile operator in the United States during the second quarter of 2022, with a substantial share of its customers spending the majority of their time on its 5G network in that period.

The latest report, covering April, May and June, showed that T-Mobile achieved a median download speed of 116.54 Mbps, with competitor Verizon Wireless at 59.67 Mbps and AT&T at 54.64 Mbps.

The company also scored highest in median upload speed at 11.72 Mbps, with Verizon and AT&T trailing at 9.14 Mbps and 7.00 Mbps respectively. Median latency – the time it takes a device to communicate with the network – was 31 milliseconds for T-Mobile, 32 ms for Verizon and 34 ms for AT&T.

According to the company’s Speedtest Intelligence data, T-Mobile also had the highest consistency in the U.S., with 85.7 percent of results showing at least 5 Mbps download and 1 Mbps upload speeds.
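
As a rough illustration, that consistency figure can be read as the share of test samples clearing both minimum thresholds. The sketch below computes that share for a handful of made-up samples; the function and data are illustrative only, not Ookla's actual methodology.

```python
# Illustrative sketch of a consistency metric like the one described above:
# the share of test samples meeting minimum download/upload thresholds.
# Sample data and function names are hypothetical, not Ookla's methodology.

def consistency_score(samples, min_down_mbps=5.0, min_up_mbps=1.0):
    """Return the percentage of (download, upload) samples meeting both thresholds."""
    if not samples:
        return 0.0
    passing = sum(1 for down, up in samples
                  if down >= min_down_mbps and up >= min_up_mbps)
    return 100.0 * passing / len(samples)

# Example: four of five synthetic samples clear both thresholds -> 80.0
tests = [(116.5, 11.7), (59.7, 9.1), (54.6, 7.0), (4.2, 1.5), (88.0, 10.2)]
print(f"Consistency: {consistency_score(tests):.1f}%")
```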

T-Mobile continues to take the cake for the fastest median 5G download speeds in the U.S. at 187.33 Mbps, a slight decrease from the first quarter results. According to the Ookla report, 68.5 percent of T-Mobile customers spent a majority of their time on 5G networks during the quarter compared to 31.2 percent of Verizon customers.

T-Mobile was named the fastest mobile provider in the first quarter of 2022 and was reported to be the leading provider in 5G performance last month.

The report named the Samsung Galaxy S22 Ultra as the fastest popular device in the United States, and the District of Columbia as having the fastest median mobile download speed at the state level.

Ookla is a sponsor of Broadband Breakfast.

5G

David Flower: 5G and Hyper-Personalization: Too Much of a Good Thing?

5G, IoT and edge computing are giving companies the opportunity to make hyper-personalization even more ‘hyper’.

The author of this Expert Opinion is David Flower, CEO of Volt Active Data

It’s very easy for personalization to backfire and subtract value instead of adding it.

Consider the troubling fact that we may be arriving at a moment in hyper-personalization’s journey where the most hyper-personalized offer is no offer at all. Nobody likes to be constantly bombarded by content, personalized or not.

And that’s the paradox of hyper-personalization: if everyone’s doing it, then, in a sense, nobody is.

5G and related technologies such as IoT and edge computing are giving companies the opportunity to make hyper-personalization even more “hyper” via broader bandwidths and the faster processing of higher volumes of data.

This means we’re at a very interesting inflection point: where do we stop? If the promise of 5G is more data, better data, and faster data, and the result is knowing our customers even better so we can bug them even more, albeit in a “personal” way, then when, where, and why do we say, “hold on, maybe this is going too far”?

How do you do hyper-personalization well in a world where everyone else is doing it and where customers are becoming increasingly jaded about it and worried about how companies are using their data?

Let’s first look at what’s going wrong.

Hyper-personalization and bad data

Hyper-personalization is very easy to mess up, and when you do mess it up, it has the exact opposite of its intended effect: it drives customers away instead of keeping them.

Consider an online ad for a product that pops up for you on a website a couple of days after you already bought the thing being advertised. This is what I call “noise”. It’s simply a nuisance, and the company placing that ad—or rather, the data platform it’s using to generate the ads—should know that the person has already bought the item and hence present not a “repeat offer” but an upsell or cross-sell offer.
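
The fix described here amounts to a simple check against purchase history before an offer is served. A minimal sketch of that rule follows, with hypothetical data structures rather than any real ad platform's API.

```python
# Minimal sketch of the "no repeat offers" rule described above.
# Data structures and the upsell catalog are hypothetical placeholders,
# not any particular ad platform's API.

def next_offer(customer_purchases, candidate_product, upsell_catalog):
    """Suppress ads for items already bought; offer a related upsell instead."""
    if candidate_product not in customer_purchases:
        return candidate_product                  # normal personalized offer
    # Customer already owns it: fall back to a cross-sell/upsell, or nothing.
    return upsell_catalog.get(candidate_product)  # None means "show no ad"

purchases = {"noise-cancelling headphones"}
upsells = {"noise-cancelling headphones": "replacement ear pads"}
print(next_offer(purchases, "noise-cancelling headphones", upsells))
# -> "replacement ear pads" rather than re-advertising the headphones
```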

This sounds rudimentary in the year 2022 but it’s still all too common, and you’re probably nodding your head right now because you’ve experienced this issue.

Noise usually comes from what’s known as bad data, or dirty data. Whatever you want to call it—it pretty much ruins the customer experience.

Hyper-personalization and slow data

The second major issue is slow data: data that is used too slowly to be valuable, which usually includes data that has to make the trip to the data warehouse before it can be incorporated into any decisions.

Slow data is one of the main reasons edge computing was invented: to process data as close as possible to where it’s ingested, in order to use it before it loses its value.
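
One way to picture this is an edge handler that acts on an event only while it is still fresh, instead of waiting on a warehouse round trip. The sketch below is illustrative; the freshness budget and event fields are assumptions, not a specific product's API.

```python
# Sketch of the "use it before it loses value" idea: an edge handler that acts
# on an event immediately if it is still fresh, and skips stale ones instead of
# waiting for a round trip to a central warehouse. All names are illustrative.

import time

FRESHNESS_BUDGET_S = 2.0   # assumed usefulness window for this kind of event

def handle_at_edge(event, now=None):
    now = time.time() if now is None else now
    age = now - event["ingested_at"]
    if age > FRESHNESS_BUDGET_S:
        return "stale: archive for offline analytics"   # too late to act on
    return f"act now: {event['type']} (age {age:.2f}s)"

event = {"type": "gate_change", "ingested_at": time.time() - 0.4}
print(handle_at_edge(event))   # acts immediately, while the data is still useful
```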

Slow data produces not-so-fun customer experiences such as walking half a mile to your departure gate at the airport, only to find that the gate has been changed, and then, after you’ve walked the half mile back to where you came from, getting a text message on your phone from the airline saying your gate has been changed.

Again, whatever you want to call it—latency, slow data, annoying—the end result is a bad customer experience.

How to fix the hyper-personalization paradox

I have no doubt that the people who invented hyper-personalization had great intentions: make things as personal as possible so that your customers pay attention, stay happy, and stay loyal.

And for a lot of companies, for a long time, it worked. Then came the data deluge. And the regulations. And the jaded customers. We’re now at a stage where we need to rethink how we do personalization because the old ways are no longer effective.

It’s easy—and correct—to blame legacy technology for all of this. But the solution goes deeper than just ripping and replacing. Companies need to think holistically about all sides of their tech stacks to figure out the simplest way to get as much data as possible from A to B.

The faster you can process your data, the better. But it’s not all just about speed. You also need to be able to apply quick contextual intelligence to your data so that every packet is informed by all of the packets that came before it. In this sense, your tech stack should be a little like a great storyteller: it knows what the customer needs and is feeling at any given moment, because it knows what’s happened up to this point and how it will affect customer decisions moving forward.
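
A toy sketch of that “storyteller” idea: each incoming event is interpreted against the running history kept for that customer. The state store and decision rules here are placeholders for whatever a real operational data platform would provide.

```python
# Sketch of "every packet informed by the packets before it": a tiny in-memory
# per-customer context that each new event is interpreted against. A real
# system would keep this state in a fast operational data store; the event
# names and rules here are illustrative only.

from collections import defaultdict

context = defaultdict(list)   # customer_id -> ordered event history

def ingest(customer_id, event):
    history = context[customer_id]
    # The decision is made in light of everything seen so far for this customer.
    if event == "purchase" and "browsed_support_pages" in history:
        decision = "follow up with onboarding help, not another promotion"
    elif event == "purchase":
        decision = "send cross-sell suggestions"
    else:
        decision = "keep listening"
    history.append(event)
    return decision

print(ingest("c42", "browsed_support_pages"))  # -> keep listening
print(ingest("c42", "purchase"))               # -> onboarding help, informed by the prior event
```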

Let’s start thinking of our customer experiences as stories and our tech stacks as the storytellers—or maybe, story generators. Maybe then our personalization efforts will become truly ‘hyper-personal’— i.e., relevant, in-the-moment experiences that are a source of delight instead of annoyance.

David Flower brings more than 28 years of experience within the IT industry to the role of CEO of Volt Active Data. Flower has a track record of building significant shareholder value across multiple software sectors on a global scale through the development and execution of focused strategic plans, organizational development and product leadership. This piece is exclusive to Broadband Breakfast.

Broadband Breakfast accepts commentary from informed observers of the broadband scene. Please send pieces to commentary@breakfast.media. The views expressed in Expert Opinion pieces do not necessarily reflect the views of Broadband Breakfast and Breakfast Media LLC.

Expert Opinion

Dave Wright: Shared Relocation Fund Will Make More of Finite Spectrum Resource

‘Wireless connectivity is one of the most vital aspects of our digital infrastructure.’

The author of this Expert Opinion is Dave Wright, president of the OnGo Alliance and head of global wireless policy at Hewlett Packard Enterprise

In order to close the gaps in broadband connectivity that persist throughout the country, we must have a more comprehensive view of the necessity of all available spectrum – whether shared, licensed or unlicensed – understanding that these allocations are complementary and independently important to our nation’s future.

As we figure out how we will meet the needs of an increasingly wireless world, it is critical that we think collaboratively about how we can free up and share spectrum, working closely and cooperatively with the federal agencies responsible for our nation’s spectrum resources: the Federal Communications Commission and the National Telecommunications and Information Administration.

With recent confirmed leadership appointments in the NTIA and FCC, and renewed focus on collaboration and collegiality between these organizations, there is hope for renewed effectiveness in America’s overall management of our spectrum resources.

From a policy perspective, the OnGo Alliance is working to shed light on the incentives that inherently exist around the way spectrum is made available today. For terrestrial uses, there are two long-established methods for making spectrum available – via a licensing process including an auction of the frequencies, or via an unlicensed allocation where spectrum is made available on a license-exempt basis.

Licensed bands have given rise to our cellular connectivity, while unlicensed spectrum has enabled innovations like the Wi-Fi and Bluetooth solutions that we know and depend upon today. The near ubiquitous presence of these technologies speaks to the efficacy of these approaches. The US 3.5 GHz Citizens Broadband Radio Service is the first spectrum access framework that combines aspects of licensed (protected access) and unlicensed (opportunistic access) spectrum within a single, dynamically managed access paradigm.
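
CBRS layers three tiers of users – incumbents, Priority Access License holders and General Authorized Access users – with higher tiers protected from lower ones. The toy sketch below illustrates only that priority ordering; it is not how a real Spectrum Access System assigns channels.

```python
# Toy illustration of CBRS's three-tier priority model (Incumbent > PAL > GAA),
# in which higher tiers are protected from lower ones. This is a simplified
# sketch of the idea, not how an actual Spectrum Access System assigns channels.

TIER_PRIORITY = {"incumbent": 0, "pal": 1, "gaa": 2}   # lower number = higher priority

def grant_requests(requests):
    """Grant one user per channel, preferring the highest-priority requester."""
    by_channel = {}
    for user, tier, channel_mhz in requests:
        current = by_channel.get(channel_mhz)
        if current is None or TIER_PRIORITY[tier] < TIER_PRIORITY[current[1]]:
            by_channel[channel_mhz] = (user, tier)
    return by_channel

requests = [("rural_wisp", "gaa", 3550), ("enterprise_net", "pal", 3550),
            ("campus_iot", "gaa", 3570)]
print(grant_requests(requests))
# -> the PAL holder wins on 3550 MHz; the GAA user keeps the uncontested 3570 MHz channel
```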

Congress has increasingly been looking to licensed spectrum auctions as a source of revenue to cover the funding requirements for new programs. And federal users who occupy spectrum and then make it available for auction can take advantage of monies made available through the Spectrum Relocation Fund to cover the costs associated with transitioning their systems.

The SRF is in turn funded from the resulting auction revenues. These are examples of the current incentives in the system, which are either directly or indirectly tied to auction revenues from licensed spectrum. These incentives inherently bias the policymaking process toward licensed spectrum, at the expense of unlicensed and/or opportunistic spectrum like we have in the CBRS General Authorized Access tier.

This bias is not helpful in our quest to provide accessible broadband throughout the nation, as unlicensed spectrum and GAA are key components of most solutions, from Wi-Fi as the “last meter” connection to a fixed broadband network to GAA’s prominent role in rural fixed wireless offerings.

CBRS is an optimal framework for putting mid-band spectrum to intensive use across a wide variety of applications. In only two years since CBRS commercial operations were approved by the FCC, over 225,000 CBRS base stations have been installed nationwide.

Collaboration between cloud players, system integrators, radio vendors and operators has reached critical mass, building a vibrant, self-sustaining ecosystem. CBRS has allowed enterprises and rural farms alike the opportunity to install private 4G and 5G networks that connect IoT devices – from factory robots to autonomous farm equipment. School districts, airports, military bases and logistics facilities, factories, hospitals, office buildings, and public libraries are but a few of the countless facilities where connectivity has been enabled by CBRS spectrum.

Wireless connectivity is one of the most vital aspects of our digital infrastructure, and we must use all of the available resources in order to make broadband as ubiquitous as any other utility. Our policymaking, and the incentives around it, must account for the fact that all types of spectrum are important – whether licensed, unlicensed or shared – and that it is vital to ensure that there are proper allocations of each type to meet the relentless demand. We must work together to make the most of what we have.

Dave Wright played an instrumental role in the formation of the OnGo Alliance (originally known as the CBRS Alliance), collaborating with other founding members to create a robust multi-stakeholder organization focused on the optimization of LTE and 5G services in the CBRS band. He served as the Alliance’s first Secretary from its launch in August 2016 and was elected as the President of the Alliance in February 2018. He advocates for unlicensed, licensed, and dynamic sharing frameworks – recognizing the vital role that all spectrum management regimes play in our increasingly wireless world. This piece is exclusive to Broadband Breakfast.

Broadband Breakfast accepts commentary from informed observers of the broadband scene. Please send pieces to commentary@breakfast.media. The views expressed in Expert Opinion pieces do not necessarily reflect the views of Broadband Breakfast and Breakfast Media LLC.

5G

Optional Security Features for 5G Technology Pose Risks

The next-generation wireless technology is being touted as the most secure yet.

Photo of Dan Elmore of the Idaho National Laboratory

WASHINGTON, July 28, 2022 – 5G technology can still present security concerns despite being touted as the most secure of the cellular generations, said Dan Elmore of the Idaho National Laboratory at a 5G Future event Thursday.

In response to the emerging challenge of validating 5G security protocols and data protection technologies, the Idaho National Laboratory established its Wireless Security Institute in 2019 to coordinate government, academic, and private industry research efforts to foster more secure and reliable 5G technology.

While the 5G standards offer a “rich suite” of security features, most of them are optional for manufacturers and developers to implement in their systems or devices, said Elmore, who is the director for critical infrastructure security at the INL. This poses a significant challenge for 5G, particularly for critical infrastructure applications, as consumers may not know how the standards are implemented, Elmore said.
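
As a hypothetical illustration of the kind of question Elmore is urging buyers to ask, the sketch below audits a deployment configuration against a short list of protections that the standards leave optional to enable. The feature names are examples for illustration only, not an official 3GPP or INL checklist.

```python
# Hypothetical sketch of the question raised above: which optional protections
# does this particular deployment actually enable? The feature list below is
# illustrative (features often described as optional to activate), not an
# official 3GPP or INL checklist.

OPTIONAL_FEATURES = [
    "user_plane_integrity_protection",   # optional to activate per network
    "subscriber_identity_concealment",   # SUPI concealment can use a null scheme
]

def audit(deployment_config):
    """Return the optional security features this deployment leaves disabled."""
    return [f for f in OPTIONAL_FEATURES if not deployment_config.get(f, False)]

vendor_config = {"user_plane_integrity_protection": False,
                 "subscriber_identity_concealment": True}
print(audit(vendor_config))
# -> features to question before deploying in critical infrastructure
```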

Elmore urged consumers, especially federal agencies, to ask the hard questions and consider “what vulnerabilities might be present in how they [manufacturers and developers] employ those standards that could be exploited.”

5G is designed to allow cellular devices to connect at higher speeds and with lower latency – the delay in responding to requests – than previous generations. Already, wireless carriers are incorporating it into devices and working on national 5G networks.

Because of its facilitation of real-time monitoring, 5G technology is expected to help tackle critical issues like climate change and environmental sustainability.
