
Expert Opinion

Jessica Ward: Which Media Streaming Device is Best?

Published on July 7, 2016

Since 2008, devices that stream Netflix directly to our TVs have made it possible to "cut the cord." From 2008 to 2013, the idea that this technology could actually replace Pay TV (cable and satellite) seemed absurd. Fast forward to 2014, when the percentage of households using these devices rose from a mere 7% to a shocking 21%. Oh, what a difference a year can bring. At that point, individual media groups like CBS, FOX, HBO and many others began finding ways to use these devices to their monetary benefit. Once these groups got on board, it was clear that media streaming devices were a force to be reckoned with. The next year, the 14-point rise between 2013 and 2014 seemed minuscule compared to the 31% spurt the media streaming industry experienced in 2015. With that, devices like Roku, Apple TV, Amazon Fire and Chromecast became serious contenders to the big dogs in the Pay TV marketplace.

None is bigger than Roku, though. While Roku flies somewhat under the radar without a name like Apple, Amazon or Google behind it, the company has quietly moved its way into the top spot, owning 37% of the media streaming device market. Not only is Roku among the least expensive of the devices, but thanks to its deals with Sharp, TCL and Sceptre to build its platform into their smart TVs, it is more readily available to consumers than the brands it competes with. Along with Chromecast, Roku also has the most to offer consumers as far as free content goes.

While media streaming devices seem perfect for consumers, there are a few deterring factors that satellite TV companies like DirecTV and DISH hope will keep the majority of consumers from moving in that direction. The first is that many of the "free" channels actually require a subscription to a Pay TV service. Channels like ESPN, Comedy Central and FOX are not available through these devices unless you already have a cable subscription. There are obvious workarounds: anyone with a cable subscription can share their login with others. This is a roadblock for some consumers, but most will find a way around it.

The larger concern is one of user experience and cost as they relate to the consumer's internet service. In 2016, Netflix publicly stated that an internet speed of 5 Mbps or higher is required for a "regularly positive" streaming experience. This is true not just for Netflix, but for the media streaming industry in general. Unfortunately, those with DSL, satellite internet and even some smaller cable providers may not be able to stream the way these services are meant to be experienced, due to slower internet speeds. Beyond speed, another large concern is for consumers whose internet plans have a hard data cap. These services, especially in HD, can drain data quickly, consuming roughly 3 GB per hour. That could make streaming a costly option.
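To put that in perspective, here is a quick back-of-the-envelope calculation. The 3 GB-per-hour figure comes from above; the 250 GB monthly cap is a hypothetical plan used only for illustration.

    # How far does a capped plan stretch for HD streaming?
    HD_GB_PER_HOUR = 3        # HD streaming rate cited above
    MONTHLY_CAP_GB = 250      # hypothetical hard cap, for illustration only

    hours_per_month = MONTHLY_CAP_GB / HD_GB_PER_HOUR
    hours_per_day = hours_per_month / 30
    print(f"{hours_per_month:.0f} hours of HD per month, about {hours_per_day:.1f} per day")
    # -> 83 hours of HD per month, about 2.8 per day, before any other internet use

A family that streams in HD for a few hours each evening would hit that cap well before the end of the month.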

While there are negatives, it’s clear that these services are serious contenders and are here to stay.

Have you been looking into cutting the cord? If so, take a look at the infographic below provided by InternetChoice.org to decide which platform would be best for you!

Jessica Ward is a blogger, DIY addict, coffee snob and marketing extraordinaire at InternetChoice.org. She writes about technology, fitness, marketing, and whatever fills her mind with wonder and fuels her passion. Follow her on Twitter @jessward87

[Infographic: cord-cutting platform comparison, via InternetChoice.org]

Expert Opinion

Bryan Darr: An Order of Fiber, Please, with Wireless on the Side

Wireless is essential because for truly remote properties, a physical connection may never be practical.


The author of this Expert Opinion is Bryan Darr, vice president of Smart Communities at Ookla.

Over the next five to ten years we will see an explosion of projects bringing high-speed connectivity to underserved communities in the United States. Although fiber infrastructure rightly gets most of the attention and funding, wireless networks should also be part of this planning process. Wireless networks can deploy faster, serve remote locations more economically, and provide some capabilities that fixed networks can’t. Failure to consider the comprehensive needs of the mobile broadband environment will hobble efforts in the next phase of this technology revolution.

How we got here

As federal broadband infrastructure funding ramps up, state broadband offices are preparing to prove their need for a larger slice of the pie. The pie in question is the $42.5 billion Broadband Equity, Access and Deployment Program, part of the infrastructure bill (the Infrastructure Investment and Jobs Act) passed into law in the fall of 2021. Although every state is guaranteed $100 million, that leaves about $37 billion yet to be divided up.

Assuredly, this pie won’t be sliced into equal portions across states, tribal areas, and U.S. territories. Differences in population, geographic area, household density, and income levels will impact the funding eligibility of individual jurisdictions. Preparedness to verify underserved areas will ensure that state and local governments can maximize their chances of securing adequate funding. The first step is to identify these communities and estimate the cost of covering each household. With a desire to help as many people as possible, there will be a tendency to prioritize areas with the lowest cost per connection.
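As a sketch of that prioritization logic, the following ranks candidate areas by estimated cost per unserved household; the community names and dollar figures are invented for illustration.

    # Rank candidate areas by estimated cost per unserved household.
    # All names and figures below are hypothetical.
    areas = [
        {"name": "County A", "unserved": 1200, "build_cost": 4_800_000},
        {"name": "County B", "unserved": 300,  "build_cost": 2_700_000},
        {"name": "County C", "unserved": 850,  "build_cost": 5_100_000},
    ]
    for area in areas:
        area["cost_per_hh"] = area["build_cost"] / area["unserved"]

    # Cheapest connections first -- the tendency described above.
    for area in sorted(areas, key=lambda a: a["cost_per_hh"]):
        print(f"{area['name']}: ${area['cost_per_hh']:,.0f} per household")

Under that ordering, County A ($4,000 per household) gets funded before County C ($6,000) and County B ($9,000), even though County B's residents may be the hardest to reach any other way.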

State governments have focused primarily on fiber access. However, as big a pot of money as the IIJA may be, it won't be big enough to connect every household to fiber. Continued supply chain issues, inflation, and labor shortages (particularly in needed expertise) will drive up the cost of projects in the coming years.

The race to compete for these billions of dollars has had a very uneven start. Some state broadband offices are fully staffed, have hired consultants, have obtained and collected network performance data, and have already launched mapping projects. Other states are just now funding their broadband offices and beginning to hire their first employees. States that cannot both successfully challenge the mapping fabric (think number of service addresses) and confidently identify unserved households will be disappointed with the size of their slice.

The recipe may require adjustment

Recently, Federal Communications Commission Chairwoman Jessica Rosenworcel called for the commission to reset its definition of broadband from 25 Mbps download and 3 Mbps upload to 100 down and 20 up. Many would agree that a reset is long overdue, and the IIJA already requires that new infrastructure builds meet the higher standard. We should all recognize that this metric reset could make millions of additional households eligible for funding. Some policy organizations, including the Fiber Broadband Association, argue that those numbers are already dated and that the new target will not be enough for future needs such as the much-anticipated metaverse.
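The effect of the reset is easy to illustrate. In the sketch below, the 25/3 and 100/20 thresholds come from the text; the sample household speeds are hypothetical.

    # A location counts as served only if it meets both thresholds.
    def is_served(down_mbps, up_mbps, min_down, min_up):
        return down_mbps >= min_down and up_mbps >= min_up

    # Hypothetical household on a 50 Mbps down / 10 Mbps up connection
    down, up = 50, 10
    print(is_served(down, up, 25, 3))    # True:  served under the old 25/3 definition
    print(is_served(down, up, 100, 20))  # False: newly eligible under 100/20

Every household in that middle band, served at 25/3 but not at 100/20, becomes newly eligible for funding the moment the definition changes.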

The specific benefits of wireless

Wireless connectivity can be broken down into three basic types of last-mile providers:

  1. Cellular service providers, offering traditional mobile and new fixed wireless access services
  2. Wireless internet service providers (WISPs), offering fixed point-to-point service
  3. Satellite companies (more on them later)

Wi-Fi is also wireless, but provides a final hop for only the last few feet of a network connection.

Wireless is essential because there is broad recognition that for truly remote properties, a physical connection may never be practical. As subsidies flow, that may hold for fewer locations, but there is certainly a point of diminishing returns. As state and federal officials plan networks to connect as many communities as they can, they should also factor in where wireless networks need bolstering. This applies to both mobile and WISP infrastructure.

Additional wireless investment could serve multiple needs. Poor wireless coverage is a common complaint even in densely populated areas. If you spend any significant time in rural areas, you know there are locations where service is so spotty that the local population knows when not to risk initiating a call. Even if you get a signal, throughput can vary greatly. Just because you can receive a text in a particular location doesn't mean you can download a video. These rural areas have weak wireless signals for the same reason they lack good terrestrial broadband: the population density does not provide enough return on the investment.

Fiber is still a necessary ingredient

Today’s higher data usage demands the capacity that fiber provides. Mobile service providers are not going to build a new 5G tower without access to fiber backhaul. Sites that require long, dedicated fiber deployments can cost far more and lead to an unreasonable dent in the CapEx budget.

As new middle-mile networks are being designed, network planners should consider where wireless networks are weak and new towers are needed for improvement. Strategically adding splice points in poor service areas can significantly lower the barrier to attracting new wireless infrastructure. A lower cost of deployment will be a big incentive to wireless networks to bring improved service to rural communities.

We all depend on wireless services

Mobile connectivity has moved beyond a luxury and has become an expectation. Even if we could practically and affordably connect every house with fiber, there are many reasons to include wireless in your overall design plans.

  • Public safety – If you have ever had a flat tire or an overheated radiator, you know how important wireless coverage can be. Just try calling a rescue service with no bars. FirstNet wants to improve coverage as well, and incentivizing new towers can provide a big assist.
  • Precision agriculture – Fiber-to-the-home can connect the farmhouse, the barn, and even the chicken houses, but it won't connect the tractor or the combine. Livestock now wear devices that monitor animal well-being. Wireless is the only way to keep the whole farm connected and competitive in a global marketplace.
  • Healthcare – Devices to monitor blood pressure, heart rate, glucose levels, and more are revolutionizing patient care. Many can now automatically notify a care facility when a patient is in distress. Mobile networks keep these devices connected if the patient’s residence lacks fixed broadband and when they are away from the home.
  • Economic development – Picking the best location for a new factory, business park, or neighborhood is about more than adequate roads and water resources. Good connectivity for both wireless and fixed telecom services has become a standard amenity for site selection.
  • 5G, part 1 – These new networks are quickly overlaying the 4G footprint. You don’t have to experience the lightning speeds of inner city millimeter wave service to see huge improvements in network performance. Wireless carriers are now introducing Fixed Wireless Access (FWA) to directly compete with traditional fixed providers. Competition means pressure in the market to keep services more affordable.
  • 5G, part 2 – Just over the horizon is the Rural 5G Fund, established by the FCC in October 2020. Over $9 billion will be made available to improve 5G coverage. However, the Competitive Carriers Association, which represents many rural mobile service providers, estimates the need at well over $30 billion. Without some advance planning and dialogue with the wireless providers in your state, you may see very little of those investments.

WISPs have brought first-time service to millions 

According to WISPA (the Wireless Internet Service Providers Association), over 2,800 WISPs are now serving more than seven million customers in portions of all 50 states, bringing internet to many rural households that had previously relied on aging satellite services. Although some subscribers are seeing median speeds below the current 25/3 broadband definition, new technologies are improving user experiences as equipment is modernized. Of course, better access to fiber is also needed to increase capacity and link to internet backbones.

All radio signals degrade with distance. Some of the largest WISPs cover sparsely populated regions, often with rugged terrain, making physical household connections particularly expensive to build. Commonly, customers who experience slower than advertised speeds are living at the practical edge of these coverage areas. Providing fiber to just a handful of locations can attract new towers that could substantially expand network services. This would also save much of the cost compared to direct-to-home routes and reduce the time needed for these subscribers to see significant improvements.
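For a sense of why distance matters so much, the standard free-space path loss formula gives the best-case attenuation of a radio signal; real terrain, foliage, and weather only make it worse. The 2,500 MHz frequency below is an illustrative mid-band choice.

    import math

    def free_space_path_loss_db(distance_km, freq_mhz):
        # FSPL(dB) = 20*log10(d_km) + 20*log10(f_MHz) + 32.44
        return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

    for d_km in (1, 5, 15):
        print(f"{d_km:>2} km: {free_space_path_loss_db(d_km, 2500):.1f} dB")
    # ->  1 km: 100.4 dB / 5 km: 114.4 dB / 15 km: 123.9 dB

Every tripling of distance costs roughly another 9.5 dB, which is why customers at the practical edge of a coverage area see speeds well below what is advertised.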

The IIJA is written to be technology-neutral, but some broadband officials seem to be paying little attention to proven solutions that could have immediate impact. Even if the eventual goal is to offer direct-to-home fiber for everyone, we may go well beyond this decade without realizing that dream.

Aren’t satellites wireless, too?

Modern and improved satellite services are already fulfilling broadband needs for some households and businesses. Availability is limited to certain geographies but is expanding, and new competitors plan to enter the mix soon.

Throughput speeds and latency have improved dramatically, but waitlists are long, and initial equipment costs of more than $500 (for do-it-yourself installation) plus subscription fees of $100 or more per month will make this a difficult purchase decision for low-income households. Capacity is also limited for any given geographic area, so even if satellite service is available in your location, your neighbors may have already maxed it out, leaving you waiting for additional capacity to be made available.
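Using the figures above, the first-year math alone shows why this is a hard purchase for a low-income household:

    # First-year cost of satellite service, using the figures cited above.
    equipment = 500        # one-time, do-it-yourself install
    monthly_fee = 100      # subscription, on the low end
    first_year_cost = equipment + 12 * monthly_fee
    print(f"${first_year_cost:,}")  # -> $1,700 in year one, before taxes or overages

Averaged over the first year, that works out to about $142 per month.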

Without wireless, a broadband plan is just half-baked

We are many years away from realizing the full impact of the IIJA and the other recent funding sources that will deliver new fiber connections across the country. The FCC’s map is already delayed. There are early grumblings about uncertain challenge processes and many states are just now getting their planning efforts underway. The federal government has promised millions of Americans better broadband and they are expecting action soon, not in five to ten years.

Regulators and policymakers will ultimately be held accountable by voters and Congress for how the BEAD funds are spent. Two key metrics will matter most: the number of households gaining a new or improved connection and how quickly this progress is being made. Monitoring compliance will become more important as projects hit milestones and contractors get paid.

For some rural communities, wireless may be the best option right now and, perhaps, for the foreseeable future. Some households can already experience better service from their wireless provider than from DSL or satellite options. Reports are surfacing of DSL providers refusing to reconnect service to households where an interruption of service has occurred — whether for late payment or change of ownership — leaving families cut off from the digital economy.

Because satellite service is expensive and hard to acquire, wireless services are the only logical solution to get some rural households (particularly those in low-income brackets) connected before these communities wither past the point of no return. WISPs and mobile providers can fill some of this gap today and, if given the opportunity, will provide competitive options for families unhappy with their service. FWA from the traditional mobile operators is gaining public acceptance quickly in select markets and where signal levels are strong.

Think of anticipating wireless needs while planning fixed networks as an extension of a "dig once" policy. You don't want to look back years from now and ask why wireless wasn't considered in your planning process. Across the country, economic and community development departments spend millions of dollars every year to attract new citizens and businesses. Reliable mobile coverage is an amenity everyone – and every thing – wants.

Data from Ookla can highlight areas of need for both fixed and wireless networks. Leveraging coverage data to spotlight deficiencies can serve as an additional assessment in your middle-mile fiber planning, which can ultimately improve public safety, agricultural competitiveness, and overall quality of life.

Prepare for all the broadband needs ahead of you. It’s smart business. It’s smart government.

Bryan Darr is the Vice President of Smart Communities at Ookla. He coordinates Ookla’s outreach to local, state and federal governments and serves on CTIA’s Smart Cities Business & Technology Working Group. This piece is exclusive to Broadband Breakfast.

Broadband Breakfast accepts commentary from informed observers of the broadband scene. Please send pieces to commentary@breakfast.media. The views expressed in Expert Opinion pieces do not necessarily reflect the views of Broadband Breakfast and Breakfast Media LLC.


5G

David Flower: 5G and Hyper-Personalization: Too Much of a Good Thing?

5G, IoT and edge computing are giving companies the opportunity to make hyper-personalization even more ‘hyper’.


The author of this Expert Opinion is David Flower, CEO of Volt Active Data.

It’s very easy for personalization to backfire and subtract value instead of add it.

Consider the troubling fact that we may be arriving at a moment in hyper-personalization’s journey where the most hyper-personalized offer is no offer at all. Nobody likes to be constantly bombarded by content, personalized or not.

And that’s the paradox of hyper-personalization: if everyone’s doing it, then, in a sense, nobody is.

5G and related technologies such as IoT and edge computing are giving companies the opportunity to make hyper-personalization even more “hyper” via broader bandwidths and the faster processing of higher volumes of data.

This means we're at a very interesting inflection point: where do we stop? If the promise of 5G is more data, better data, and faster data, and the result is knowing our customers even better so we can bug them even more, albeit in a "personal" way, then when, where, and why do we say, "Hold on, maybe this is going too far"?

How do you do hyper-personalization well in a world where everyone else is doing it and where customers are becoming increasingly jaded about it and worried about how companies are using their data?

Let’s first look at what’s going wrong.

Hyper-personalization and bad data

Hyper-personalization is very easy to mess up, and when you do mess it up it has the exact opposite of its intended effect: it drives customers away instead of keeping them there.

Consider an online ad for a product that pops up on a website a couple of days after you already bought the thing being advertised. This is what I call "noise." It's simply a nuisance, and the company placing that ad – or rather, the data platform generating the algorithms behind its ads – should know the person has already bought the item and present not a "repeat offer" but an upsell or cross-sell offer.

This sounds rudimentary in the year 2022 but it’s still all too common, and you’re probably nodding your head right now because you’ve experienced this issue.

Noise usually comes from what’s known as bad data, or dirty data. Whatever you want to call it—it pretty much ruins the customer experience.
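A minimal sketch of the fix described above, assuming a simple purchase-history lookup, might look like the following; the products and cross-sell pairings are invented for illustration.

    # Suppress repeat offers for items already purchased; cross-sell instead.
    # Product names and pairings are hypothetical.
    CROSS_SELLS = {"espresso_machine": "burr_grinder", "running_shoes": "insoles"}

    def next_offer(candidate, purchase_history):
        if candidate in purchase_history:
            # The customer already owns it: don't repeat the ad.
            return CROSS_SELLS.get(candidate)  # upsell/cross-sell, or nothing at all
        return candidate

    history = {"espresso_machine"}
    print(next_offer("espresso_machine", history))  # -> burr_grinder, not more noise
    print(next_offer("running_shoes", history))     # -> running_shoes, still a valid offer

Note that returning nothing is a legitimate outcome: as argued above, sometimes the most hyper-personalized offer is no offer at all.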

Hyper-personalization and slow data

The second major issue is slow data: any data used too slowly to be valuable, which usually includes data that has to make the trip to the data warehouse before it can be incorporated into any decisions.

Slow data is one of the main reasons edge computing was invented: to process data as close as possible to where it is ingested, so it can be used before it loses its value.
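In practice, that often reduces to a freshness check at the point of ingest: act on an event only while it is still current. The 60-second staleness budget below is an arbitrary figure for illustration.

    import time

    STALENESS_BUDGET_SECS = 60  # arbitrary freshness window, for illustration

    def handle_event(event, now=None):
        # Process events near where they are ingested; drop any that arrive too late.
        now = now if now is not None else time.time()
        if now - event["timestamp"] > STALENESS_BUDGET_SECS:
            return None                        # slow data: its value is already gone
        return f"notify: {event['payload']}"   # stand-in for the real action

    stale = {"timestamp": time.time() - 300, "payload": "gate changed to B7"}
    print(handle_event(stale))  # -> None: too late to be useful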

Slow data produces not-so-fun customer experiences such as walking half a mile to your departure gate at the airport, only to find that the gate has been changed, and then, after you’ve walked the half mile back to where you came from, getting a text message on your phone from the airline saying your gate has been changed.

Again, whatever you want to call it—latency, slow data, annoying—the end result is a bad customer experience.

How to fix the hyper-personalization paradox

I have no doubt that the people who invented hyper-personalization had great intentions: make things as personal as possible so that your customers pay attention, stay happy, and stay loyal.

And for a lot of companies, for a long time, it worked. Then came the data deluge. And the regulations. And the jaded customers. We’re now at a stage where we need to rethink how we do personalization because the old ways are no longer effective.

It’s easy—and correct—to blame legacy technology for all of this. But the solution goes deeper than just ripping and replacing. Companies need to think holistically about all sides of their tech stacks to figure out the simplest way to get as much data as possible from A to B.

The faster you can process your data the better. But it’s not all just about speed. You also need to be able to provide quick contextual intelligence to your data so that every packet is informed by all of the packets that came before it. In this sense, your tech stack should be a little like a great storyteller: someone who knows what the customer needs and is feeling at any given moment, because it knows what’s happened up to this point and how it will affect customer decisions moving forward.
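A toy version of that storyteller state, with invented event names, might look like this: each new decision is informed by everything observed so far.

    from collections import defaultdict

    # Running per-customer context: decisions consult the whole history.
    context = defaultdict(list)

    def observe(customer_id, event):
        context[customer_id].append(event)

    def choose_message(customer_id):
        history = context[customer_id]
        if "purchase" in history:
            return "cross-sell"      # already bought: don't repeat the offer
        if history.count("browse") >= 3:
            return "discount nudge"  # sustained interest, worth encouraging
        return "introduction"

    for event in ("browse", "browse", "browse"):
        observe("cust_42", event)
    print(choose_message("cust_42"))  # -> discount nudge
    observe("cust_42", "purchase")
    print(choose_message("cust_42"))  # -> cross-sell

A production system would keep this state in fast, distributed storage rather than a Python dict, but the principle is the same: context first, then the message.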

Let's start thinking of our customer experiences as stories and our tech stacks as the storytellers – or maybe, story generators. Maybe then our personalization efforts will become truly "hyper-personal": relevant, in-the-moment experiences that are a source of delight instead of annoyance.

David Flower brings more than 28 years of experience within the IT industry to the role of CEO of Volt Active Data. Flower has a track record of building significant shareholder value across multiple software sectors on a global scale through the development and execution of focused strategic plans, organizational development and product leadership. This piece is exclusive to Broadband Breakfast.

Broadband Breakfast accepts commentary from informed observers of the broadband scene. Please send pieces to commentary@breakfast.media. The views expressed in Expert Opinion pieces do not necessarily reflect the views of Broadband Breakfast and Breakfast Media LLC.


Expert Opinion

Johnny Kampis: Democrats Needlessly Push Another Round of Net Neutrality Legislation

The Net Neutrality and Broadband Justice Act may harm the ability of broadband infrastructure to grow.


The author of this Expert Opinion is Johnny Kampis, director of telecom policy for the Taxpayers Protection Alliance.

It ain’t broke, but Democrats keep trying to “fix” it.

July 28 saw the introduction of a bill to reimplement Title II regulations on broadband providers, paving the way for a second attempt at “net neutrality” rules for the internet.

Led by Sen. Ed Markey, D-Mass., along with co-sponsors Sen. Ron Wyden, D-Ore., and Rep. Doris Matsui, D-Calif., the comically named Net Neutrality and Broadband Justice Act would classify ISPs as common carriers and give the Federal Communications Commission significant power to regulate internet issues such as pricing, competition, and consumer privacy.

Markey claims that the deregulation of the internet under former FCC Chairman Ajit Pai left broadband consumers unprotected. But as data has shown, and Taxpayers Protection Alliance’s own investigation highlighted, no widespread throttling, blocking or other consumer harm occurred after the Title II rules were repealed.

Randolph May, president of the Free State Foundation, noted after Markey’s bill was released that nearly all service providers’ terms of service contain legally enforceable commitments to not block or throttle the access of their subscribers to lawful content.

Markey said his legislation, which would codify broadband access as an essential service, will equip the FCC with the tools it needs to increase broadband accessibility.

The country already has the tools it needs to close the digital divide, with billions in taxpayer dollars flowing to every state to boost broadband access. For example, less than $10 billion in federal funding was dedicated to broadband in 2019, but an incredible $127 billion-plus in taxpayer dollars will be dedicated to closing the digital divide in the coming years. That doesn’t even count the nearly $800 billion in COVID-19 relief and stimulus funding that could be used for multiple issues, including broadband growth.

The bill's proponents say the FCC could foster a more competitive market with the passage of the legislation. But the FCC's data already indicate the market is extremely competitive, with 99 percent of the U.S. population able to choose between at least two broadband providers. That doesn't even account for wireless carriers and their rapid development of 5G.

The Net Neutrality and Broadband Justice Act may instead harm the ability of broadband infrastructure to grow without funneling even more taxpayer money toward the cause. Studies have shown that private provider investment increased after the regulatory uncertainty of the Title II rules was removed. Prior to the reversal of the 2015 Open Internet Order, broadband network investment dropped more than 5.6 percent, the first decline outside of a recession, the FCC reported.

US Telecom reported that capital expenditures by ISPs totaled $79.4 billion in 2020 and grew to $86.1 billion in 2021.

Michael Powell, president and CEO of NCTA – The Internet & Television Association, called the issue of net neutrality “an increasingly stale debate” with justifications for it that “seem increasingly limp.”

“In the wake of the once-in-a-lifetime infrastructure bill, we need to be focused collectively on closing the digital divide and not taking a ride on the net neutrality carousel for the umpteenth time for no discernable reason,” he said. “Building broadband to unserved parts of this country is a massive, complex, and expensive undertaking. Slapping an outdated and burdensome regulatory regime on broadband networks surely will damage the mission to deploy next-generation internet technology throughout America and get everyone connected.”

Again, the specter of Title II regulations rears its ugly head for no discernible reason other than the government's insatiable need for control. The broadband market has proven that it functions better under a light-touch approach, so we hope that Congress says no to this misguided bill.

Johnny Kampis is director of telecom policy for the Taxpayers Protection Alliance. This piece is exclusive to Broadband Breakfast.

Broadband Breakfast accepts commentary from informed observers of the broadband scene. Please send pieces to commentary@breakfast.media. The views expressed in Expert Opinion pieces do not necessarily reflect the views of Broadband Breakfast and Breakfast Media LLC.

