
Expert Opinion

Daniel Hanley: Google and Facebook Are Essential, Let’s Regulate Them That Way

The author of this Expert Opinion is Daniel Hanley, a policy analyst at the Open Markets Institute

Google and Facebook have extraordinary control over information and communications systems in the United States. The two corporations dominate internet search, social media, and digital advertising, and each serves as a gateway to the internet for billions of people. They have even become the dominant sources of news for most Americans, and they have maintained their dominant positions in these markets for more than a decade.

These two corporations can – and do – arbitrarily exercise their unrivaled monopoly power to suppress competition and crush smaller, dependent rivals. They have locked rivals out of their platforms and manipulated those platforms to favor their own services at the expense of smaller, dependent competitors. Without access to their data, and their platforms more generally, dependent competitors can find themselves unable to compete effectively, create desirable applications for consumers, provide critical product features, or even catch the attention of potential customers.

But antitrust enforcers can stop this harmful concentration of power with laws already on the books, by declaring Facebook and Google essential facilities. The Federal Trade Commission or Department of Justice could litigate an antitrust case designating Facebook and Google as essential facilities, or the courts could do so in response to a lawsuit from private citizens.

The goal of regulating corporations as essential facilities is to ensure fair competition by decreasing the power of dominant firms over smaller firms. An essential facilities designation would mandate that rivals have equal access to the corporation’s facilities or ensure that dependent firms are charged equal prices for the corporation’s goods or services.

The role of the essential facilities doctrine in antitrust

The essential facilities doctrine is based on the principle that a dominant firm should not be allowed to deny rivals access to its infrastructure. When a dominant firm is deemed an essential facility, the firm loses the ability to decide which firms to do business with, because access to its facilities is necessary for competition to exist in the first place.

Historically, the essential facilities doctrine has been applied by courts and other antitrust enforcers to critical aspects of infrastructure, such as railroads, trucking, electrical facilities, news syndicates, and telecommunications firms, including telephone and telegraph companies. By withholding access to a platform, such as communications wires or railroad tracks, dominant companies stymied competitors, entrenched their monopoly positions, and often extended their dominance to adjacent markets.

Federal and state legislators have also deployed a similar regulatory designation. Legislators have designated some industries, such as hospitals, pipelines, and electrical plants, as natural monopolies, and these industries were accordingly managed as public utilities. As with an essential facilities designation, lawmakers classified certain firms and industries as public utilities because competition was difficult to sustain in those industries and because a dominant corporation would likely crush smaller rivals or exclude dependents.

Similar to railroad tracks and telephone wires, the source of Google’s and Facebook’s dominance is that each controls critical gateways to the internet – internet search and social networking, respectively – and controls extensive, unparalleled, and nonreplicable data collection infrastructure that they have woven into every aspect of the internet, far beyond their own websites.

Facebook’s familiar Like button is embedded in more than 8 million websites, allowing the corporation to collect extensive data both on users who visit pages containing the button and on users who click it.

Google has embedded tracking code in 85 percent of websites and 94 percent of Android Play Store applications. Google’s information collection efforts are so extensive and frequent that the corporation can determine whether a user is running or walking.

Google’s and Facebook’s data repositories serve as a choke-point

Google’s and Facebook’s data repositories are so extensive they have become critical avenues for academic research, and they form the foundation for countless software applications and enhanced software features such as frictionless user sign-on. Google’s and Facebook’s data also provide these corporations with the ability to engage in highly targeted advertising campaigns to attract the attention of the right audiences to use or purchase an advertised product or service. Access to Google’s and Facebook’s data infrastructure is necessary for any internet upstart to become a viable company in the technology sector.

Google and Facebook exploit their duopoly control over critical information and communications systems, using anti-competitive practices against current and potential rivals. For example, Google and Facebook have routinely denied access to their essential data troves and platforms. In 2013, Facebook CEO Mark Zuckerberg personally approved revoking the video application Vine’s access to Facebook’s Friends List, cutting Vine off from the dominant social network, which it needed to reach potential users and become a viable competitor.

This was not the first time that Facebook abused its power. Internal Facebook documents reveal that the corporation has routinely used access to its data as a bargaining chip to leverage its dominance over potential rivals to win favorable partnerships. A recently filed class action alleges similar conduct, as the lawyer representing the plaintiffs against Facebook says that “Facebook deliberately leveraged its developer platform, an infrastructure of spyware and surveillance, and its economic power to destroy or acquire anyone that competed with them.”

Google’s anti-competitive strategy of demoting links to rival sites

Google engages in similarly anti-competitive practices. In an internal memo accidentally sent to The Wall Street Journal, FTC staff stated that Google “adopted a strategy of demoting or refusing to display” links to certain rival websites. The report concluded that Google’s conduct resulted in “real harm to consumers and to innovation.”

Google and Facebook have also repeatedly abused their dominant market positions to engage in self-dealing: promoting their own products over those of rivals. For example, a 2017 analysis by The Wall Street Journal found that 91 percent of 25,000 product searches on Google search featured Google products in the first advertisement slot. The study also found that 43 percent of the searches featured Google products in the top two advertisement slots. An analysis conducted by The Markup on Tuesday showed that 41 percent of the search results on Google search were for Google’s own services.

This ability to manipulate its dominant platforms ultimately gives Google’s services an unbeatable competitive advantage, suppresses competition, and snuffs out the innovations of alternative services. In short, Google and Facebook can leverage their platforms to pick the winners and losers in the marketplace – and they pick themselves whenever possible. Consumers are deprived of alternative services with better features, and we are all deprived of the innovations that fair and robust competition would provide.

The COVID-19 pandemic is only heightening users’ dependence on Google and Facebook

In sum, Facebook’s and Google’s services are as critical to both work and leisure as public utilities and the telephone were for energy and communications in the 20th century. The COVID-19 pandemic has only exacerbated users’ dependence on these services. Facebook’s user base has increased by more than 11 percent since last year, and the number of monthly active users has increased by 10 percent.

The number of users of Google Classroom doubled to 100 million in March alone. Google’s videoconferencing service has experienced a 30-fold increase in usage since January. News organizations have flocked to YouTube to broadcast their content, as evidenced by a 75 percent increase in the number of users watching news from certain outlets.

Our increased reliance on these platforms only increases the need for essential facilities regulation. This designation would promote additional accountability and scrutiny over Google’s and Facebook’s conduct, ensuring that their policies are equitable and fair for all users and that rival platforms cannot be arbitrarily blocked. Such accountability may also help lawmakers understand the effects of Google’s and Facebook’s conduct and promote additional regulatory actions, such as limiting the privacy invasions of their panopticon-like data collection and ad targeting.

The testimony during the House of Representatives’ investigation into online platforms on Wednesday revealed the depth of lawmakers’ concern with the economy-wide repercussions resulting from any decision made by Google and Facebook without public oversight. An essential facilities designation from enforcers would resolve most of their concerns.

The FTC has applied an essential facilities-like designation to corporations with far less power than Facebook and Google. In January, a federal judge refused to dismiss an FTC case against Surescripts, the dominant provider of a “must-have” e-prescribing software used by medical professionals and pharmacies, for using exclusionary contracts to deny rivals access to its essential platform. In deeming Surescripts “must-have,” the FTC argued that it is an essential service for both prescribers and pharmacies.

Also, in February, the Seventh Circuit Court of Appeals affirmed the use of a closely related legal doctrine against Comcast for leveraging its dominant position to exclude Viamedia from access to cable-television advertising services. Comcast’s actions caused Viamedia’s customers to abandon Viamedia as a supplier of advertising. Viamedia’s situation is similar to how Facebook and Google shut off access to their platforms and data, preventing rivals from becoming viable competitors in the industry.

Google and Facebook are facing at least five antitrust investigations, including a review of Google’s and Facebook’s business operations by the House of Representatives. These investigations should provide rich evidence to support later antitrust litigants such as private parties, federal agencies, and state attorneys general to impose essential facilities requirements on Google and Facebook.

This designation would invigorate competition in the sectors dominated by Google and Facebook, and it would restrain them from abusing their dominant market power to stifle competition and harm rivals.

Daniel A. Hanley is a policy analyst at the Open Markets Institute. You can follow him on Twitter @danielahanley. This piece is exclusive to Broadband Breakfast.

BroadbandBreakfast.com accepts commentary from informed observers of the broadband scene. Please send pieces to commentary@breakfast.media. The views reflected in Expert Opinion pieces do not necessarily reflect the views of Broadband Breakfast and Breakfast Media LLC. 

Broadband Breakfast is a decade-old news organization based in Washington that is building a community of interest around broadband policy and internet technology, with a particular focus on better broadband infrastructure, the politics of privacy and the regulation of social media. Learn more about Broadband Breakfast.

Broadband Mapping & Data

Jeff Miller: Tools to Manage the Next-Generation Network Buildouts

Service providers that use GIS applications are able to reduce design time by 80 percent.

The author of this Expert Opinion is Jeff Miller, Synchronoss Technologies CEO.

Today’s digital world is driving an insatiable need for fiber networks and connectivity, hence the push for widespread broadband buildouts and deployments worldwide. Broadband connectivity is the heartbeat of mobility, cloud applications, voice, video, and social media, not to mention home automation, IoT, and smart cities. As a result, service providers and operators are investing heavily in infrastructure, each claiming its 5G network is the largest, fastest, or most reliable.

Initiatives like the Rural Digital Opportunity Fund are aimed at bridging the digital divide and fast-tracking investment to deploy high-speed fixed broadband service to rural areas and small businesses that lack it. The Federal Communications Commission’s $20.4 billion program requires that networks stand the test of time by prioritizing higher network speeds and lower latency.

A key element in the implementation of RDOF-backed projects is broadband mapping. The Federal Communications Commission is in the process of updating its current broadband maps with more detailed and precise information on the availability of fixed and mobile broadband services. The Broadband Deployment Accuracy and Technological Availability Act, signed into law in March 2020, requires the FCC to change the way broadband data is collected, verified, and reported.

As carriers build, expand, and upgrade their fiber network infrastructure, a great deal of planning is required, along with documenting the intricacies of design and engineering processes.

Streamlining and automating network planning and design processes through software can deliver accurate and timely network info for service providers, increase efficiency, and create opportunities for reducing costs.

GIS-based systems are replacing volumes of paper and outdated static CAD, Excel, and Visio files. They offer sophisticated tools to manage all aspects of network design and infrastructure management. In our experience working with many service providers, those that use GIS applications are able to reduce design time by 80 percent and drastically cut other capital expenditures.

Automation is key

Having to rely on manual processes to manage the fiber network makes it increasingly difficult to scale. Fortunately, introducing automation into the network management process, built on an accurate physical network inventory combined with geographic information system mapping, makes scaling a much easier task.

Continuous planning and engineering tasks can ultimately become automated through software implementation. Automating network fiber management creates significant business value by shifting a service provider’s approach from reactive to proactive. A comprehensive, up-to-date database of the network architecture allows for quick scenario analysis and capacity planning, as sketched below. Sharing automated processes across different organizations becomes much simpler and improves collaboration while reducing errors. Staff can then shift their focus to more pressing operational activities, making the network more reliable.
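
To make the scenario-analysis idea concrete, here is a minimal Python sketch of a capacity check against a fiber inventory. It is an illustration only: the data model, field names, and numbers are invented, not drawn from any particular GIS or inventory product.

```python
from dataclasses import dataclass, field

@dataclass
class FiberSegment:
    """One splitter-fed segment in a hypothetical network inventory."""
    segment_id: str
    splitter_ports: int                      # total ports on the splitter
    subscribers: list = field(default_factory=list)

    def free_ports(self) -> int:
        return self.splitter_ports - len(self.subscribers)

    def can_serve(self, new_drops: int) -> bool:
        """Capacity-planning check: can this segment absorb new demand?"""
        return self.free_ports() >= new_drops

# Scenario analysis: which segments could absorb a planned 8-home expansion?
inventory = [
    FiberSegment("SEG-001", splitter_ports=32, subscribers=["home"] * 30),
    FiberSegment("SEG-002", splitter_ports=32, subscribers=["home"] * 12),
]
for seg in inventory:
    print(seg.segment_id, "can serve 8 more homes:", seg.can_serve(8))
```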

Integration between different systems

Whether it is your enterprise GIS or your outage monitoring system, it should be easy to interact with third-party systems to get the most out of the network data. Ideally, you should be able to receive an outage notification and use its location to trace the network and pinpoint the root cause, so you can act quickly and resolve the situation before customers notice. This can save time and money and help guarantee customer satisfaction.
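
A minimal sketch of that workflow, under stated assumptions: the outage notification carries coordinates, and the “GIS” is a toy in-memory table of splice-point locations. A production system would query the enterprise GIS and trace actual cable paths instead.

```python
import math

# Hypothetical inventory: segment id -> (latitude, longitude) of a splice point.
SPLICE_POINTS = {
    "SEG-001": (41.8781, -87.6298),
    "SEG-002": (41.9000, -87.6500),
}

def nearest_segment(outage_lat: float, outage_lon: float) -> str:
    """Map an outage notification's location to the closest inventory asset."""
    def distance(point):
        return math.hypot(point[0] - outage_lat, point[1] - outage_lon)
    return min(SPLICE_POINTS, key=lambda seg: distance(SPLICE_POINTS[seg]))

# An outage alert arrives with a location; find the likely root-cause asset.
print(nearest_segment(41.88, -87.63))  # -> SEG-001
```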

Mobilize network data and increase field worker productivity

Utilizing a fiber network planning solution enables network information to be shared easily and quickly between the field and the office, giving workers access to the information they need, when they need it. Enterprise-wide access can provide timely and accurate network information for a wide range of communications service providers.

For service providers, expanded visibility into a network yields greater overall awareness of it. Automating third-party data exchange processes around an accurate and up-to-date inventory can optimize performance for field workers and guarantee customer satisfaction. Improved access to data can increase ROI by allowing cable locators and field techs to receive accurate confirmation before they arrive at a job. In the end, there will be fewer mistakes, which means happier customers.

The right tools can result in improved scalability, reduced time to revenue, lower operational costs, and actionable insights that can be gleaned from network data.

Jeff Miller serves as President and CEO of Synchronoss Technologies. He previously served as President of IDEAL Industries Technology Group, following 16 years with Motorola Mobility, where he was Corporate Vice President of North America. Miller also serves on the Board of 1871, Chicago’s largest start-up incubator, and on the non-profit Boards of Aspire Chicago and Junior Achievement. This article is exclusive to Broadband Breakfast.

Broadband Breakfast accepts commentary from informed observers of the broadband scene. Please send pieces to commentary@breakfast.media. The views reflected in Expert Opinion pieces do not necessarily reflect the views of Broadband Breakfast and Breakfast Media LLC.

Expert Opinion

Dmitry Sumin: What to Do About Flash Calls, the New SMS Replacement

Why are flash calls on the rise and how do operators handle them to maximize revenue?

The author of this Expert Opinion is Dmitry Sumin, Head of Products at AB Handshake Corporation.

Chances are you’ve received several flash calls this week when registering for a new app or verifying a transaction. Flash calls are almost instantly dropped calls that deliver one-time passcodes to users, verifying their phone numbers and actions. Many prominent apps and companies, such as Viber, Telegram, WhatsApp, and TikTok, use flash calls as a cheaper, faster, and more user-friendly alternative to application-to-person SMS.

With flash call volume expected to increase 25-fold from 2022 to 2026, from five billion to 130 billion calls, it’s no wonder they’re a hot topic in the telecom industry.

But what’s the problem, you may ask?

The problem is that there is currently no way for operators to bill zero-duration calls. This means operators earn no termination revenue from flash calls, which nonetheless load their networks. What’s more, operators lose SMS termination revenue as businesses switch to flash calls. SMS business messaging accounted for up to five percent of total operator-billed revenue in 2021, so you can see the scale of the potential losses for operators.
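
To see why zero-duration traffic slips through conventional billing, consider this minimal sketch of duration-based rating; the per-minute rate is illustrative, not an actual tariff.

```python
def rate_call(duration_seconds: int, rate_per_minute: float = 0.01) -> float:
    """Conventional duration-based rating: billed minutes times the rate."""
    return (duration_seconds / 60) * rate_per_minute

# A flash call rings and is dropped before it is ever answered, so it
# connects for zero seconds and rates to nothing:
print(rate_call(0))    # 0.0  -- no termination revenue
print(rate_call(120))  # 0.02 -- an ordinary two-minute call, for comparison
```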

In this article, I’ll discuss why flash calls are on the rise, why it’s difficult to detect and monetize them, and what operators can do about this.

Why are flash calls overtaking SMS passcodes?

Previously, application-to-person SMS was a popular way to deliver one-time passwords. But enterprises and communication service providers are increasingly switching to flash calls because they have several disruptive advantages over SMS.

First and foremost, flash calls are considerably cheaper than SMS, sometimes costing up to eight times less. Cost of delivery is, of course, a prime concern for apps and enterprises.

Second, flash calls ensure smooth user interaction, which boosts user satisfaction and retention. On Android devices, mobile apps automatically extract flash call passcodes, making the two-factor authentication process fast and frictionless. In comparison, SMS passcodes require users to read the message and sometimes enter the code manually.
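
The extraction works because the one-time code is encoded in the calling number itself, so the app only has to read the incoming caller ID. Below is a simplified Python sketch of the matching logic; the digit-suffix scheme is an assumption for illustration, and a real app would hook into the device’s call-handling APIs rather than receive the number as a string.

```python
def extract_passcode(incoming_number: str, code_length: int = 4) -> str:
    """Take the last digits of the calling number as the one-time code."""
    digits = "".join(ch for ch in incoming_number if ch.isdigit())
    return digits[-code_length:]

def verify(incoming_number: str, expected_code: str) -> bool:
    return extract_passcode(incoming_number, len(expected_code)) == expected_code

# The verification service dials from +1-555-013-7429, expecting code 7429;
# the app reads the caller ID, matches the suffix, and drops the call.
print(verify("+1-555-013-7429", "7429"))  # True -- user verified
```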

Third, flash calls reach users within 15 seconds on average, while SMS messages sometimes take 20 seconds or longer. The delivery speed of flash calls also improves the user experience.

The problem: Flash calls erode operators’ SMS revenues

While offering notable advantages for apps, flash call service providers, and end users, flash calls create numerous challenges for operators and transit carriers.

As we discussed before, flash calls erode operators’ SMS revenues because much of the new flash call traffic will be shifted away from current SMS business messaging. The issue is only going to become more pressing as the volume of flash calls grows.

So from the operator’s standpoint, flash calls reduce revenue, disrupt relations with interconnect partners, and overload networks. However, there is still no industry consensus on how to handle flash calls: block them like spam and fraudulent traffic or find a monetization model for this verification channel, like for application-to-person SMS.

Accurate detection of flash calls is a challenge

The first crucial step that gives operators the upper hand is accurately detecting flash calls.

This is difficult because operators have no way of discerning legitimate verification flash calls from fraud schemes that rely on dropped calls, such as wangiri. The wangiri fraud scheme uses instantly dropped calls to trick users into calling back premium-rate numbers. In addition, flash calls need to be distinguished from genuine missed calls placed by customers.

The problem is that even advanced AI-powered fraud management systems struggle to accurately differentiate between various zero-duration calls. The task requires AI engines to be trained on large volumes of relevant traffic coupled with analysis of hundreds of specific call parameters.
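
As a toy illustration of why this is a classification problem at all (and nothing like the trained AI engines just described), the sketch below applies two hand-written rules over a handful of assumed call parameters:

```python
def classify_zero_duration_call(call: dict) -> str:
    """Toy rules over a few illustrative parameters of a zero-duration call.

    Real fraud-management systems train models on hundreds of call
    parameters; this only sketches the shape of the problem.
    """
    if call["callee_called_back_premium_number"]:
        return "wangiri"      # dropped call baiting a premium-rate callback
    if call["from_known_verification_range"] and call["ring_time_s"] < 2:
        return "flash_call"   # instantly dropped, from a verification range
    return "missed_call"      # plausibly a genuine missed call

sample = {
    "callee_called_back_premium_number": False,
    "from_known_verification_range": True,
    "ring_time_s": 1,
}
print(classify_zero_duration_call(sample))  # flash_call
```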

Dedicated anti-fraud solutions are the answer

There are only a few solutions on the market that are capable of accurately distinguishing flash calls from other zero-duration calls. Dedicated fraud management vendors have made progress on this difficult task.

The highest accuracy of flash call detection now available on the market is 99.92 percent. Such tools allow operators to precisely determine the ranges from which flash calls are sent. As a result, operators can make an informed decision on how to treat flash calls to maximize revenue and can proactively negotiate with flash call providers.

Flash call detection creates new opportunities

Our team estimates that flash calls account for up to four percent of Tier 1 operators’ international voice traffic. Without accurate detection and a billing strategy, this portion of traffic overloads operators’ networks and offers no revenue. However, with proper detection, flash calls offer a new business opportunity.

Now is a crucial time for operators to start implementing flash call detection in their systems and capitalize on the trend.

There are a few anti-fraud solutions on the market that give operators all the necessary information to negotiate a billing agreement with a flash call provider. Once an agreement has been reached, all flash calls coming from this provider will be monetized, much like SMS.

All flash calls not covered by agreements can be blocked automatically. This will help to restore SMS revenues. Once a flash call has been blocked, subscribers will most likely receive an SMS passcode sent as a fallback.
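
Put together, the policy fits in a few lines. In this sketch the number ranges are hypothetical placeholders: calls from ranges covered by a billing agreement are rated and terminated, and everything else is blocked so the sender falls back to a billable SMS.

```python
# Hypothetical number-range prefixes covered by billing agreements.
AGREED_RANGES = {"+15550"}

def handle_flash_call(calling_number: str) -> str:
    """Monetize flash calls from contracted ranges; block the rest."""
    if any(calling_number.startswith(prefix) for prefix in AGREED_RANGES):
        return "rate_and_terminate"  # billed per the agreement, like A2P SMS
    return "block"                   # sender falls back to a billable SMS OTP

print(handle_flash_call("+15550137429"))   # rate_and_terminate
print(handle_flash_call("+441230000000"))  # block
```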

Moreover, modern solutions don’t affect any legitimate traffic because they only block selected ranges. This also helps to prevent revenue loss.

Essentially, the choice of how to handle flash calls comes down to each operator. However, without a powerful anti-fraud solution capable of accurately detecting flash calls in real time, it’s nearly impossible to monetize flash calls effectively and develop a billing strategy.

Dmitry Sumin is the Head of Products at the AB Handshake Corporation. He has more than 15 years of experience in international roaming, interconnect and fraud management. Since graduating from Moscow State University, he has worked for both vendors and network operators in the MVNO and telecommunications market. This piece is exclusive to Broadband Breakfast.

Broadband Breakfast accepts commentary from informed observers of the broadband scene. Please send pieces to commentary@breakfast.media. The views reflected in Expert Opinion pieces do not necessarily reflect the views of Broadband Breakfast and Breakfast Media LLC.

Expert Opinion

Björn Capens: Strong Appetite for Rural Broadband Calls for Next Generation Fiber Technology

The first operator to bring fiber to a community creates a significant barrier to entry for competitors.

The author of this Expert Opinion is Björn Capens, Nokia Fixed Networks European Vice President

In July, the Biden-Harris administration announced another $401 million in funding for high-speed Internet access in rural America. This was just the latest in a string of government initiatives aimed at helping close the US digital divide.

These initiatives have been essential for encouraging traditional broadband providers, communities, and utility companies to deploy fiber to rural communities, with governments cognizant of the vital role broadband connectivity plays in sustaining communities and improving socio-economic opportunities for citizens.

Yet there is still work to do, even in countries with the most advanced connectivity options. For example, fixed broadband is missing from almost 30 percent of rural American homes, according to Pew Research. It’s similar in Europe, where a recent European Commission Digital Divide report found that roughly 18 percent of rural citizens can only get broadband speeds of at most 30 Mbps, a speed that struggles to cope with modern digital behaviors.

Appetite for high-speed broadband in rural areas is strong

There’s no denying the appetite for high-speed broadband in rural areas. The permanent increase in working from home and the rise of modern agricultural and Industry 4.0 applications mean that there’s an increasingly attractive business case for rural fiber deployments – as the first operator to bring fiber to a community creates a significant barrier to entry for competitors. 

The first consideration, then, for a new rural fiber deployment is which passive optical network technology to use. Gigabit PON seems like an obvious first choice, being a mature and widely deployed technology. 

However, GPON services are a standard offering for nearly every fiber broadband operator. Because PON is a shared medium, with usually up to 30 users each taking a slice of its capacity, it’s easy to see how a few Gigabit customers can quickly max out the network, and with the ever-increasing need for speed, it’s widely held that GPON will not be sufficient by about 2025.
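
The arithmetic behind that concern is simple. A quick sketch using GPON’s roughly 2.5 Gbps of downstream capacity and a 30-way split (figures approximate):

```python
GPON_DOWNSTREAM_GBPS = 2.5  # nominal GPON downstream capacity, approximate
SPLIT = 30                  # subscribers typically sharing one PON

# If subscribers buy 1 Gbps tiers, two or three heavy users saturate the PON:
print(GPON_DOWNSTREAM_GBPS / 1.0)           # 2.5 concurrent gigabit streams
# And at full load, the guaranteed share per subscriber is modest:
print(GPON_DOWNSTREAM_GBPS / SPLIT * 1000)  # ~83 Mbps per subscriber
```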

XGS-PON is an already mature technology

The alternative is to use XGS-PON, a more recent, but already mature, flavor of PON with a capacity of 10 Gigabits per second. With this greater capacity, broadband operators can generate higher revenues with more premium-tier residential services as well as lucrative business services. There’s even room for additional services to run alongside business and residential broadband. For example, the same network can carry traffic from 4G and 5G cells, known as mobile backhaul. That’s either a new revenue opportunity or a cost saving if the operator also runs a mobile network.

This convergence of different services onto a single PON fiber network is starting to take off, with fiber-to-the-home networks evolving into fiber for everything, where homes, businesses, industries, smart cities, mobile cells and more are all running on the same infrastructure. This makes the business case even stronger. 

Whether choosing GPON or XGS-PON, the biggest cost contributor is the same for both: deploying the outside fiber plant. Therefore, the increased cost of XGS-PON over GPON is far outweighed by the capacity increase it brings, making XGS-PON the clear choice for a brand-new fiber deployment. XGS-PON protects this investment for longer, as its higher capacity makes it harder for new entrants to offer a superior service.

It also doesn’t need to be upgraded for many years, and when it comes to the business case for fiber, it pays to take a long-term view. Fiber optic cable has a lifespan of 75 years or more, and even as the speeds running over the fiber increase, the cable itself can remain the same.

Notwithstanding these arguments, fiber still comes at a cost, and operators need to carefully manage those costs in order to maximize returns. 

Recent advances in fiber technology allow operators to take a pragmatic approach to their rollouts. In the past, each port on a PON server blade could deliver only one technology. Multi-PON, by contrast, has multiple modes: GPON only, XGS-PON only, or both together. It even has a forward-looking 25G PON mode.

This allows an operator to easily boost speeds as needed with minimal effort and additional investment. GPON could be the starting point for fiber-to-the-home services, XGS-PON could be added for business services, or even a move to 25G PON for a cluster of rural power users, like factories and modern warehouses – creating a seamless, future-proof upgrade path for operators. 
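
Schematically, the modes combine per port, as in the sketch below. The mode names are invented for illustration and do not correspond to any vendor’s actual configuration syntax.

```python
from enum import Flag, auto

class PonMode(Flag):
    GPON = auto()
    XGS_PON = auto()
    PON_25G = auto()

# Start with GPON for homes, add XGS-PON for businesses on the same port...
port_mode = PonMode.GPON | PonMode.XGS_PON
# ...and later enable 25G PON for a cluster of rural power users.
port_mode |= PonMode.PON_25G
print(port_mode)
```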

The decision not to invest in fiber presents a substantial business risk

Alternatively, there’s always the option for a broadband operator to stick with basic broadband in rural areas and not invest in fiber. But that actually presents a business risk, as any competitor that decides to deploy fiber will inevitably carve out a chunk of the customer base for themselves. 

Besides, most operators are not purely profit-driven; they too recognize that prolonging the current situation in underserved communities is untenable. High-speed broadband makes areas more attractive for businesses, creating more jobs and stemming population flows from rural to urban centers.

Rural broadband not only improves lives; it also decreases the world’s carbon emissions, both directly, compared with alternative broadband technologies, and indirectly, by enabling online and remote activities that would otherwise involve transportation. These social and economic benefits of fiber are highly regarded by investors and stockholders who have corporate social responsibility high on their agendas.

With the uber-connected urban world able to adopt every new wave of bandwidth-hungry application – think virtual reality headsets and the metaverse – rural communities are actually going backwards in comparison. The way forward is fiber and XGS-PON. 

Björn Capens is Nokia Fixed Networks European Vice President. Since 2017, Capens has been leading Nokia’s fixed networks business, headquartered in Antwerp, Belgium. He has more than 20 years of experience in the fixed broadband access industry and holds a Master’s degree in Electrical Engineering, Telecommunications, from KU Leuven. This piece is exclusive to Broadband Breakfast.

Broadband Breakfast accepts commentary from informed observers of the broadband scene. Please send pieces to commentary@breakfast.media. The views reflected in Expert Opinion pieces do not necessarily reflect the views of Broadband Breakfast and Breakfast Media LLC.
