
Expert Opinion

Daniel Hanley: Google and Facebook Are Essential, Let’s Regulate Them That Way



The author of this Expert Opinion is Daniel Hanley, a policy analyst at the Open Markets Institute

Google and Facebook have extraordinary control over information and communications systems in the United States. The two corporations dominate internet search, social media, and digital advertising, and each serves as a gateway to the internet for billions of people. They have even become the dominant sources of news for most Americans, and they have maintained their dominant positions in these markets for more than a decade.

These two corporations can – and do – arbitrarily exercise their unrivaled monopoly power to suppress competition and crush smaller, dependent rivals. They have locked rivals out of their platforms and manipulated those platforms to favor their own services at the expense of smaller, dependent competitors. Without access to these platforms, and to the data behind them, dependent competitors can find themselves unable to compete effectively, build desirable applications for consumers, provide critical product features, or even catch the attention of potential customers.

But antitrust enforcers can stop this harmful concentration of power with laws already on the books, by declaring Facebook and Google essential facilities. The Federal Trade Commission or Department of Justice could litigate an antitrust case designating Facebook and Google as essential facilities, or the courts could do so in response to a lawsuit from private citizens.

The goal of regulating corporations as essential facilities is to ensure fair competition by decreasing the power of dominant firms over smaller firms. An essential facilities designation would mandate that rivals have equal access to the corporation’s facilities or ensure that dependent firms are charged equal prices for the corporation’s goods or services.

The role of the essential facilities doctrine in antitrust

The essential facilities doctrine is based on the principle that a dominant firm should not be allowed to deny rivals access to its infrastructure. When a dominant firm is deemed an essential facility, it loses the ability to choose which firms to do business with, because access to its facilities is necessary for competition to exist in the first place.

Historically, the essential facilities doctrine has been applied by courts and other antitrust enforcers to critical aspects of infrastructure, such as railroads, trucking, electrical facilities, news syndicates, and telecommunications firms, including telephone and telegraph companies. By withholding access to a platform, such as communications wires or railroad tracks, dominant companies stymied competitors, entrenched their monopoly positions, and often extended their dominance to adjacent markets.

Federal and state legislators have also deployed a similar regulatory designation. Legislators have designated some industries, such as hospitals, pipelines, and electrical plants, as natural monopolies, and so these industries were managed as public utilities. Similar to an essential facilities designation, lawmakers decided that they had to classify certain firms and industries as public utilities because of the difficulty of supporting competition in the industry and because of the likelihood that a dominant corporation would crush smaller rivals or exclude dependents.

Similar to railroad tracks and telephone wires, the source of Google’s and Facebook’s dominance is that each controls critical gateways to the internet – internet search and social networking, respectively – and controls extensive, unparalleled, and nonreplicable data collection infrastructure that they have woven into every aspect of the internet, far beyond their own websites.

Facebook’s familiar Like button is embedded in more than 8 million websites, and the corporation collects extensive data both on users who visit pages containing the button and on users who click it.

Google has embedded tracking code in 85 percent of websites and 94 percent of Android Play Store applications. Google’s information collection efforts are so extensive and frequent that the corporation can determine whether a user is running or walking.

Google and Facebook’s data repositories serve as a choke-point

Google’s and Facebook’s data repositories are so extensive they have become critical avenues for academic research, and they form the foundation for countless software applications and enhanced software features such as frictionless user sign-on. Google’s and Facebook’s data also provide these corporations with the ability to engage in highly targeted advertising campaigns to attract the attention of the right audiences to use or purchase an advertised product or service. Access to Google’s and Facebook’s data infrastructure is necessary for any internet upstart to become a viable company in the technology sector.

Google and Facebook exploit their duopoly control over critical information and communications systems, using anti-competitive practices against current and potential rivals. Google and Facebook have routinely denied access to their essential data troves and platforms, for example. In 2013, Facebook CEO Mark Zuckerberg personally approved revoking the video application Vine’s access to Facebook’s Friends List. This cut off Vine’s access to the dominant social network, which it needed to reach potential users to become a viable competitor.

This was not the first time that Facebook abused its power. Internal Facebook documents reveal that the corporation has routinely used access to its data as a bargaining chip to leverage its dominance over potential rivals to win favorable partnerships. A recently filed class action alleges similar conduct, as the lawyer representing the plaintiffs against Facebook says that “Facebook deliberately leveraged its developer platform, an infrastructure of spyware and surveillance, and its economic power to destroy or acquire anyone that competed with them.”

Google’s anti-competitive strategy of demoting links of rival sites

Google engages in similarly anti-competitive practices. In an internal memo accidentally sent to The Wall Street Journal, FTC staff stated that Google “adopted a strategy of demoting or refusing to display” links to certain rival websites. The report concluded that Google’s conduct resulted in “real harm to consumers and to innovation.”

Google and Facebook have also repeatedly abused their dominant market positions to engage in self-dealing – promoting their own products over those of rivals. For example, a 2017 analysis by The Wall Street Journal found that 91 percent of 25,000 product searches on Google featured Google products in the first advertisement slot, and 43 percent featured Google products in the top two advertisement slots. An analysis conducted by The Markup on Tuesday showed that 41 percent of the results on a Google search page were for Google’s own services.

This ability to manipulate its dominant platforms ultimately gives Google’s services an unbeatable comparative advantage, suppresses competition, and snuffs out the innovations of alternative services. In short, Google and Facebook can leverage their platforms to pick the winners and losers in the marketplace – and they pick themselves whenever possible. Consumers are deprived of alternative services that have better features, and we are all deprived of the innovations that fair and robust competition would provide.

The COVID-19 pandemic is only heightening users’ dependence on Google and Facebook

In sum, Facebook’s and Google’s services are as critical to both work and leisure as public utilities and the telephone were to energy and communications in the 20th century. The COVID-19 pandemic has only exacerbated users’ dependence on these services. Facebook’s user base has grown by more than 11 percent since last year, and the number of monthly active users has increased by 10 percent.

The number of Google Classroom users doubled to 100 million in March alone. Google’s videoconferencing service has seen a 30-fold increase in use since January. News organizations have flocked to YouTube to broadcast their content, evidenced by a 75 percent increase in the number of users watching news from certain outlets.

Our increased reliance on these platforms only strengthens the case for essential facilities regulation. This designation would promote additional accountability and scrutiny of Google’s and Facebook’s conduct, ensuring that their policies are equitable and fair for all users and that rival platforms cannot be arbitrarily blocked. Such accountability may also help lawmakers understand the effects of Google’s and Facebook’s conduct and prompt additional regulatory action, such as limiting the privacy invasions of their panopticons of data collection and ad targeting.

The testimony during the House of Representatives’ investigation into online platforms on Wednesday revealed the depth of lawmakers’ concern with the economy-wide repercussions resulting from any decision made by Google and Facebook without public oversight. An essential facilities designation from enforcers would resolve most of their concerns.

The FTC has applied an essential facilities-like designation to corporations with far less power than Facebook and Google. In January, a federal judge refused to dismiss an FTC case against Surescripts, the dominant provider of “must-have” e-prescribing software used by medical professionals and pharmacies, for using exclusionary contracts to deny rivals access to its essential platform. In deeming Surescripts “must-have,” the FTC argued that it is an essential service for both prescribers and pharmacies.

Also, in February, the Seventh Circuit Court of Appeals affirmed the use of a closely related legal doctrine against Comcast for leveraging its dominant position to exclude Viamedia from access to cable-television advertising services. Comcast’s actions caused Viamedia’s customers to abandon it as a supplier of advertising. Viamedia’s situation is similar to how Facebook and Google shut off access to their platforms and data, preventing rivals from becoming viable competitors.

Google and Facebook are facing at least five antitrust investigations, including a review of Google’s and Facebook’s business operations by the House of Representatives. These investigations should provide rich evidence to support later antitrust litigants such as private parties, federal agencies, and state attorneys general to impose essential facilities requirements on Google and Facebook.

This designation would invigorate competition in the sectors dominated by Google and Facebook, and it would restrain them from abusing their dominant market power to stifle competition and harm rivals.

Daniel A. Hanley is a policy analyst at the Open Markets Institute. You can follow him on Twitter @danielahanley. This piece is exclusive to Broadband Breakfast.

Broadband Breakfast accepts commentary from informed observers of the broadband scene. The views reflected in Expert Opinion pieces do not necessarily reflect the views of Broadband Breakfast and Breakfast Media LLC.

Broadband Breakfast is a decade-old news organization based in Washington that is building a community of interest around broadband policy and internet technology, with a particular focus on better broadband infrastructure, the politics of privacy and the regulation of social media. Learn more about Broadband Breakfast.


Broadband Mapping & Data

Tom Reid: Accountability in Broadband Maps Necessary for BEAD to Achieve Mission

The sheer magnitude of the overstatements in the FCC’s map makes the challenge process untenable.



The author of this Expert Opinion is Tom Reid, president of Reid Consulting Group.

With millions of American households stranded in the digital desert, we need to achieve accountability in broadband to make sure the Broadband Equity, Access and Deployment funding achieves its mission. The broadband gaps can be readily identified despite the air of mystery surrounding the topic.

Broadband improvements have been constrained for decades by inaccurate maps, yet the Federal Communications Commission continues to accept dramatically exaggerated availability and capacity claims from internet service providers. The cumbersome challenge process requires consumers and units of government to prove a negative — a logical fallacy.

The Reid Consulting Group and other parties, including Microsoft, have developed robust algorithms to reliably identify actual broadband availability. RCG utilizes Ookla Speedtest Intelligence data due to the large quantity of consumer-initiated tests. In Ohio, as an example, we draw on more than 16 million speed tests reflecting the lived experience from millions of households. We combine the speed test findings with FCC and Census data to deliver irrefutable identification of unserved and underserved locations.

Such methodologies offer State Broadband Leaders the opportunity to reverse the burden of proof in the BEAD program, requiring that ISPs submit concrete evidence supporting their availability and speed claims. As an example, in Ohio, RCG’s maps were accepted as proof of unserved status for the 2022 state grant program. BroadbandOhio then required ISPs to submit substantial proof in their challenge process. In other words, the ISPs were tasked with proving a positive instead of citizens being expected to prove a negative.

ISPs and the FCC denounce crowdsourced data unless conducted under unusually restrictive conditions. The ISPs have successfully promoted unsubstantiated myths regarding the value of consumer-initiated speed tests.

Myth: Bad tests are because of poor Wi-Fi.
Reality: RCG eliminates speed tests with weak Wi-Fi and includes GPS-enabled wired devices. Even first-generation Wi-Fi would saturate a 25 Megabits per second download and 3 Mbps upload connection.

Myth: Residents only subscribe to low-speed packages.
Reality: According to the National Rural Electric Cooperative Association, in areas where rural electric cooperatives offer broadband, 25 to 33 percent of rural subscribers opt for the top speed tier offered. We can clearly see this trend in areas where fiber has been deployed in recent years, as described later in this article.

Myth: People only test when there is a problem.
Reality: Network problems prompt tests, but so do resolutions of those problems. RCG recommends focusing on the maximum speed test results to eliminate this “unhappy customer effect.”
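The location-rating approach described in these rebuttals – score each location on the best speed it has ever demonstrated – can be sketched in a few lines. This is an illustrative simplification, not RCG’s actual pipeline: the thresholds are the familiar 25/3 and 100/20 Mbps definitions, and all function and field names are hypothetical.

```python
# Illustrative sketch: rate locations by the maximum speed ever observed,
# using BEAD-style thresholds (unserved below 25/3 Mbps, underserved below
# 100/20 Mbps). Not RCG's actual methodology or data schema.

def classify(max_down, max_up):
    """Map a location's best-ever speeds to a service rating."""
    if max_down < 25 or max_up < 3:
        return "unserved"      # "red" on the maps
    if max_down < 100 or max_up < 20:
        return "underserved"   # "orange"
    return "served"            # "green"

def rate_locations(tests):
    """tests: iterable of (location_id, down_mbps, up_mbps) tuples."""
    best = {}
    for loc, down, up in tests:
        d, u = best.get(loc, (0.0, 0.0))
        best[loc] = (max(d, down), max(u, up))  # keep the best result ever seen
    return {loc: classify(d, u) for loc, (d, u) in best.items()}

tests = [
    ("A", 12.0, 1.5), ("A", 30.0, 2.0),    # never clears 3 Mbps up: unserved
    ("B", 80.0, 10.0), ("B", 95.0, 18.0),  # clears 25/3 but not 100/20
    ("C", 240.0, 35.0),                    # clears 100/20: served
]
print(rate_locations(tests))  # {'A': 'unserved', 'B': 'underserved', 'C': 'served'}
```

Taking the per-location maximum, rather than the average, deliberately gives the ISP the benefit of the doubt: a location is only flagged unserved if no test there has ever cleared the bar.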

Finding the truth: Broadband and the lived experience

In Ohio, RCG analyzed more than 14 million consumer-initiated speed tests over a three-year period. The data reveals a clear pattern of carrier overstatement. The stark visual contrast between the two maps is hard to ignore — and while this study is focused on Ohio, the issue remains nationwide in scope. The sheer magnitude of the overstatements makes the FCC challenge process untenable.

Figure 1: Ohio Broadband Reality vs. FCC ISP stated coverage map.

RCG utilized the “maximum speeds ever seen” at a location for generating maps and coverage figures, but we also examined results based on the average of speed tests. Switching between average and maximum speeds does not change the overall picture of broadband availability. As an example, Figure 2 focuses on an area around Bolivar, Missouri. Looking at maximum speeds turns Bolivar itself a deeper green, meaning “better served,” but the rural areas around Bolivar remain predominantly red, meaning “unserved.” The preponderance of evidence clearly demonstrates that much of the rural area around Bolivar remains unserved, even at maximum speeds.

Figure 2: Map visualization illustrating the difference between viewing average speeds in the Bolivar, Missouri area and maximum speeds documented.

When rating broadband availability in the Bolivar area at the Census block level and overlaying with ISP coverage claims at the H3 R8 level, you can see that many of the unserved and underserved areas have been reported as served to the FCC by ISPs (Figure 3).

Figure 3: Carrier overstatement small scale in Bolivar, Missouri. RCG speed map with FCC H3 R8 hexagon overlay.

Zooming out to examine the entirety of Missouri (Figure 4), the pattern of ISP overstatement becomes quite clear. According to the FCC maps, most of the state is served, whereas the analysis conducted by RCG shows that significant areas remain in need of broadband investment. As with Ohio, the scope of the overstatement in Missouri presents an unreasonable burden on the public to challenge.

Figure 4: Missouri reality vs. ISP Reports, March 2023.

Showing progress: Change-of-state analysis

Change-of-state analysis taps progressive releases of Ookla records to identify areas where broadband speeds have set new highs. This approach works not only for grant-funded projects but also for private investments. The area surrounding Byesville, Ohio (Figure 5) reveals a significant uptick in test volume, test locations, and speeds from 2020 to 2022. Side-by-side comparison shows a large number of “green” (served) speed test locations where there used to be only “red” (unserved) and “orange” (underserved) results. This change is a direct result of a Charter Communications Rural Digital Opportunity Fund deployment.

Figure 5: The unserved area around Byesville, Ohio before and after broadband deployment.

State Broadband Leaders can use these capabilities to document progress and identify lagging projects. Any service area will always exhibit a mix of speed test results.  Even in an area like Byesville where fiber-to-the-home has been deployed, not all the location “dots” will turn green. However, the preponderance of evidence clearly shows that a funded ISP — in this case, Charter — has made good on its commitment to expanded broadband access. ISPs can help by conducting speed tests at the time of installation from the customer’s premises and by increasing minimum packages to 100/20 Mbps or higher.

There is no mystery to solve — we know how to identify areas lacking broadband services. For many rural Americans, even their telephone services have become unreliable, still dependent on the now-decrepit copper cables built in the 1940s through 1960s. We all depend on a healthy rural economy for our food, water and energy. Let’s make the commitment to build the infrastructure needed to bring these households into the internet age — starting by bringing reality and accountability to the availability maps.

Tom Reid is the president of Reid Consulting Group, a firm specializing in broadband. The firm works with clients to generate insights, create actionable plans, and identify funding sources to connect unserved and underserved areas. RCG’s engagements in eight states have delivered 6,000 miles of fiber construction with a total project value of $1.6 billion and have secured over $330 million in grant funds on behalf of clients. This piece is exclusive to Broadband Breakfast.



Expert Opinion

Johnny Kampis: Broadband Industry Hopeful to Get Waivers from Biden Administration Protectionist Policies

The Buy America mandate could seriously hamper the Broadband Equity, Access and Deployment program.



The author of this Expert Opinion is Johnny Kampis, director of telecom policy for the Taxpayers Protection Alliance.

In a presidential administration rife with protectionist policies, the broadband internet industry is optimistic it will receive waivers from the “Buy America” mandate that threatens to derail plans to close the digital divide.

The National Telecommunications and Information Administration is likely to announce state funding allocations for the $42.5 billion Broadband Equity, Access and Deployment program by the end of June. That is the biggest piece of the taxpayer-funded pie allocated by Congress to extend broadband infrastructure across the U.S. over the next several years.

But, as the Taxpayers Protection Alliance has reported, broadband industry leaders say the Buy America mandate could seriously hamper the effort. As part of the mandate, the Biden administration has said that at least 55 percent of the component parts of a product used in federal construction projects must be sourced domestically. That rule applies to any infrastructure project, but broadband has taken center stage recently with the BEAD funding imminent.

Because fiber-optic cables used in broadband infrastructure projects include materials such as aluminum, copper, glass, plastic and steel that are primarily manufactured in other countries, they would be forbidden under the current rules. And many other important cogs in the broadband machine, such as routers and switches, are mostly made overseas. Even the left-leaning Brookings Institution noted the policy could put broadband deployments “at risk.”

Fortunately, the Biden administration is softening on its Buy America policies — at least in the broadband industry. NTIA chose earlier this month to exempt several categories of equipment, such as broadband routing equipment, transceivers and antennas, from the domestic manufacturing requirements in the Enabling Middle Mile Infrastructure Program. The agency said that although “there are public and private efforts underway to increase manufacturing capacity… industry will not be able to address shortages of the manufactured products and construction materials required for middle mile network deployment within the timeframes required.”

Broadband Breakfast pointed out in a recent article that it will take several years to ramp up production of semiconductors in the U.S. and the BEAD program has set a five-year timeline for project completion.

“The estimates are that it would take at least, at a minimum, three to five years to bring a semiconductor chip plant to the U.S.,” said Pam Arluk, vice president of NCTA – The Internet & Television Association. “And even though the BEAD program is going to be over several years, that’s still just not enough time.”

The inherent difficulties in meeting the Buy America mandate, and the precedent now set with the middle mile program, provide optimism that waivers will likely be offered with BEAD. But that is just one of many infrastructure programs now being funded by taxpayers through federal recovery programs.

As President Joe Biden said in his State of the Union address in February: “American-made lumber, glass, drywall, fiber optic cables… on my watch, American roads, American bridges, and American highways will be made with American products.”

Washington Post columnist Fareed Zakaria pointed out that what he calls the “Biden Doctrine” violates the spirit of the World Trade Organization and its framework of open trade. And another Post columnist, former Clinton administration Treasury Secretary Lawrence Summers, noted that protectionist policies tend to hurt more people than they help — giving as an example steel tariffs that aided 60,000 steel workers but threatened the jobs of 6 million workers in industries paying inflated prices for steel.

Strides in broadband waivers are a good sign, but the Biden administration must do more to curtail its protectionist policies as industries use economic recovery funds to build infrastructure in the coming years.

Johnny Kampis is director of telecom policy for the Taxpayers Protection Alliance. This piece is exclusive to Broadband Breakfast.



Expert Opinion

Angie Kronenberg: The FCC Must Act Now to Save the USF

While the USF remains vital in an ever more connected world, its survival is in serious jeopardy.



The author of this Expert Opinion is INCOMPAS President Angie Kronenberg.

Last week, the Senate Subcommittee on Communications, Media and Broadband held a hearing titled “The State of Universal Service.” The Universal Service Fund is our nation’s critical connectivity program that helps ensure that voice and broadband services are available and affordable throughout the country.

Since its creation by Congress in the 1996 Telecom Act, the USF has become a program that millions of families, community anchor institutions and small businesses rely on to get connected. It has been especially valuable for families and businesses that rely on it for work, school and telehealth at home.

The USF spends about $8.5 billion annually to help fund affordable connectivity in rural areas, low-income households, schools, libraries and rural hospitals. Today, the Federal Communications Commission is working to make high-speed broadband as ubiquitous as telephone service, and broadband is the essential communications technology the USF now supports.

While the USF remains vital in an ever more connected world, its survival is in serious jeopardy. To fund the programs, telecom providers are required to pay a certain percentage of their interstate and international telecom revenues, known as the “contribution factor.” Typically, telecom providers collect these USF fees from their customers on their monthly bills.

However, the telecom revenues that fund the USF have declined over 60 percent in the last two decades. As a result, the contribution factor has skyrocketed from about 7 percent in 2001 to a historic high of about 30 percent today, as a higher portion of telecom revenues is needed to sustain the fund. That means certain consumers and businesses are now paying an additional 30 percent on top of their phone bills in order to fund the USF.

Telecom revenues continue to decline rapidly because customers today rely more on broadband services and less on landline and mobile phone services, and broadband revenues do not pay into the USF. While the FCC has modernized each USF program to support broadband service, it has not modernized the funding mechanism to require broadband services to pay into the fund, even though the agency has historically required supported services to be included in the contribution system.

Without intervention, the contribution factor is predicted to rise to 40 percent by 2025. This is unsustainable and puts the stability of the entire USF at risk. In fact, the contribution factor has become so high that it has led some groups to challenge the USF in federal court as unconstitutional, which also threatens the sustainability of the USF.

Reforming the USF funding mechanism is urgently needed and long overdue

Over 340 diverse stakeholders have come together as the USForward Coalition calling on the FCC to move forward with USF reform by expanding the contribution base to include broadband revenues. This solution is based on the recommendation in the USForward Report (that INCOMPAS helped commission), which was written by USF expert and former FCC official Carol Mattey.

The USForward Report explains that the most logical way to reform the contribution system and sustain the USF is to include broadband revenues in its funding assessment. Under this approach, the contribution factor is estimated to fall to less than 4 percent. It also means that the services that get USF support are paying into it, rather than solely relying on telecom customers, including those that have not made the switch to broadband, such as older Americans.
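The arithmetic behind these percentages is simple division: the contribution factor is what the fund must raise divided by the revenues it is allowed to assess, so a shrinking base forces the factor up and a broader base pulls it down. A back-of-envelope sketch, using rough dollar figures implied by the article’s percentages rather than official FCC data:

```python
# Back-of-envelope sketch of the USF contribution-factor arithmetic.
# Dollar figures are rough numbers implied by the article's percentages,
# not official FCC data.

def contribution_factor(fund_needs, assessable_base):
    """Factor = what the fund must raise / the revenues it may assess."""
    return fund_needs / assessable_base

FUND = 8.5  # ~$8.5B in annual USF disbursements

# Today's ~30% factor implies an assessable telecom base of roughly $28B:
telecom_base = FUND / 0.30

# A shrinking base forces the factor up even if the fund's needs stay flat;
# a further 25% drop in the base alone would push the factor to ~40%:
print(f"{contribution_factor(FUND, telecom_base * 0.75):.0%}")  # prints 40%

# Expanding the base to include broadband revenues pulls the factor back down;
# a sub-4% factor implies an assessable base of over $200B:
print(round(FUND / 0.04))  # ≈ 212
```

The same mechanism explains the 40-percent-by-2025 projection cited above: no change in the fund’s needs is required, only continued erosion of the telecom revenue base.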

In fact, some members of Congress understand the urgency of reform and also want the FCC to act. The Reforming Broadband Connectivity Act, for example, is a bipartisan, bicameral bill that would require the FCC to reform the contribution system within one year.

Some question whether large tech companies should be assessed to contribute to the USF, and the short answer is “No.” Tech companies invest $120 billion each year in global internet infrastructure, and unlike broadband providers, these companies do not request or receive USF funding for these investments.

The FCC also lacks the authority to regulate tech companies and doing so would require Congress to act. This would further delay reform and expand the FCC’s regulatory authority over all online content and services — an overreach that many question as too broad since nearly every business today has an online presence and uses the internet to conduct business. Moreover, proposals to target certain tech companies risk skewing the online marketplace and competitive markets.

Some also question whether we still need the USF at all, and the short answer is “Yes.” While Congress allocated tens of billions for broadband, most of this investment is targeted for deployment, yet a significant portion of the USF programs focus on affordability. We not only have to make sure we build out our broadband networks, but also that communities can then afford to subscribe to these services.

The FCC should not wait to reform the USF. The USForward Report sets out a real plan that the FCC can and should implement. Congress should encourage the FCC to act now and save the nation’s critical connectivity program.

Angie Kronenberg is the president of INCOMPAS, where she manages the policy team and its work before federal, state and local governments, as well as leading the association’s efforts on membership and business development. This piece is exclusive to Broadband Breakfast.


