
Expert Opinion

Expert Opinion: New Domain Names are Coming, and Present Opportunities and Risks



On June 20, the Internet Corporation for Assigned Names and Numbers (ICANN) formally approved the program it has developed for creation of new generic top-level domains (gTLDs).  The new gTLD program will expand the domain name system beyond the current 22 generic top-level domain names such as .com, .net, and .org, to potentially include just about .anything and .everything to the “right of the dot” as top-level domains.  The new gTLDs will likely include generic and geographic TLDs such as .bike and .paris, as well as .brand registries that correspond to trademarks and company names such as .deloitte.

ICANN’s new gTLD program has raised a number of consumer and brand protection concerns over the many years the program has been under consideration, including from members of Congress at a recent hearing of the House Judiciary Subcommittee on Intellectual Property, Competition and the Internet (House Subcommittee Scrutinizes Possible Domain Name Expansion). ICANN has revised the proposed gTLD program guidelines on a number of occasions, and statements during the June 20 meeting made clear that ICANN’s leaders believe the organization has now successfully balanced the desire to expand the domain name system with the need to provide consumer and brand name protection mechanisms. While the nature and terms of the new gTLD program will likely remain subject to debate for years to come, ICANN has now provided a firm timetable for the program, and organizations of all sizes should evaluate its potential implications if they have not already begun to do so.

New gTLD Program Details

The policies governing the new program are detailed in ICANN’s 352-page gTLD Applicant Guidebook. While certain portions of the Guidebook remain subject to negotiation (most notably between ICANN and the Governmental Advisory Committee, or “GAC”), many components of the gTLD program appear to be firmly settled. Not just anybody will be eligible for a new gTLD under the rules: only established corporations, organizations, or institutions will even be considered, and individuals, sole proprietorships, and not-yet-formed companies will be barred from applying.

Applications for the first round of new gTLDs will be accepted by ICANN from January 12, 2012 through April 12, 2012, and ICANN’s application fee will be $185,000. ICANN will post the public portions of all gTLD applications within two weeks of April 12, 2012, and it will then undertake a review of each application to determine whether the proposed gTLD is appropriate for approval. ICANN will initially assess each proposed gTLD for similarity to existing gTLDs and to determine whether the proposed gTLD will present new Internet security or stability concerns. ICANN will also evaluate the applicant organization itself to ensure that it will be able to properly handle the technical, operational, and financial responsibilities required to maintain a domain name registry. The applicant review will also include a review of the applicant organization and its officers, directors, and controlling shareholders for any past criminal or otherwise unacceptable behavior, such as cybersquatting.

Results of ICANN’s initial evaluation of the proposed gTLDs will be made public by ICANN in November 2012.  Concurrently with ICANN’s initial evaluation, there will be a procedure for filing formal objections to issuance of a new gTLD based on the following four categories:  legal rights (including intellectual property rights), string confusion with another gTLD, limited public interest, or a community objection due to the nature of the proposed gTLD.  It is anticipated that new gTLDs that are approved by ICANN, and not opposed by interested parties, will likely begin to “go live” in the first quarter of 2013.  If ICANN adheres to the foregoing timeline, second-level domain names from the new gTLDs (such as tires.bike, locks.bike, etc.) will likely go on sale to the public in mid to late 2013.

Planning for New Domain Names

Many organizations have understandably adopted a “wait and see” approach to ICANN’s new gTLD program over the last few years given the ongoing evolution of the gTLD Applicant Guidebook and the concerns that have been previously raised by national governments and other stakeholders concerning the program.  However, it is now clear that the program is moving forward, and the Chairman of ICANN’s Board of Directors has stated that the new gTLD program “will usher in a new Internet age.”  To avoid missing the launch of the “new Internet age” and the associated practical and legal implications, organizations of all types and sizes should begin to evaluate the new gTLD program.  In particular, the following steps should be taken:

1. Consider whether to pursue a gTLD application. Given the significant business implications of pursuing or forgoing both generic and branded gTLDs, and the costs associated with pursuing a new gTLD, discussions should include high-level executives of the organization.

2. Prepare an advance strategy for monitoring and, if necessary, responding to third-party or competitor gTLD applications through public comments and/or formal objections.

3. Develop and/or revisit the organization’s strategy for Internet brand protection given the potentially astronomical number of second-level domain names that could be released when new gTLDs begin full operations in 2013. Existing Internet brand protection protocols may need to be revised to account for the rights protection mechanisms in the new gTLD program and to realistically budget for such efforts.

 

David E. Weslow is a partner in the Intellectual Property practice at Wiley Rein LLP in Washington, DC.  Mr. Weslow focuses his practice on litigation, prosecution and licensing of trademarks, copyrights and domain names.  A former software and web developer, he regularly handles cutting-edge issues involving law and technology.  Mr. Weslow can be reached at 202.719.7525 or dweslow@wileyrein.com.

*                       *                       *

This is a publication of Wiley Rein LLP providing general news about recent legal developments and should not be construed as providing legal advice or legal opinions. You should consult an attorney for any specific legal questions.

 


Expert Opinion

Kate Forscey: National Security and Global Success Depend Upon Prioritizing Telecom Funding

The Affordable Connectivity Program and the Rip-and-Replace program are both central funding needs for the industry.


The author of this Expert Opinion is Kate Forscey, contributing fellow for the Digital Progress Institute.

With the government now funded into the new year, it’s time for Congress to take another look at its broader priorities, especially when it comes to the race with China for dominance in next-generation technologies. Whether it’s AI or cloud computing or virtual reality, if the United States is to remain competitive, we need to make secure and effective communications a priority. This means finally connecting all Americans to high-speed broadband and ensuring that our connectivity cannot be undermined by foreign adversaries.

Two popular programs are central to this goal: the Affordable Connectivity Program and the Rip-and-Replace program. Both of these programs have tremendous bipartisan, bicameral support, but both have been underfunded and now risk dying on the vine. Congress has the opportunity to fully fund these programs if it has the will to do so.

Let’s break it down.

The Affordable Connectivity Program provides low-income American families and veterans with discounts on Internet service and connectivity equipment, including higher discounts for those living on Tribal lands. With affordable broadband, more Americans can get online and be a part of the digital economy.

The ACP has been wildly successful, connecting over 21 million households to essential broadband they could otherwise not afford. And it continues to garner widespread support, with the vast majority of voters (78%) calling for its extension, including 64% of Republicans, 70% of Independents, and 95% of Democrats.

Congress provided the ACP with $14.2 billion in 2021—but funding is now running low and is projected to be fully exhausted by spring 2024. Governors, lawmakers on both sides of the aisle, public interest groups, and Internet service providers are all raising the alarm about its imminent depletion. That’s why the Biden Administration in October called on Congress to replenish the program’s coffers with an additional $6 billion.

A good start, but not the whole story. Our foreign adversaries are well known for their espionage, and while a spy balloon might get the attention, a far more insidious problem lurks in our communications networks: equipment designed and produced by Chinese suppliers Huawei and ZTE. A bipartisan Congress passed the Secure and Trusted Communications Networks Act to eradicate national security threats such as these, but sufficient funding for the Rip-and-Replace program has never materialized.

Again, the Biden Administration has stepped up and identified a need for $3.1 billion to fully fund the program as a “key national security priority” in its emergency supplemental funding request. It’s a narrative we can all get on-board with: that broadband falls under the umbrella of national security as a whole. American consumers and institutions both benefit from American-built networks and increased protection at home. But communications providers can’t live up to these needs on their own.

As it stands, the responsibility to get affordable, secure connectivity programs across the finish line rests with Congress. Even with a consensus of support for these two programs, the devil is in the details of how to make the price tags palatable to enough policymakers on Capitol Hill. The key is ensuring that any changes preserve the widespread efficacy that has made these programs popular so far.

For example, Congress could cut the cost of the ACP by limiting the additional Tribal funding to rural Tribal lands. Any such change should be grounded in an evaluation of existing need in urban areas, but could be an opportunity to ensure funds are being directed to areas of greatest need. And Congress should consider indexing the ACP to inflation. The high inflation of recent years has wreaked havoc on the budgets of consumers—and inflation-proofing the program would ensure that broadband remains affordable for all Americans even should inflation come back.
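
Purely as an illustration of the indexing idea (not a policy proposal or an official formula), here is a minimal sketch of how an inflation-adjusted benefit could be computed. The $30 starting value reflects the standard monthly ACP benefit; the inflation rates used are hypothetical placeholders, not projections.

```python
# Illustrative sketch only: indexing a monthly broadband benefit to inflation.
# The $30 starting value reflects the standard ACP benefit; the inflation
# rates below are hypothetical placeholders, not forecasts.

def indexed_benefit(base_benefit: float, annual_inflation_rates: list[float]) -> list[float]:
    """Return the benefit level after each year, compounded by that year's inflation."""
    levels = []
    benefit = base_benefit
    for rate in annual_inflation_rates:
        benefit *= 1 + rate
        levels.append(round(benefit, 2))
    return levels

if __name__ == "__main__":
    # Assume three years of 3% inflation (hypothetical).
    print(indexed_benefit(30.00, [0.03, 0.03, 0.03]))  # [30.9, 31.83, 32.78]
```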

As for Rip-and-Replace, those of us urging more funding could accept safeguards to ensure the money is being used for its intended purpose – the kind of compromise needed to get such policies across the finish line.

These are just some ideas as we head into the final funding fight. Not everyone is going to be on the same page about what is and isn’t working best, but shared success starts by recognizing that we all have the same endgame. Congress must ensure that adequate funding for the ACP and the Rip-and-Replace program is included in any year-end spending package. We have an all-too-rare opportunity to win the race for high-tech dominance—we just need to provide the resources.

Kate Forscey is a contributing fellow for the Digital Progress Institute and principal and founder of KRF Strategies LLC. She has served as senior technology policy advisor for Congresswoman Anna G. Eshoo and policy counsel at Public Knowledge. This piece is exclusive to Broadband Breakfast.

Broadband Breakfast accepts commentary from informed observers of the broadband scene. Please send pieces to commentary@breakfast.media. The views expressed in Expert Opinion pieces do not necessarily reflect the views of Broadband Breakfast and Breakfast Media LLC.


Expert Opinion

Ryan Johnston: What Happens to BEAD Without the Affordable Connectivity Program?

We’d be building broadband to no one without the ACP. The ACP extends every BEAD dollar further.


The author of this Expert Opinion is Ryan Johnston, senior policy counsel at Next Century Cities.

Congress dedicated more than $42 billion to help states and companies build out broadband networks to all Americans. This program, called the Broadband Equity, Access, and Deployment Program, marked a crucial step towards bridging the digital divide in our nation. But this program will fail if Congress doesn’t renew the Affordable Connectivity Program that states are relying on to connect low-income Americans.

Bipartisan legislation from Congress made it clear that states needed to offer a low-cost broadband plan to residents to qualify for BEAD funding. For the uninitiated, the ACP is a $30-a-month subsidy that an eligible consumer can use towards any broadband plan a participating service provider offers.

In fact, many providers have started offering broadband plans at a $30 price point so the effective cost of broadband to the consumer is zero. Using ACP is an easy way for ISPs to meet the affordability requirement, a “short-hand” of sorts for them to offer affordable plans using an existing — and successful — model.

However, the ACP is expected to exhaust its funding in the first half of next year, leaving a potentially disastrous scenario for families who may have little savings or discretionary income. Ultimately allowing the ACP to end leaves a crucial question unanswered: what good are networks if people cannot afford to connect to them?

During a congressional oversight hearing in May, National Telecommunications and Information Administration Administrator Alan Davidson explained to Members of Congress that the BEAD program will be negatively impacted if continued funding for the ACP is not found. He emphasized that for low-income rural Americans, the ACP is the lifeline ensuring they can afford to access the internet. Without it, some providers may hesitate to deploy in rural areas over fear that the investment will not be sustainable. Subscribership concerns may prove to be a limiting factor on which rural areas are served.

The ACP extends every BEAD dollar further. A study conducted by Common Sense Media found that the ACP could reduce the BEAD subsidy needed to incentivize providers to build in rural areas by up to 25% per year. According to the study, ACP reduces the per-household subsidy required to incentivize ISP investment by $500. Simply put, ACP improves the economic case because it 1) effectively lowers the cost of service, 2) creates a customer base with less churn, and 3) makes subscribers easier to acquire because of the massive public and private investment in raising awareness for the program.
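
As a rough, illustrative check on those figures, the cited $500 per-household reduction is consistent with a baseline subsidy of roughly $2,000; that baseline is an inference from the numbers above, not a figure reported by the study. A minimal sketch:

```python
# Back-of-the-envelope sketch of the figures cited above. The ~$2,000
# baseline per-household subsidy is an assumption inferred from the
# article's numbers ($500 reduction, up to 25%), not a figure from the study.

baseline_subsidy = 2000   # assumed per-household BEAD subsidy without the ACP
acp_reduction = 500       # per-household reduction attributed to the ACP (cited above)

subsidy_with_acp = baseline_subsidy - acp_reduction
percent_reduction = acp_reduction / baseline_subsidy

print(f"Subsidy with ACP: ${subsidy_with_acp}")   # Subsidy with ACP: $1500
print(f"Reduction: {percent_reduction:.0%}")      # Reduction: 25%
```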

But if the ACP is allowed to end, the federal government could end up overspending on every broadband deployment made through BEAD. This ultimately means BEAD networks will fail to connect millions of Americans.

The ACP is more than a simple affordability program; for over 21 million households, it’s a gateway to our ever-increasing digital society. Without it, millions of Americans will be unable to see doctors, visit with family, shop, and engage with their communities online. At the same time, the ACP plays a significant role in future infrastructure deployment. Allowing the ACP to end all but ensures that millions will be disconnected and future funding dollars won’t go the distance to close the digital divide.

Ryan Johnston is senior policy counsel at Next Century Cities. He is responsible for NCC’s federal policy portfolio, building and maintaining relationships with Federal Communications Commission officials, members of Congress and staff, and public interest allies. Working with various federal agencies, Ryan submits filings on behalf of NCC members on technology and telecommunications-related issues that impact the digital divide, such as broadband data mapping, benchmark speeds, spectrum policy, content moderation, and privacy, among others. This piece is exclusive to Broadband Breakfast.

Broadband Breakfast accepts commentary from informed observers of the broadband scene. Please send pieces to commentary@breakfast.media. The views expressed in Expert Opinion pieces do not necessarily reflect the views of Broadband Breakfast and Breakfast Media LLC.


Artificial Intelligence

Will Rinehart: Unpacking the Executive Order on Artificial Intelligence

Most are underweighting the legal challenges and the problems for the rule of law.


The author of this Expert Opinion is Will Rinehart, senior research fellow at the Center for Growth and Opportunity.

If police are working on an investigation and want to tap your phone lines, they’ll effectively need to get a warrant. They will also need to get a warrant to search your home, your business, and your mail.

But if they want to access your email, all they need to do is wait 180 days.

Because of a 1986 law called the Electronic Communications Privacy Act, people using third-party email providers, like Gmail, only get 180 days of warrant protection. It’s an odd quirk of the law that only exists because no one in 1986 could imagine holding onto emails longer than 180 days. There simply wasn’t space for it back then!¹

ECPA is a stark illustration of a consistent phenomenon in government: policy choices, especially technical requirements, have durable and long-lasting effects. There are more mundane examples as well. GPS could be dramatically more accurate, but when the optical system was recently upgraded, it was held back by a technical requirement in the Federal Enterprise Architecture Framework (FEAF) of 1999. More accurate headlights have been shown to reduce night crashes, yet adaptive headlights were only approved in the United States last year, nearly 16 years after Europe, because of technical requirements in FMVSS 108. All it takes is one law or regulation to crystallize an idea into an enduring framework that fails to keep up with developments.

I fear the approach pushed by the White House in its recent executive order on AI might represent another crystallization moment. ChatGPT has been public for a year, the models on which it is based are only five years old, and yet the administration is already working to set the terms for regulation.

The “Executive Order on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence” is sprawling. It spans 13 sections, extends over 100 pages, and lays out nearly 100 deliverables for every major agency. While there are praiseworthy elements to the document, there is also a lot of cause for concern.

Among the biggest changes is the new authority the White House has claimed over newly designated “dual use foundation models.” As the EO defines it, a dual-use foundation model is

  • an AI model that is trained on broad data; generally uses self-supervision; contains at least tens of billions of parameters; is applicable across a wide range of contexts; and that exhibits, or could be easily modified to exhibit, high levels of performance at tasks that pose a serious risk to security, national economic security, national public health or safety, or any combination of those matters.

While the designation seems to be common sense, it is new and without provenance. Until last week, no one had talked about dual-use foundation models. The designation does, however, comport with the power the president has over the export of military tech.

As the EO explains it, the administration is especially interested in those models with the potential to

  • lower the barrier of entry for non-experts to design, synthesize, acquire, or use chemical, biological, radiological, or nuclear weapons;
  • enable powerful offensive cyber operations through automated vulnerability discovery and exploitation against a wide range of potential targets of cyber attacks; or
  • permit the evasion of human control or oversight through means of deception or obfuscation

The White House is justifying its regulation of these models under the Defense Production Act, a federal law first enacted in 1950 to respond to the Korean War. Modeled after World War II’s War Powers Acts, the DPA was part of a broad civil defense and war mobilization effort that gave the President the power to requisition materials and property, expand government and private defense production capacity, ration consumer goods, and fix wage and price ceilings, among other powers.

The DPA is reauthorized every five years, which has allowed Congress to expand the set of presidential powers in the DPA. Today, the allowable use of DPA extends far beyond U.S. military preparedness and includes domestic preparedness, response, and recovery from hazards, terrorist attacks, and other national emergencies. The DPA has long been intended to address market failures and slow procurement processes in times of crisis. Now the Biden Administration is using DPA to force companies to open up their AI models.

The administration’s invocation of the Defense Production Act is clearly a strategic maneuver to utilize the maximum extent of its DPA power in service of Biden’s AI policy agenda. The difficult part of this process now sits with the Department of Commerce, which has 90 days to issue regulations.

In turn, the Department will likely use the DPA’s industrial base assessment power to force companies to disclose various aspects of their AI models. Soon enough, companies behind dual-use foundation models will have to report test results to the government based on guidance developed by the National Institute of Standards and Technology (NIST). But that guidance won’t be available for another 270 days. In other words, Commerce will regulate companies without knowing what they will be beholden to.

Recent news from the United Kingdom suggests that all of the major players in AI are going to be included in the new regulation. In closing out a two-day summit on AI, British Prime Minister Rishi Sunak announced that eight companies were going to give deeper access to their models under an agreement signed by Australia, Canada, the European Union, France, Germany, Italy, Japan, Korea, Singapore, the U.S. and the U.K. Those eight companies were Amazon Web Services, Anthropic, Google (along with its subsidiary DeepMind), Inflection AI, Meta, Microsoft, Mistral AI, and OpenAI.

Thankfully, the administration isn’t pushing for a pause on AI development, denouncing more advanced models, or suggesting that AI needs to be licensed. But this is probably because doing so would face a tough legal challenge. Indeed, it seems little appreciated by the AI community that the demand to report on models is a kind of compelled speech, which has typically triggered First Amendment scrutiny. But the courts have occasionally recognized that compelled commercial speech may actually advance First Amendment interests more than undermine them.

The EO clearly marks a shift in AI regulation because of what will come next. In addition to the countless deliverables, the EO encourages agencies to use their full power to advance rulemaking.

For example, the EO explains that,

  • the Federal Trade Commission is encouraged to consider, as it deems appropriate, whether to exercise the Commission’s existing authorities, including its rulemaking authority under the Federal Trade Commission Act, 15 U.S.C. 41 et seq., to ensure fair competition in the AI marketplace and to ensure that consumers and workers are protected from harms that may be enabled by the use of AI.

Innocuous as it may seem, the Federal Trade Commission, as well as all of the other agencies that have been encouraged to use their power by the administration, could come under court scrutiny. In West Virginia v. EPA, the Supreme Court made it more difficult for agencies to expand their power when the court established the major questions doctrine. This new line of legal reasoning takes an ax to agency delegation. Unless there’s explicit, clear-cut authority granted by Congress, an agency cannot regulate a major economic or political issue. Agency efforts to push rules on AI could get caught up by the courts.

To be fair, there are a lot of positive actions that this EO advances.² But details matter, and it will take time for the critical details to emerge.

Meanwhile, we need to be attentive to the creep of power. As Adam Thierer described this catch-22,

  • While there is nothing wrong with federal agencies being encouraged through the EO to use NIST’s AI Risk Management Framework to help guide sensible AI governance standards, it is crucial to recall that the framework is voluntary and meant to be highly flexible and iterative—not an open-ended mandate for widespread algorithmic regulation. The Biden EO appears to empower agencies to gradually convert that voluntary guidance and other amorphous guidelines into a sort of back-door regulatory regime (a process made easier by the lack of congressional action on AI issues).

In all, the EO is a mixed bag that will take time to shake out. On this, my colleague Neil Chilson is right: some of it is good, some is bad, and some is downright ugly.

Still, the path we are currently navigating with the Executive Order on AI parallels similar paths in ECPA, GPS, and adaptive lights. It underscores a fundamental truth about legal decisions: even the technical rules we set today will shape the landscape for years, perhaps decades, to come. As we move forward, we must tread carefully, ensuring that our legal frameworks are adaptable and resilient, capable of evolving alongside the very technologies they seek to regulate.

Will Rinehart is a senior research fellow at the Center for Growth and Opportunity, where he specializes in telecommunication, internet and data policy, with a focus on emerging technologies and innovation. He was formerly the Director of Technology and Innovation Policy at the American Action Forum and before that a research fellow at TechFreedom and the director of operations at the International Center for Law & Economics. This piece originally appeared in the Exformation Newsletter on November 9, 2023, and is reprinted with permission.

Broadband Breakfast accepts commentary from informed observers of the broadband scene. Please send pieces to commentary@breakfast.media. The views expressed in Expert Opinion pieces do not necessarily reflect the views of Broadband Breakfast and Breakfast Media LLC.

