Experts Debate Artificial Intelligence Licensing Legislation

Licensing requirements would distract from wide-scale testing and would limit competition, panelists said at an event Monday.

Photo of B Cavello of Aspen Institute, Austin Carson of SeedAI, Aalok Mehta of OpenAI

WASHINGTON, May 23, 2023 – Experts on artificial intelligence disagree on whether licensing is the right regulatory approach for the technology.

If adopted, licensing rules would require companies to obtain a federal license before developing AI technology. Last week, OpenAI CEO Sam Altman testified that Congress should consider a series of licensing and testing requirements for AI models above a certain threshold of capability.

At a Public Knowledge event Monday, Aalok Mehta, head of US Public Policy at OpenAI, added that licensing is a means of ensuring that AI developers put safety practices in place. By establishing licensing rules, we are developing external validation tools that will improve the consumer experience, he said.

Generative AI — the type of model behind chatbots including OpenAI’s widely popular ChatGPT and Google’s Bard — is AI designed to produce content rather than simply process information, which could have widespread effects on copyright disputes and disinformation, experts have said. Many industry experts have called for more federal AI regulation, arguing that widespread AI applications could lead to broad societal risks, including an uptick in online disinformation, technological displacement, algorithmic discrimination, and other harms.

Some industry leaders, however, are concerned that calls for licensing are a way for large companies like OpenAI and Google to shut the door on competition and new startups.

B Cavello, director of emerging technologies at the Aspen Institute, said Monday that licensing requirements would place burdens on competition, particularly on small start-ups.

Implementing licensing requirements can create a threshold that defines which players are allowed to operate in the AI space and which are not, B said, and can make it more difficult for smaller players to gain traction in the competitive space.

Already, the resources required to support these systems create a barrier that can be really tough to break through, B continued. While there should be mandates for greater testing and transparency, licensing can also present unique challenges that we should seek to avoid, B said.

Austin Carson, founder and president of SeedAI, said a licensing model would not get to the heart of the issue, which is ensuring that AI developers consciously test and measure their own models.

The most important thing is to support the development of an ecosystem that revolves around assurance and testing, said Carson. Although no mechanisms for wide-scale testing currently exist, such testing will be critical to supporting this technology, he said.

Base-level testing at this scale will require that all parties participate, Carson emphasized. We need all parties to feel a sense of accountability for the systems they host, he said.

In her testimony last week, Christina Montgomery, AI ethics board chair at IBM, urged Congress to adopt a precision regulation approach that would govern AI in specific use cases rather than regulating the technology itself.