Trevor Wagener: State Regulation of Content Moderation Would Create Enormous Legal Costs for Platforms
The author of this Expert Opinion is Trevor Wagener, director of research and economics at CCIA.

Dozens of bills regulating internet content moderation have been proposed in at least 30 state legislatures, and one in Utah currently sits on the governor’s desk awaiting his veto or signature. If enacted, many of the bills would impose prescriptive state regulations governing internet content moderation practices, with requirements differing significantly from state to state.

Many of the bills would also create private rights of action or state enforcement action powers. This could lead to enormous legal costs for platforms. If several of these bills are enacted into law, the resulting patchwork of regulatory requirements and legal risks could create daunting compliance challenges, astronomical costs, and unpredictable legal liability, which only the largest of platforms might be able to manage.

At present, each American internet platform can choose among a variety of approaches to moderating user-generated content, depending on the nature of the platform and the desired user experience. A common approach is to use automated content moderation to triage specific categories of prohibited content that can be identified easily; for some categories, shared databases of known prohibited content already exist. Human moderators then judge the small fraction of content that is hard to categorize.

Automation is the primary tool in content moderation because of efficiency

Using automation as the primary tool in content moderation is common because of cost-efficiency. For example, Microsoft and Alibaba both offer third-party automated content moderation services for about $0.40 per 1,000 images or posts at scale, which implies an automated review cost of about $0.0004 per image or post. When a human moderator must review content, by contrast, review takes an average of 150 seconds per post, even on the most efficient platforms.

This work requires skill and judgment: Moderators often must quickly make complex decisions with limited context. Even assuming services could retain personnel to do this challenging work at $15/hour, the cost would be at least $0.625 per reviewed post, or 1,500 times the cost of automated moderation.
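A minimal sketch of that arithmetic, treating the API pricing, wage, and review time above as illustrative assumptions rather than measured platform data:

```python
# Back-of-the-envelope comparison of automated vs. human review costs,
# using the figures cited in the text as assumptions.

AUTOMATED_COST_PER_1000 = 0.40   # USD per 1,000 images/posts (third-party API pricing at scale)
HUMAN_WAGE_PER_HOUR = 15.00      # USD, assumed moderator wage
HUMAN_SECONDS_PER_POST = 150     # average review time on the most efficient platforms

automated_cost = AUTOMATED_COST_PER_1000 / 1000                  # ~ $0.0004 per post
human_cost = HUMAN_WAGE_PER_HOUR * (HUMAN_SECONDS_PER_POST / 3600)  # ~ $0.625 per post

print(f"Automated review: ${automated_cost:.4f} per post")
print(f"Human review:     ${human_cost:.4f} per post")
print(f"Ratio: ~{human_cost / automated_cost:,.0f}x")            # roughly the 1,500x cited above
```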

Some state bills, including Florida’s H.B. 7013 and Utah’s S.B. 228, require platforms to send a written notice to the user and/or the state government every time a post is moderated. This could require a human moderator for most or all user-generated content in those states, increasing moderation costs by a factor of 1,500. In cases where moderation would affect users from multiple states, the process could require two human moderators’ time, duplicating costs.

Florida’s moderation bill also mandates that users be able to request detailed information about the moderation of their posts, while Utah’s mandates that users be allowed to appeal moderation decisions, have appeals reviewed by a human moderator, and even have that moderator explain their decision to the user.

State bills would dramatically increase the cost of content moderation

Depending on the post, the requests for information, the appeal process, and the specific requirements of each bill, that could take anywhere from 5 minutes to half an hour of a human moderator’s time for each instance. At $15/hour, that would cost between $1.25 and $7.50 on top of the $0.625 for initial review. For each such request, the process alone could cost 3,000 to 18,000 times as much as the present automated baseline.
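A brief sketch of that per-instance range, assuming the same $15/hour wage and the 5-to-30-minute handling times above:

```python
# Rough per-instance cost of a mandated information request or appeal,
# compared with the $0.0004 automated baseline. All inputs are the
# article's illustrative assumptions.

wage = 15.00
initial_human_review = 0.625          # human review of the original moderation decision
automated_baseline = 0.0004

for minutes in (5, 30):
    appeal_cost = wage * minutes / 60
    total = initial_human_review + appeal_cost
    multiple = appeal_cost / automated_baseline
    print(f"{minutes:>2} min appeal: ${appeal_cost:.2f} extra "
          f"(${total:.3f} total), ~{multiple:,.0f}x the automated baseline")
# 5 minutes  -> $1.25 extra, roughly 3,000x
# 30 minutes -> $7.50 extra, roughly 18,000x
```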

Not every moderated post would be appealed. However, with current appeal rates of one in 40 to one in 10, the costs would add up. With content moderation rates ranging from one per 20 users annually on some platforms to almost twice per user annually on others, we can establish a range of incremental compliance costs for a hypothetical startup with 20 million users.

These compliance costs multiply with the number of distinct state content moderation bills enacted. If users from multiple states engage with a post, that could require duplicative reviews of the same content based on distinct state requirements. If only two out of fifty states enact such bills, duplicated reviews might be required for 4 percent of moderated posts; if thirty out of fifty states enact such bills, duplicated reviews might be required for 60 percent of moderated posts.
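The duplicated-review shares above appear to scale with the fraction of states that enact such bills; the short sketch below simply makes that assumption explicit:

```python
# Assumption of this sketch: the share of moderated posts touching users in
# multiple covered states tracks the fraction of states that enact such bills.

for states_enacting in (2, 30):
    duplicate_share = states_enacting / 50
    print(f"{states_enacting} of 50 states -> ~{duplicate_share:.0%} of moderated posts duplicated")
# 2 of 50  -> ~4%
# 30 of 50 -> ~60%
```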

In the low case, where one in 20 users has a post moderated, one in 40 moderations are appealed, each appeal takes about 5 minutes, and only 4 percent of moderations involve duplicated review of content from users from multiple covered states, that adds up to about $0.68 million in aggregate annually, or $0.034 per user annually.

In the high case, where two posts per user are moderated, one in 10 moderations are appealed, each appeal takes about half an hour, and 60 percent of moderations involve duplicated review of content from users from multiple covered states, that adds up to $70 million in aggregate annually, or $3.50 per user annually.
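Putting these assumptions together, a simple model reproduces the low- and high-case totals. The function below and its parameter values are an illustrative reconstruction of the scenarios described above, not an official estimate:

```python
# Sketch of the aggregate compliance estimates for a hypothetical
# 20-million-user startup, under the article's stated assumptions.

def annual_compliance_cost(users, moderations_per_user, appeal_rate,
                           appeal_minutes, duplicate_share, wage=15.00):
    review_cost = wage * 150 / 3600              # ~$0.625 per human-reviewed moderation
    appeal_cost = wage * appeal_minutes / 60
    moderations = users * moderations_per_user
    total = (moderations * review_cost                       # mandated human review
             + moderations * appeal_rate * appeal_cost       # appeals / information requests
             + moderations * duplicate_share * review_cost)  # duplicated multi-state reviews
    return total, total / users

low = annual_compliance_cost(20_000_000, 1 / 20, 1 / 40, 5, 0.04)
high = annual_compliance_cost(20_000_000, 2, 1 / 10, 30, 0.60)
print(f"Low case:  ${low[0]:,.0f} total, ${low[1]:.3f} per user")    # ~$0.68 million, ~$0.034
print(f"High case: ${high[0]:,.0f} total, ${high[1]:.2f} per user")  # ~$70 million, ~$3.50
```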

Even in the best case, a ‘low-cost’ compliance total could be ruinous

For a startup, even the low-case total compliance cost of about $0.68 million annually could be ruinous. The high-case total cost of $3.50 per user annually would be an enormous financial hit for platforms of all sizes, since leading platforms only have global average revenue per user of about $10 annually. Therefore, a patchwork of state content moderation bills could unwittingly create a significant barrier to entry, discouraging tech startups from operating in certain states. This would reduce competition, inhibit job creation and curtail the range of online services available in those states.

In addition, some state content moderation bills, such as Oklahoma’s S.B. 383, Iowa’s S.F. 580, and North Dakota’s H.B. 1144, create legal risks for online platforms. Purported failures to follow state-specific content moderation rules could subject platforms to government enforcement actions, private lawsuits, or both. Even in clear-cut cases with all facts in the platform’s favor, litigation could cost $80,000 through a Motion to Dismiss or $150,000 through a Motion for Summary Judgment, in addition to possible statutory and/or punitive damages. This would be on top of compliance costs.

It is difficult to forecast legal costs under a patchwork of distinct state regulations, as they scale with the number of affected users and the number of states with such bills enacted. Some of the bills would grant rights of action to the user posting content, while others would grant rights of action to users prevented from seeing content. Some would grant private rights of action to both. Many create enforcement powers for state governments in addition to private rights of action. In principle, the moderation of a single post with comments from residents of multiple states could result in a number of enforcement actions and private lawsuits greater than the total number of affected users times the number of states they reside in.

Spiraling legal costs associated with even a single post by a user in a covered state

The moderation of a single hypothetical post by a user in one covered state with comments by users in 29 other covered states would result in 30 affected users in 30 jurisdictions. This could mean more than 900 lawsuits, each plausibly costing the platform $80,000 for a Motion to Dismiss. If each were litigated, that would cost the platform at least $72 million for a single content moderation decision even if one assumes the success of every Motion to Dismiss. These legal liabilities would be capable of bankrupting a startup or midsize platform. If even one in a thousand moderated posts were fully litigated, platforms would have to account for this enormous tail risk as a recurring phenomenon, as many platforms moderate thousands of posts per day.
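The arithmetic behind that tail-risk figure is straightforward; the sketch below uses the 30-user, 30-state scenario and the $80,000 Motion to Dismiss cost as given:

```python
# Illustrative tail-risk arithmetic for the single-post scenario above:
# 30 affected users across 30 covered states, each potentially able to sue
# in each state, at a minimum defense cost of $80,000 per Motion to Dismiss.

affected_users = 30
covered_states = 30
cost_per_motion_to_dismiss = 80_000

lawsuits = affected_users * covered_states            # 900 potential suits
defense_cost = lawsuits * cost_per_motion_to_dismiss  # $72,000,000

print(f"Potential lawsuits: {lawsuits}")
print(f"Minimum defense cost, even if every motion succeeds: ${defense_cost:,}")
```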

Startups with limited initial capital are reluctant to operate in jurisdictions with higher legal risk. They may not only avoid locating offices and jobs in high-risk states but also bar state residents from using their platforms altogether. Platforms that continue operating or offering services in such states may dramatically curtail content moderation to manage risk, resulting in a dramatic increase in financial scams, lewd content, and all-around abusive and unpleasant content. This would likely reduce use of such platforms, decreasing both consumer welfare and platform revenue.

In addition to deleterious impacts on platforms, users, and competition, a patchwork of state content moderation bills would burden taxpayers with administrative costs, enforcement costs, and possible litigation damages.

Administrative costs for most state content moderation bills would likely be in excess of Utah’s estimate of $90,000 annually for each such state. In addition, enforcement costs would likely surpass Iowa’s estimate of $700,000 annually for each such state. Given constitutional concerns around such bills and potential legal challenges to governments that force private businesses to host content to which they object, each such state could easily find itself paying damages and fees.

Trevor Wagener is the director of research and economics at the Computer & Communications Industry Association. Wagener previously served as deputy chief economist of the U.S. Department of State. This piece is exclusive to Broadband Breakfast.

Broadband Breakfast accepts commentary from informed observers of the broadband scene. Please send pieces to commentary@breakfast.media. The views expressed in Expert Opinion pieces do not necessarily reflect the views of Broadband Breakfast and Breakfast Media LLC.
