Children’s Online Safety Bills Criticized for Compliance Burden, Plus Speech and Privacy Risks
States are considering measures ranging from age verification to a “duty of care.”
Em McPhie
WASHINGTON, March 17, 2023 — As a growing number of states consider and implement their own laws aimed at protecting children's online safety, some experts are highlighting the practical implications of the resulting legislative "patchwork," as well as the risk that some proposals might actually harm consumers' digital privacy.
“States have realized that the federal government is going to be very slow in acting in this area,” said James Czerniawski, senior policy analyst at Americans for Prosperity. “So they’re going to try to take the lead here.”
Speaking at a Cato Institute forum on Wednesday, Czerniawski described the two competing approaches that have emerged among the various state laws and proposals.
The first is typified by California’s Age Appropriate Design Code Act, passed in August 2022, which requires that online platforms proactively prioritize the privacy of underage users by default and by design. Many aspects of the law are modeled after the United Kingdom’s Age Appropriate Design Code, a set of children’s privacy standards that took effect in 2021.
The second approach focuses on age verification, such as Utah legislation that will require social media companies to verify the age of Utah residents before allowing them to create or keep accounts.
In addition to those two core directions, many of the state proposals have their own unique twists, Czerniawski said. For example, the Utah legislation prohibits any design choice that “causes a minor to have an addiction to the company’s social media platform.” While the bill has not yet been signed, Gov. Spencer Cox has previously indicated his intent to do so.
For online platforms that operate nationally or internationally, complying with a growing range of disparate state privacy laws will only become more complicated, Czerniawski said. “This patchwork doesn’t work.”
Potential unintended consequences for free speech, competition and privacy
Some experts have raised concerns that legislation intended to protect children online could have unintended consequences for the privacy and speech rights of adult users.
Matthew Feeney, head of technology and innovation at the Centre for Policy Studies, argued that a heavy compliance burden could incentivize online platforms to over-moderate content. “Given the punitive fines attached to the Online Safety Bill, I think they will engage in an abundance of caution and remove a lot of legal and valuable speech.”
The task of determining which users are underage and then figuring out how to prevent them from seeing any harmful content presents a significant challenge for platforms that host a massive amount of user-generated content, Feeney said.
“Something that’s very crucial to understand is that if you require firms to treat children differently, then you’re asking them to find out which of their users are children — and that is not free; that is a cost,” he added. “And for many firms, I think it will just be cheaper to err on the side of caution and assume all users are children.”
In addition to the implications for online speech, Feeney expressed concern that the regulatory burden adds a “very worrying anti-competitive element” to the legislation. “Most of the companies that will be in scope do not have the army of lawyers and engineers that Meta and Google have,” he said.
While the age verification measures might impose a lighter compliance burden, Feeney said, they could ironically create their own risk to children’s online privacy by mandating the collection of highly identifying data.
Czerniawski agreed, specifically pointing to TikTok. “From a privacy standpoint, it seems a little odd that we want to have a company that currently has some security concerns collecting more information on kids in order to continue operating in the country,” he said.
While agreeing that there may be legitimate concerns about TikTok’s privacy practices, Czerniawski argued that many of the proposed solutions — such as a complete national ban — fail to address the actual problem.
“If you’re truly concerned about the privacy issues that TikTok has raised, that’s why… we need a federal data privacy law passed, right? I think that that can go a long way towards solving a lot of those issues,” he said.
In terms of child-specific legislation, Czerniawski called for a more narrowly targeted approach to address problems such as the proliferation of online child sexual abuse material without risking the privacy and free speech rights of all other internet users. “We have to be very serious when we’re looking at trade-offs that are involved here,” he said.