Privacy

Panelists Call Federal Privacy Legislation Necessary and Express Optimism Toward That Goal

Photo of privacy panel at State of the Net conference by David Jelke

WASHINGTON, January 29, 2020 – Federal privacy legislation is necessary to ensure standardization of legal treatment, technology experts said Tuesday at the 2020 State of the Net conference.

On a panel on “Privacy and Preemption” at the annual tech policy gathering, speakers acknowledged the role of the California Consumer Privacy Act in spurring discussion and action. But the CCPA, they said, fails to provide rules that are clear to consumers about their rights and clear to businesses about their obligations.

Jessica Rich, former director of the Federal Trade Commission’s Bureau of Consumer Protection, highlighted the irony that a visitor traveling from Nevada to California would cross the border into a vastly different set of rights. This created a “bad consumer experience,” Rich said.

Chris Calabrese, vice president of the Center for Democracy and Technology, praised Congress’ handling of privacy legislation thus far, citing the seriousness with which representatives have approached the issue in the wake of serious national data breaches.

The panel also addressed where small businesses fit into the tug-of-war of data privacy. Rich said many small businesses can come into contact with sensitive user data and should not be given a free pass in legislation.

“Lest we forget, Cambridge Analytica was a small company,” cracked Jason Albert, director of public policy at Workday.

The panel also defended the FTC against the criticism that it is unfit to enforce data privacy legislation. Albert argued that the FTC has an internationally respected reputation that would be very difficult for a new agency to imitate.

Rich pointed out that the FTC is under-resourced, with only about 50 employees assigned to privacy issues for the whole country. Calabrese added that the Information Commissioner’s Office, Britain’s FTC equivalent, has 500 employees for a nation with a quarter of the U.S. population.

Calabrese also expressed a desire for companies to be able to check in with the FTC about gray-area privacy practices, so as to avoid accidental violations of privacy law.

The moderator capped off the session by asking for the panelists’ predictions as to the future of privacy law. The panel was cautiously optimistic about the future of privacy and believed that the country still had time to sort through the implications that information technologies pose.

Privacy

Metaverse Technologies Could Present Unprecedented Risk to Children’s Digital Privacy

Existing digital privacy concerns are amplified in an environment designed to fully immerse users.

Photo by Alvin Trusty used with permission

WASHINGTON, January 18, 2023 — As immersive virtual reality technologies gain popularity among children and teenagers, there is an increasing need for legislation that specifically addresses the industry’s unprecedented capacity for data collection, said attorneys at a Practising Law Institute webinar on Friday.

Without downplaying the potential benefits of “metaverse” technology, it is important to understand how it differs from the current internet and how that will impact children, said Leeza Garber, a cybersecurity and privacy attorney.

“When you’re talking about being able to feel something with the haptic gloves, which are in advanced states of development, or even an entirely haptic suit, you’re talking about the potential for cyberbullying, harassment, assault to happen to minors in a completely different playing field — where right now there’s not so much proactive legislation,” Garber said.

Although the metaverse is often framed as a thing of the future, it actually just entails “an immersive, visual, virtual experience,” said Gail Gottehrer, founder and chairperson of the New York State Bar Association’s cybersecurity subcommittee.

Defined as such, the metaverse has already gained widespread popularity. “The next generation of children will spend approximately 10 years in virtual reality in their lives. So that’s the equivalent of around two hours and 45 minutes per day,” Gottehrer said, citing research from the Institution of Engineering and Technology.

The user base of one such platform, Roblox, “includes 50 percent of all kids under the age of 16 in the United States — so it’s huge for minors,” Garber said.

For a generation that has grown up with social media integrated into everyday life, the “interplay of personal data with gaining the benefit of using this type of platform is just simply accepted,” Garber added. “We have to be more proactive in a space where this new iteration of the internet will have the capacity to take in so much more data.”

‘Staggering’ amount of data collected in the metaverse

The data collected by metaverse technology is “staggering,” Gottehrer said. Virtual reality equipment can track eye and head movements, heart rates, muscle tension, brain activity and gait patterns. After just a few minutes of use, the “motion signature” created by this data can be used to identify people with 95 percent accuracy.

This data can also identify neurodiversity and some forms of disability that affect movement, such as Parkinson’s disease.

“If you’re a child and this data is already being collected on you, where might that down the road follow you in your life?” Gottehrer asked.

Only a handful of states have specific regulations for the collection of biometric data, but Garber predicted that more states will likely pass similar legislation, albeit “at a glacial pace.”

However, many experts worry that it will not be fast enough, particularly when it comes to protecting children’s digital privacy. “While we know technology moves at a pace that’s much faster than courts or litigation, there’s really a concern that [the Children’s Online Privacy Protection Act] is dragging behind,” Gottehrer said.

Compounding these concerns is the confusion over who should be setting these regulations in the first place. In September, as privacy legislation stalled in Congress, Sens. Ed Markey, D-Mass., and Richard Blumenthal, D-Conn., wrote a letter urging the Federal Trade Commission to use its regulatory authority to update COPPA.

The letter “does not send a great message,” Garber said. And without decisive government action, tech companies currently hold great power to set the standards and practices that will shape the industry’s regulatory development in the future.

“Self-regulation by metaverse stakeholders — is that viable? Is that advantageous?” Gottehrer asked. “I think it’s safe to say we have not seen tremendous success at self-regulation of the current version of the internet — that might be a dramatic understatement.”

For an example of how companies might fail to proactively protect underage users, Gottehrer pointed to Instagram. According to internal documents reported by the Wall Street Journal in September 2021, Facebook’s own research had shown for some time that Instagram was harmful to the mental health of teenage users, yet the company continued to develop products for an even younger audience.

“All of these issues become amplified in an environment where you’re supposed to be completely immersed,” Garber said.

Privacy

Businesses Should Prepare for More State-Specific Privacy Laws, Attorneys Say

“The privacy landscape in the U.S. is likely to become more complicated before it gets any easier.”

Photos of Joan Stewart, Kathleen Scott and Duane Pozza courtesy of Wiley

WASHINGTON, January 13, 2023 — In the absence of overarching federal legislation, several states are passing or considering their own privacy laws, creating an increasingly disparate legal landscape that may be difficult for national companies to navigate.

“I think the privacy landscape in the U.S. is likely to become more complicated before it gets any easier,” said Joan Stewart, an attorney specializing in privacy, data governance and regulatory compliance, at a webcast hosted by Wiley on Thursday.

New privacy laws in California and Virginia took effect on Jan. 1, and Colorado and Connecticut have privacy laws set to become effective in July. Utah’s privacy law will go into effect at the end of December.

 “We expect to see additional states actively considering both omnibus and targeted privacy laws this year,” Stewart said. “So we encourage businesses to focus now on creating universal privacy programs that can adapt to these new laws in the future.”

Although the various state laws have plenty of overlap, there are also several significant outliers, said Kathleen Scott, a privacy and cybersecurity attorney.

States take different approaches to privacy requirements

For example, the new California Privacy Rights Act — which amends and strengthens California’s existing digital privacy law, already considered the strongest in the country — requires that businesses use specific words to describe the categories of personally identifying information being collected.

“These words are unique to California; they come from the statute, and they don’t always make perfect sense outside of that context,” Scott said.

Another area of difference is the consumer’s right to appeal privacy-related decisions. Virginia, Colorado and Connecticut require businesses to offer a process through which they explain to consumers why a specific request was denied.

While implementing a universal standard would make compliance easier for businesses, Scott noted that “processing appeals can be pretty resource intensive, so there may be important reasons not to extend those outlier requirements more broadly to other states.”

Generally speaking, the state privacy laws apply to for-profit businesses and make an exception for nonprofits. However, Colorado’s law applies to for-profit and nonprofit entities that meet certain thresholds, and the Virginia and Connecticut laws carve out select nonprofits as exempt instead of having a blanket exemption.

Other state-to-state differences include specific notices, link requirements and opt-in versus opt-out policies. Even key definitions, such as what qualifies as “sensitive data,” vary from state to state.

Two of the state privacy laws taking effect in 2023 authorize the development of new rules, making it likely that additional requirements are on the horizon.

California will not begin civil and administrative enforcement of the CPRA until July. In the meantime, the state’s new privacy agency is charged with developing rules for its implementation, including specific directives for required notices, automated decision-making and other issues.

“The California rulemaking has been particularly complicated… and the outcome is going to have significant impacts on business practices,” said Duane Pozza, an attorney specializing in privacy, emerging technology and financial practices.

California’s attorney general has argued that existing rules require businesses to honor a global opt-out mechanism, even though the new law makes such a mechanism optional, Pozza explained. The rules currently proposed under the CPRA would again require a global opt-out.

Colorado’s attorney general is undertaking a similar rulemaking process, revising a previously released draft of the rules in preparation for a February hearing.

Several additional states are expected to propose broad or targeted privacy laws during the coming legislative cycle, according to data published Thursday by the Computer and Communications Industry Association. In addition to comprehensive consumer data privacy legislation, several measures address the collection of biometric information and children’s online safety, the CCIA found.

Privacy

CES 2023: Federal Privacy Standard Needed for Consumer Protection

Federal regulation is needed since companies often renege on voluntarily accepted agreements, argued Public Knowledge head.

Photo of Public Knowledge President and CEO Chris Lewis

LAS VEGAS, January 7, 2023 – Despite certain self-imposed industry standards, a federal privacy law is necessary to ensure consistent compliance and protect consumer rights, said Chris Lewis, president and CEO of advocacy group Public Knowledge, at the Consumer Electronics Show Saturday afternoon.

Many experts and policymakers have called for a comprehensive national privacy law, and such a bill gained bipartisan support in 2022, but it stalled and Congress has yet to act. Lewis argued that federal regulation is needed because companies often renege on voluntarily accepted privacy standards. “In the era of big data, the harms that come with [violations] are just exacerbated because of how much data is out there, both the harms to consumers and users [and] the harms to competition,” Lewis said.

Later in the panel, Lewis argued that case-by-case enforcement, as attempted by the current Federal Trade Commission, cannot keep pace with innovation. Privacy and data-protection rules would force innovators to consider those issues earlier in the product-design process, he said.

“You go up and you talk to [exhibitors at CES] and ask them things [that] get at consumer-protection harms that could come from their technology or could come from the next iteration of their technology…you can tell that those harms are not front of mind, so we want to see them become more front of mind,” Lewis said.

The absence of federal action has led several states, including California, to pass their own privacy codes. Panelists argued that this “patchwork” of inconsistent state regulations creates uncertainty for consumers and the business community.
