Privacy

Consumer Privacy Must Rise To Priority In Biden Agenda, Experts Urge

FCBA panelists discuss data privacy and consumer protection challenges for the Biden administration.

Photo of Dona Fraser of Better Business Bureau

April 27, 2021 – Consumer data privacy needs to be a high priority for the Biden administration, according to panelists at a Monday event hosted by the Federal Communications Bar Association.

From gaming apps to social media to telehealth, consumer data is an essential component of the digital age and a core business model for tech companies. FCBA panelists discussed how to protect consumer privacy in online spaces.

With several states passing or discussing data privacy legislation, including California, Virginia and Washington, the pressure is mounting for the federal government to take action on this issue.

There is great anticipation for the Biden administration to push for federal privacy legislation, said Dona Fraser, senior vice president of privacy initiatives at the Better Business Bureau. One of the current challenges is that states are passing their own privacy laws, and tech companies need to solve the compliance issue across state lines, she said.

Melissa Maalouf, counsel at tech law firm Zwillgen, expressed a similar sentiment. Companies need to navigate the patchwork of data laws in the U.S., she said. Taking all levels of government into consideration, including federal, state, and municipal, there are more than 300 laws on the books right now related to data privacy, and they are all different, she said.

Social media and Section 230

Social media companies and internet liability provision Section 230 are in the spotlight right now, and changes to content moderation, competition among the tech platforms, and transparency in their algorithms are all issues that need to be dealt with, said Chris Lewis, CEO of Public Knowledge. Good policy lies in the details, but right now government handling of tech policy is becoming a race to the bottom, he said.

There’s very little government authority on broadband unless something changes, and when it comes to tech platforms and Section 230, there is no authority and no accountability, Lewis said. Artificial intelligence and algorithms are not magic, he said — they’re built with math and computers and the companies need to be held accountable for them.

Consumers use apps of all types; many are games or other entertainment, but some serve essential functions, such as medical or telehealth services, especially during a pandemic, said Brian Scarpelli of ACT, the App Association. Policy needs to be carefully drafted to ensure the protection of those important apps, based on evidence rather than partisan politics or hyperbole, he said.

Agency work on consumer privacy 

While legislation on data privacy is being considered, federal agencies are tackling consumer privacy under current law as well, such as the Federal Trade Commission’s work on COVID-19 scams.

The FTC is handling discriminatory actions in advertising and algorithms that use data for deceptive practices, said Frank Gorman, acting deputy director of the agency’s Bureau of Consumer Protection. Its cases also include other fraud, such as fake stimulus checks or medical equipment scams, he said.

At the Federal Communications Commission, robocalling is still the number one complaint the agency receives, said Diane Holland, legal advisor for Commissioner Geoffrey Starks’ office.

To address that problem and help protect consumers from scams, they have to use every tool in the toolbox, she said.

Reporter Tim White studied communication and political science at the University of Utah, and previously worked on Capitol Hill for a member of Congress. A native of Salt Lake City, he escapes to the Pacific Northwest as often as he can. He is passionate about politics, Star Wars, and breakfast cereal.

Industry Leaders Urge Biden to Tread Lightly With Federal Privacy Legislation

While generally supportive of a federal privacy law, some experts warned it could harm competition.

Screenshot of Jules Polonetsky, executive director of the Future of Privacy Forum

WASHINGTON, January 27, 2023 — While there is a need for federal privacy regulation, such legislation could harm innovation, research and competition if not carefully designed, according to leaders of top technology trade associations and think tanks.

“In my opinion, there’s a lot of overblown statements about how bad the problems are,” said Rob Atkinson, president of the Information Technology and Innovation Foundation. “And a lot of the regulatory solutions… would actually, in our view, do more harm than good.”

Speaking at a panel hosted by the ITIF on Wednesday, industry experts responded to President Joe Biden’s recent op-ed calling on Congress to unite against Big Tech companies in three areas of concern: privacy, content moderation and competition.

“It’s gotten to this amazing point where we have a lot of the world’s most dominant tech companies, and to see a headline about uniting against Big Tech by the President of the United States, it hurts me greatly,” said Gary Shapiro, president and CEO of the Consumer Technology Association.

One of Biden’s policy recommendations was to place limitations on targeted advertising and ban the practice altogether for children. But Atkinson disagreed with the implication that targeted advertising is an inherent violation of privacy.

“Targeted advertising is almost always done in an anonymous way where the advertiser doesn’t know my name — they just know that I like to ride my bicycle to work, and so they send me a bicycle ad,” he said.

Despite concerns about the extent of certain proposals, the tech leaders were generally supportive of federal privacy legislation. “It’s incredible that the U.S. is one of the very few democracies in the entire world that does not have a comprehensive privacy law,” said Jules Polonetsky, executive director of the Future of Privacy Forum.

A proposed federal privacy law gained strong bipartisan support toward the end of 2022, but disagreements over two key issues have threatened its progress, explained TechNet CEO Linda Moore.

The first is a private right of action, which Moore said must be addressed to “make sure we protect those small and medium-sized companies from frivolous lawsuits that will drive them out of business.”

The second disagreement is over whether the federal law should preempt state privacy laws.

Steve DelBianco, executive director of NetChoice, emphasized the need for a “national standard to replace an impossible patchwork of state laws.”

“If the President is serious about data privacy, he should support the preemption of state laws and the presumption of private right of action,” DelBianco said.

Strict European privacy regulation can provide both guidance and caution

Polonetsky pointed to the European Union’s General Data Protection Regulation, often considered to be the world’s strongest privacy law, as a potential guide for U.S. legislation.

“For years, I would have said no — let’s show the American way to do it,” he said. “But now that we’ve tried to do the American way and made a bit of a jumble in many of our proposals, I have new appreciation for the thought that went into GDPR.”

But in addition to appreciating what the GDPR got right, policymakers should also look carefully at the ways in which it may have overly restricted research, Polonetsky added.

Atkinson emphasized that caution, saying that the GDPR’s “important and useful components” did not outweigh the negative impacts of its research limitations.

The implementation of GDPR also had negative implications for competition, Moore said. “The largest companies gained market share, because the smaller and medium sized companies just couldn’t adapt to the new privacy regime and all the hurdles that are put in place.”

Pursuing the strictest possible privacy standard could result in barring all data sharing and interoperability — things that are “necessary for competition,” Polonetsky claimed. “We need an understanding of the balance and the conflicts which do exist between competition and privacy.”

Privacy law is also closely intertwined with Section 230, Polonetsky said. “It’s clear that any significant change to 230 means companies doing an awful lot of surveillance to track things that are said and posted and done.”

Metaverse Technologies Could Present Unprecedented Risk to Children’s Digital Privacy

Existing digital privacy concerns are amplified in an environment designed to fully immerse users.

Photo by Alvin Trusty used with permission

WASHINGTON, January 18, 2023 — As immersive virtual reality technologies gain popularity among children and teenagers, there is an increasing need for legislation that specifically addresses the industry’s unprecedented capacity for data collection, said attorneys at a Practicing Law Institute webinar on Friday.

Without downplaying the potential benefits of “metaverse” technology, it is important to understand how it differs from the current internet and how that will impact children, said Leeza Garber, a cybersecurity and privacy attorney.

“When you’re talking about being able to feel something with the haptic gloves, which are in advanced states of development, or even an entirely haptic suit, you’re talking about the potential for cyberbullying, harassment, assault to happen to minors in a completely different playing field — where right now there’s not so much proactive legislation,” Garber said.

Although the metaverse is often framed as a thing of the future, it actually just entails “an immersive, visual, virtual experience,” said Gail Gottehrer, founder and chairperson of the New York State Bar Association’s cybersecurity subcommittee.

Defined as such, the metaverse has already gained widespread popularity. “The next generation of children will spend approximately 10 years in virtual reality in their lives. So that’s the equivalent of around two hours and 45 minutes per day,” Gottehrer said, citing research from the Institution of Engineering and Technology.

The user base of one such platform, Roblox, “includes 50 percent of all kids under the age of 16 in the United States — so it’s huge for minors,” Garber said.

For a generation that has grown up with social media integrated into everyday life, the “interplay of personal data with gaining the benefit of using this type of platform is just simply accepted,” Garber added. “We have to be more proactive in a space where this new iteration of the internet will have the capacity to take in so much more data.”

‘Staggering’ amount of data collected in the metaverse

The data collected by metaverse technology is “staggering,” Gottehrer said. Virtual reality equipment can track eye and head movements, heart rates, muscle tension, brain activity and gait patterns. After just a few minutes of use, the “motion signature” created by this data can be used to identify people with 95 percent accuracy.

This data can also identify neurodiversity and some forms of disability that affect movement, such as Parkinson’s disease.

“If you’re a child and this data is already being collected on you, where might that down the road follow you in your life?” Gottehrer asked.

Only a handful of states have specific regulations for the collection of biometric data, but Garber predicted that more states will likely pass similar legislation, albeit “at a glacial pace.”

However, many experts worry that it will not be fast enough, particularly when it comes to protecting children’s digital privacy. “While we know technology moves at a pace that’s much faster than courts or litigation, there’s really a concern that [the Children’s Online Privacy Protection Act] is dragging behind,” Gottehrer said.

Compounding these concerns is the confusion over who should be setting these regulations in the first place. In September, as privacy legislation stalled in Congress, Sens. Ed Markey, D-Mass., and Richard Blumenthal, D-Conn., wrote a letter urging the Federal Trade Commission to use its regulatory authority to update COPPA.

The letter “does not send a great message,” Garber said. And without decisive government action, tech companies currently hold great power to set the standards and practices that will shape the industry’s regulatory development in the future.

“Self-regulation by metaverse stakeholders — is that viable? Is that advantageous?” Gottehrer asked. “I think it’s safe to say we have not seen tremendous success at self-regulation of the current version of the internet — that might be a dramatic understatement.”

For an example of how companies might fail to proactively protect underage users, Gottehrer pointed to Instagram. According to internal documents shown to the Wall Street Journal in September 2021, Facebook knew for some time that Instagram was harmful to the mental health of teenage users, based on internal research, and yet continued to develop products for an even younger audience.

“All of these issues become amplified in an environment where you’re supposed to be completely immersed,” Garber said.

Businesses Should Prepare for More State-Specific Privacy Laws, Attorneys Say

“The privacy landscape in the U.S. is likely to become more complicated before it gets any easier.”

Photos of Joan Stewart, Kathleen Scott and Duane Pozza courtesy of Wiley

WASHINGTON, January 13, 2023 — In the absence of overarching federal legislation, several states are passing or considering their own privacy laws, creating an increasingly disparate legal landscape that may be difficult for national companies to navigate.

“I think the privacy landscape in the U.S. is likely to become more complicated before it gets any easier,” said Joan Stewart, an attorney specializing in privacy, data governance and regulatory compliance, at a webcast hosted by Wiley on Thursday.

New privacy laws in California and Virginia took effect on Jan. 1, and Colorado and Connecticut have privacy laws set to become effective in July. Utah’s privacy law will go into effect at the end of December.

“We expect to see additional states actively considering both omnibus and targeted privacy laws this year,” Stewart said. “So we encourage businesses to focus now on creating universal privacy programs that can adapt to these new laws in the future.”

Although the various state laws have plenty of overlap, there are also several significant outliers, said Kathleen Scott, a privacy and cybersecurity attorney.

States take different approaches to imposing privacy

For example, the new California Privacy Rights Act — which amends and strengthens California’s existing digital privacy law, already considered the strongest in the country — requires that businesses use specific words to describe the categories of personally identifying information being collected.

“These words are unique to California; they come from the statute, and they don’t always make perfect sense outside of that context,” Scott said.

Another area of difference is the consumer’s right to appeal privacy-related decisions. Virginia, Colorado and Connecticut require businesses to offer a process through which they explain to consumers why a specific request was denied.

While implementing a universal standard would make compliance easier for businesses, Scott noted that “processing appeals can be pretty resource intensive, so there may be important reasons not to extend those outlier requirements more broadly to other states.”

Generally speaking, the state privacy laws apply to for-profit businesses and make an exception for nonprofits. However, Colorado’s law applies to for-profit and nonprofit entities that meet certain thresholds, and the Virginia and Connecticut laws carve out select nonprofits as exempt instead of having a blanket exemption.

Other state-to-state differences include specific notices, link requirements and opt-in versus opt-out policies. Even key definitions, such as what qualifies as “sensitive data,” vary from state to state.

Two of the state privacy laws taking effect in 2023 authorize the development of new rules, making it likely that additional expectations are on the horizon.

California will not begin civil and administrative enforcement of the CPRA until July. In the meantime, the state’s new privacy agency is charged with developing rules for its implementation, including specific directives for required notices, automated decision-making and other issues.

“The California rulemaking has been particularly complicated… and the outcome is going to have significant impacts on business practices,” said Duane Pozza, an attorney specializing in privacy, emerging technology and financial practices.

The state’s attorney general is arguing that existing rules require a global opt-out mechanism, but the new law establishes this as optional, Pozza explained. The currently proposed rules would again require a global opt-out.

Colorado’s attorney general is undertaking a similar rulemaking process, revising a previously released draft of the rules in preparation for a February hearing.

Several additional states are expected to propose broad or targeted privacy laws during the coming legislative cycle, according to data published Thursday by the Computer and Communications Industry Association. In addition to comprehensive consumer data privacy legislation, several measures address the collection of biometric information and children’s online safety, the CCIA found.
