Privacy

Zoom CEO Eric Yuan Pledges to Address Security Shortcomings in ‘The Next 90 Days’


Photo of Zoom CEO Eric Yuan

April 20, 2020 — When a Zoom user had his question read out during the “Ask Eric Anything” webinar on Wednesday, Zoom CEO Eric Yuan listened intently.

“Will Zoom be adding more emojis to its social features anytime soon?” the user asked.

Yuan disappointed the user immediately. “We’re not going to allocate any new features to that,” he said. Yuan then made it clear that for “the next 90 days,” Zoom will be “incredibly focused on enhancing our privacy and security.”

The “Ask Eric Anything” webinar, a weekly series in which Zoom users tune in to ask questions about Americans’ go-to video conferencing technology during the pandemic, launched in response to widespread privacy failings in the company’s flagship communications product.

Almost as quickly as the company name became a verb, “zoombombing” entered the national lexicon to describe the act of anonymous trolls entering a Zoom meeting through a neglected URL and posting pornographic, racist, or otherwise inappropriate material.

During the webinar, Zoom announced several updates in an effort to assuage privacy concerns among users — and shareholders. The first thing the company announced was a new hire.

Alex Stamos, the director of the Stanford Internet Observatory and former chief security officer of Facebook, announced on the webinar that he is “Zoom’s new outside advisor.”

“I want to apply my skills to the problem we are facing,” Stamos said. He called Zoom “a critical part of the lives of hundreds of millions of people” and identified education as the “most interesting area” in which Zoom can benefit society.

Defending Zoom against complaints of ‘Zoombombing’

Stamos took the time to defend Zoom from its blemishes in the press, saying that “every single company… will face” security failings and claiming that “there’s never been a company that’s had to scale this quickly.”

Stamos described how Zoom is taking active steps to stop the bleeding by “proactively locking” the “bad actors” before they can compromise an account.

Stamos suggested that much of the responsibility for Zoombombing rests with the host of a meeting. As a result, he implored users to avoid the “mistake” of using “the same password” that they use for other accounts.

“Go get a password manager,” Stamos recommended.

Stamos also expressed optimism on the webinar. He said that Americans are “very versatile and when we find a problem, we find a solution.”

Zoom also announced a new feature rolling out sometime this weekend. Beyond Zoombombing, the company has been sharply criticized for keeping one of its many data servers in China, a country with which the U.S. has privacy disagreements and with which the U.S.-based Zoom has ties.

Occasionally, Yuan admitted, Americans’ data would be sent to China when other data centers were offline, which hypothetically left that data vulnerable to harvesting by the Chinese government.

In response, Oded Gal, chief product officer of Zoom, announced that by April 25, the Chinese server will be deactivated automatically for all users that have not explicitly opted to have their data routed to it.

In addition, Zoom has hired a new cybersecurity firm, Luta Security, to help catch bugs before users do. Luta Security is headed by Katie Moussouris, who worked on similar “bug bounty” programs for Microsoft and the Pentagon.

The CEO expressed faith in these changes, saying he has “much more high confidence now.”

The question is whether this high confidence will transfer to Zoom’s users and shareholders.

David Jelke was a Reporter for Broadband Breakfast. He graduated from Dartmouth College with a degree in neuroscience. Growing up in Miami, he learned to speak Spanish during a study abroad semester in Peru. He is now teaching himself French on his iPhone.


Industry Leaders Urge Biden to Tread Lightly With Federal Privacy Legislation

While generally supportive of a federal privacy law, some experts warned it could harm competition.


Screenshot of Jules Polonetsky, executive director of the Future of Privacy Forum

WASHINGTON, January 27, 2023 — While there is a need for federal privacy regulation, such legislation could harm innovation, research and competition if not carefully designed, according to leaders of top technology trade associations and think tanks.

“In my opinion, there’s a lot of overblown statements about how bad the problems are,” said Rob Atkinson, president of the Information Technology and Innovation Foundation. “And a lot of the regulatory solutions… would actually, in our view, do more harm than good.”

Speaking at a panel hosted by the ITIF on Wednesday, industry experts responded to President Joe Biden’s recent op-ed calling on Congress to unite against Big Tech companies in three areas of concern: privacy, content moderation and competition.

“It’s gotten to this amazing point where we have a lot of the world’s most dominant tech companies, and to see a headline about uniting against Big Tech by the President of the United States, it hurts me greatly,” said Gary Shapiro, president and CEO of the Consumer Technology Association.

One of Biden’s policy recommendations was to place limitations on targeted advertising and ban the practice altogether for children. But Atkinson disagreed with the implication that targeted advertising is an inherent violation of privacy.

“Targeted advertising is almost always done in an anonymous way where the advertiser doesn’t know my name — they just know that I like to ride my bicycle to work, and so they send me a bicycle ad,” he said.

Despite concerns about the extent of certain proposals, the tech leaders were generally supportive of federal privacy legislation. “It’s incredible that the U.S. is one of the very few democracies in the entire world that does not have a comprehensive privacy law,” said Jules Polonetsky, executive director of the Future of Privacy Forum.

A proposed federal privacy law gained strong bipartisan support toward the end of 2022, but disagreements over two key issues have threatened its progress, explained TechNet CEO Linda Moore.

The first is the private right of action, which Moore said must be limited to “make sure we protect those small and medium-sized companies from frivolous lawsuits that will drive them out of business.”

The second disagreement is over whether the federal law should preempt state privacy laws.

Steve DelBianco, executive director of NetChoice, emphasized the need for a “national standard to replace an impossible patchwork of state laws.”

“If the President is serious about data privacy, he should support the preemption of state laws and the presumption of private right of action,” DelBianco said.

Strict European privacy regulation can provide both guidance and caution

Polonetsky pointed to the European Union’s General Data Protection Regulation, often considered to be the world’s strongest privacy law, as a potential guide for U.S. legislation.

“For years, I would have said no — let’s show the American way to do it,” he said. “But now that we’ve tried to do the American way and made a bit of a jumble in many of our proposals, I have new appreciation for the thought that went into GDPR.”

But in addition to appreciating what the GDPR got right, policymakers should also look carefully at the ways in which it may have overly restricted research, Polonetsky added.

Atkinson emphasized that caution, saying that the GDPR’s “important and useful components” did not outweigh the negative impacts of its research limitations.

The implementation of GDPR also had negative implications for competition, Moore said. “The largest companies gained market share, because the smaller and medium sized companies just couldn’t adapt to the new privacy regime and all the hurdles that are put in place.”

Pursuing the strictest possible privacy standard could result in barring all data sharing and interoperability — things that are “necessary for competition,” Polonetsky claimed. “We need an understanding of the balance and the conflicts which do exist between competition and privacy.”

Privacy law is also closely intertwined with Section 230, Polonetsky said. “It’s clear that any significant change to 230 means companies doing an awful lot of surveillance to track things that are said and posted and done.”

Broadband Breakfast’s Big Tech & Speech Summit will feature panels on each of the issues identified in Biden’s op-ed. Learn more and register here.



Metaverse Technologies Could Present Unprecedented Risk to Children’s Digital Privacy

Existing digital privacy concerns are amplified in an environment designed to fully immerse users.


Photo by Alvin Trusty used with permission

WASHINGTON, January 18, 2023 — As immersive virtual reality technologies gain popularity among children and teenagers, there is an increasing need for legislation that specifically addresses the industry’s unprecedented capacity for data collection, said attorneys at a Practicing Law Institute webinar on Friday.

Without downplaying the potential benefits of “metaverse” technology, it is important to understand how it differs from the current internet and how that will impact children, said Leeza Garber, a cybersecurity and privacy attorney.

“When you’re talking about being able to feel something with the haptic gloves, which are in advanced states of development, or even an entirely haptic suit, you’re talking about the potential for cyberbullying, harassment, assault to happen to minors in a completely different playing field — where right now there’s not so much proactive legislation,” Garber said.

Although the metaverse is often framed as a thing of the future, it actually just entails “an immersive, visual, virtual experience,” said Gail Gottehrer, founder and chairperson of the New York State Bar Association’s cybersecurity subcommittee.

Defined as such, the metaverse has already gained widespread popularity. “The next generation of children will spend approximately 10 years in virtual reality in their lives. So that’s the equivalent of around two hours and 45 minutes per day,” Gottehrer said, citing research from the Institution of Engineering and Technology.

The user base of one such platform, Roblox, “includes 50 percent of all kids under the age of 16 in the United States — so it’s huge for minors,” Garber said.

For a generation that has grown up with social media integrated into everyday life, the “interplay of personal data with gaining the benefit of using this type of platform is just simply accepted,” Garber added. “We have to be more proactive in a space where this new iteration of the internet will have the capacity to take in so much more data.”

‘Staggering’ amount of data collected in the metaverse

The data collected by metaverse technology is “staggering,” Gottehrer said. Virtual reality equipment can track eye and head movements, heart rates, muscle tension, brain activity and gait patterns. After just a few minutes of use, the “motion signature” created by this data can be used to identify people with 95 percent accuracy.

This data can also identify neurodiversity and some forms of disability that affect movement, such as Parkinson’s disease.

“If you’re a child and this data is already being collected on you, where might that down the road follow you in your life?” Gottehrer asked.

Only a handful of states have specific regulations for the collection of biometric data, but Garber predicted that more states will likely pass similar legislation, albeit “at a glacial pace.”

However, many experts worry that it will not be fast enough, particularly when it comes to protecting children’s digital privacy. “While we know technology moves at a pace that’s much faster than courts or litigation, there’s really a concern that [the Children’s Online Privacy Protection Act] is dragging behind,” Gottehrer said.

Compounding these concerns is the confusion over who should be setting these regulations in the first place. In September, as privacy legislation stalled in Congress, Sens. Ed Markey, D-Mass., and Richard Blumenthal, D-Conn., wrote a letter urging the Federal Trade Commission to use its regulatory authority to update COPPA.

The letter “does not send a great message,” Garber said. And without decisive government action, tech companies currently hold great power to set the standards and practices that will shape the industry’s regulatory development in the future.

“Self-regulation by metaverse stakeholders — is that viable? Is that advantageous?” Gottehrer asked. “I think it’s safe to say we have not seen tremendous success at self-regulation of the current version of the internet — that might be a dramatic understatement.”

For an example of how companies might fail to proactively protect underage users, Gottehrer pointed to Instagram. According to internal documents shown to the Wall Street Journal in September 2021, Facebook had known for some time, based on its own research, that Instagram was harmful to the mental health of teenage users, yet it continued to develop products for an even younger audience.

“All of these issues become amplified in an environment where you’re supposed to be completely immersed,” Garber said.



Businesses Should Prepare for More State-Specific Privacy Laws, Attorneys Say

“The privacy landscape in the U.S. is likely to become more complicated before it gets any easier.”


Photos of Joan Stewart, Kathleen Scott and Duane Pozza courtesy of Wiley

WASHINGTON, January 13, 2023 — In the absence of overarching federal legislation, several states are passing or considering their own privacy laws, creating an increasingly disparate legal landscape that may be difficult for national companies to navigate.

“I think the privacy landscape in the U.S. is likely to become more complicated before it gets any easier,” said Joan Stewart, an attorney specializing in privacy, data governance and regulatory compliance, at a webcast hosted by Wiley on Thursday.

New privacy laws in California and Virginia took effect on Jan. 1, and Colorado and Connecticut have privacy laws set to become effective in July. Utah’s privacy law will go into effect at the end of December.

“We expect to see additional states actively considering both omnibus and targeted privacy laws this year,” Stewart said. “So we encourage businesses to focus now on creating universal privacy programs that can adapt to these new laws in the future.”

Although the various state laws have plenty of overlap, there are also several significant outliers, said Kathleen Scott, a privacy and cybersecurity attorney.

States take different approaches to imposing privacy

For example, the new California Privacy Rights Act — which amends and strengthens California’s existing digital privacy law, already considered the strongest in the country — requires that businesses use specific words to describe the categories of personally identifying information being collected.

“These words are unique to California; they come from the statute, and they don’t always make perfect sense outside of that context,” Scott said.

Another area of difference is the consumer’s right to appeal privacy-related decisions. Virginia, Colorado and Connecticut require businesses to offer a process through which they explain to consumers why a specific request was denied.

While implementing a universal standard would make compliance easier for businesses, Scott noted that “processing appeals can be pretty resource intensive, so there may be important reasons not to extend those outlier requirements more broadly to other states.”

Generally speaking, the state privacy laws apply to for-profit businesses and make an exception for nonprofits. However, Colorado’s law applies to for-profit and nonprofit entities that meet certain thresholds, and the Virginia and Connecticut laws carve out select nonprofits as exempt instead of having a blanket exemption.

Other state-to-state differences include specific notices, link requirements and opt-in versus opt-out policies. Even key definitions, such as what qualifies as “sensitive data,” vary from state to state.

Two of the state privacy laws taking effect in 2023 authorize the development of new rules, making it likely that additional requirements are on the horizon.

California will not begin civil and administrative enforcement of the CPRA until July. In the meantime, the state’s new privacy agency is charged with developing rules for its implementation, including specific directives for required notices, automated decision-making and other issues.

“The California rulemaking has been particularly complicated… and the outcome is going to have significant impacts on business practices,” said Duane Pozza, an attorney specializing in privacy, emerging technology and financial practices.

The state’s attorney general is arguing that existing rules require a global opt-out mechanism, but the new law establishes this as optional, Pozza explained. The currently proposed rules would again require a global opt-out.

Colorado’s attorney general is undertaking a similar rulemaking process, revising a previously released draft of the rules in preparation for a February hearing.

Several additional states are expected to propose broad or targeted privacy laws during the coming legislative cycle, according to data published Thursday by the Computer and Communications Industry Association. In addition to comprehensive consumer data privacy legislation, several measures address the collection of biometric information and children’s online safety, the CCIA found.

