Privacy Policy Customization Has Both Benefits and Drawbacks, Say PrivacyCon Participants

Screenshot of Federal Trade Commission PrivacyCon webcast

July 21, 2020 — Allowing users of online platforms to shape the use of their own private information can be a tricky practice, and not necessarily one that platforms are incentivized to employ, said participants in a Federal Trade Commission PrivacyCon webinar on Tuesday.

Speaking about the General Data Protection Regulation, a European Union law that allows users to decide how the data they give to websites is used, panelists said that such legislation is often difficult to employ and may come with adverse effects.

“On the one hand, consumers increasingly would like control over the data firms collect,” said Guy Aridor, an economics PhD candidate at Columbia University. “…On the other hand, firms are reliant on this data. There is a worry that this will impact their function.”

Aridor has done extensive research into the GDPR and recently published The Effect of Privacy Regulation on the Data Industry: Empirical Evidence from GDPR.

Garrett Johnson, who has also authored research on the consequences of the GDPR, said that customizable privacy policies could harm competition.

“Our main research question is, can privacy policies hurt competition?” he said. “The GDPR is complex, but its many elements increase the logistical cost and legal risk of processing personal data. This will have important consequences for the web.”

Johnson added that websites must share what data they collect, but the data can be difficult to track.

“In order to provide these services, vendors have to share what the GDPR considers personal data,” he said. “As a result, they have faced scrutiny in three countries…[But] the GDPR is challenging to study because normally we can’t observe how they use data.”

Jeff Prince, chief economist at the Federal Communications Commission, said that the GDPR has decreased the number of online vendors.

He said that research has shown a 15 percent reduction in vendor use post-GDPR.

Screenshot from FTC webcast

He also said that he and the agency were researching the value of users’ data.

“We are looking at how much privacy is worth around the world,” he said. “At a rough level, we can think about balancing privacy preferences for citizens with benefits for use of the data. One thing that has been emphasized is that it’s particularly difficult to measure the privacy preferences. That’s something we are trying to get at with this.”

In a companion webinar, Hana Habib, a PhD student at Carnegie Mellon University, said that her research found that a lack of cohesive privacy controls across platforms makes privacy choices difficult from one website to the next.

“Our empirical analysis found that privacy choices were often provided in privacy policies,” she said. “The downside of that, other than consumers largely ignoring privacy policies, is that the headings under which choices are presented are inconsistent from policy to policy.”

When it comes to customizable privacy policies and individualized use of content, Prince said that measurements of choice can be difficult but useful.

“[We] did some measures for the value of privacy with regards to apps,” he said. “…This is one reason why quantification is valuable. A lot of times [the choice to surrender data] might not line up with what quantifiable metrics would be.”

Demand for privacy controls is not unique to the United States but appears across all the countries covered by his research, Prince continued.

“That was one of the big takeaways for me,” he said. “When we think about privacy policies and how people value privacy in a relative sense across countries and different types, there wasn’t that big of a difference across those countries.”

Privacy experts and users of platforms like Facebook and Google have often accused those companies of abusing user data while offering nothing in return. While Facebook and Google have both made public statements describing their privacy practices and promising to take data collection seriously, some experts believe the companies are not sufficiently incentivized to make major changes.


Led by Wyden, Democrats Call on NTIA to Reform Privacy Standards for .US Domains

The Democratic legislators called on the NTIA to end the automatic disclosure of .US web domain users’ personal information.

Photo of Sen. Ron Wyden, D-Ore., obtained from Flickr

WASHINGTON, September 21, 2022 – A bicameral coalition led by Sen. Ron Wyden, D-Ore., on Wednesday called on the National Telecommunications and Information Administration to end the automatic disclosure of .US web domain users’ sensitive personal information.

The all-Democrat coalition – including Sen. Elizabeth Warren, D-Mass.; Sen. Brian Schatz, D-Hawaii; and Rep. Anna Eshoo, D-Calif. – laid out its concerns Wednesday in a letter to Administrator Alan Davidson of the NTIA:

“It is highly concerning that NTIA, since at least 2005, has not directed its contractors administering .US to adopt any protections for this sensitive information. The automatic public disclosure of users’ personal information puts them at enhanced risk for becoming victims of identity theft, spamming, spoofing, doxxing, online harassment, and even physical harm,” the coalition wrote.

Rejecting the NTIA’s current disclosure policy, the coalition called anonymity “a necessary component of free speech” and argued that with better privacy protections, .US domains would be more attractive to new website creators.

Besides making .US users’ information private, the letter recommends requiring users’ “affirmative, informed consent” for all third-party data transfers, strengthening barriers against law-enforcement investigations, and notifying users if a foreign government seeks access to their data. The coalition stated that instituting stronger privacy measures wouldn’t increase rates of online crime.

“A privacy-protective .US should support NTIA in these negotiations by providing a model for best practices in the broader domain name ecosystem. We urge you to continue the fight for privacy, expression, and human rights,” the letter said.


EU’s Digital Services Act May Be a Model for the United States

The Digital Services Act imposes transparency requirements and other accountability measures for tech platforms.

Published

on

Photo of Mathias Vermeulen, public policy director at the AWO Agency, obtained from Flickr.

September 16, 2022 – The European Union’s Digital Services Act, particularly its data-sharing requirements, may become the model for future American tech policy, said Mathias Vermeulen, public policy director at the AWO Agency, at a German Marshall Fund web panel Monday.

Now in the final stages of becoming law, the DSA aims to create a safer internet by introducing transparency requirements and other accountability measures for covered platforms. Of note to the German Marshall Fund panelists was the DSA’s provision that, when cleared by regulators, “very large online platforms” – e.g., Facebook and Twitter – must provide data to third-party researchers for the purpose of ensuring DSA compliance.

In addition, the EU’s voluntary Code of Practice on Disinformation was unveiled in June, requiring opted-in platforms to combat disinformation by introducing bot-elimination schemes, demonetizing sources of alleged misinformation, and labeling political advertisements, among other measures. Signatories of the Code of Practice – including American tech giants Google Search, LinkedIn, Meta, Microsoft Bing, and Twitter – also agreed to proactively share data with researchers.

Vermeulen said that he expects the EU will soon draft new legislation to address the privacy concerns raised by the Digital Services Act’s data-sharing requirements.

The risks of large-scale data sharing

To protect user privacy, the DSA requires data handed over to researchers to be anonymized. Many experts believe, however, that “anonymous” data is often traceable back to its source. Even the EU’s recommendations on data-anonymization best practices acknowledge the inherent privacy risks:

“Data controllers should consider that an anonymised dataset can still present residual risks to data subjects. Indeed, on the one hand, anonymisation and re-identification are active fields of research and new discoveries are regularly published, and on the other hand even anonymised data, like statistics, may be used to enrich existing profiles of individuals, thus creating new data protection issues.”
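The re-identification risk the guidance describes is often demonstrated with a linkage attack: an “anonymized” dataset that retains quasi-identifiers can be joined against a public dataset carrying the same fields plus names. The records, field names, and values below are invented purely for illustration:

```python
# Toy linkage attack: two datasets share the quasi-identifiers
# (ZIP code, birth year, gender), and the public one also has names.
anonymized = [
    {"zip": "02138", "birth_year": 1945, "gender": "F", "diagnosis": "asthma"},
    {"zip": "02139", "birth_year": 1972, "gender": "M", "diagnosis": "flu"},
]

public_roll = [
    {"name": "A. Smith", "zip": "02138", "birth_year": 1945, "gender": "F"},
    {"name": "B. Jones", "zip": "02140", "birth_year": 1980, "gender": "M"},
]

def reidentify(anon_rows, public_rows):
    """Match rows whose quasi-identifiers agree exactly."""
    keys = ("zip", "birth_year", "gender")
    # Index the public dataset by its quasi-identifier tuple.
    index = {tuple(p[k] for k in keys): p["name"] for p in public_rows}
    matches = {}
    for row in anon_rows:
        key = tuple(row[k] for k in keys)
        if key in index:  # an exact match re-attaches a name to the record
            matches[index[key]] = row["diagnosis"]
    return matches

print(reidentify(anonymized, public_roll))  # {'A. Smith': 'asthma'}
```

Stripping names alone is not enough: any combination of retained fields that is unique to one person acts as an identifier, which is why the EU guidance treats anonymization as carrying residual risk rather than eliminating it.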

An essay from the Brookings Institution – generally supportive of the DSA’s data-sharing provisions – argues that many private researchers do not have the experience necessary to securely store sensitive data, recommending that the EU Commission establish or subsidize secure centralized databases.


Jeff Pulver and Noah Rafalko: A Humble Request to the FCC on Robocalls

Blocking bad actors requires a whole new way of thinking, the authors say in this ExpertOp exclusive to Broadband Breakfast.

The authors of this Expert Opinion are Jeff Pulver (left), an innovator in VoIP, and Noah Rafalko, a pioneer in TNID.

Should the Federal Communications Commission seek out alternative platforms to solve its 2022 spam, scam and robocall issues? Yes! Does blockchain offer valuable solutions? Yes! We would like to ask the FCC to increase the width of its lens when it comes to deploying solutions to its growing number of systemic challenges.

Any action to stop robocall insanity and tech-driven scams would be welcome. While Americans deal with the lingering pandemic, mass shootings, an uncertain economy and war in Europe, the constant annoyance from scammers and 4.1 billion robocalls a month is just too much. Most people have responded by giving up voice communications altogether.

The recently mandated STIR/SHAKEN framework is a step in the right direction, but it is not a long-term solution. The FCC is simply taking old standards and applying them to new technologies. New thinking is needed; the next generation of technology must be explored. And the most promising of the new tools to protect our telecommunications system from fraudulent players lies in blockchain.

The key to stopping these nefarious acts lies in a digital identity solution powered by blockchain – a shared database or ledger. An identity solution enables customers to be confident that a communication is truly from an enterprise they know and trust.

With blockchain, only authorized and verified messages get through. Spam and robocalls are virtually eliminated in one shot. All that’s required is a slight change in how we approach communications.
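As a rough sketch of the verification pattern the authors describe — a shared registry that callers are checked against — consider the following. The article does not detail TNID’s actual mechanism, so the registry, keys, and phone numbers here are invented, and a plain dictionary plus an HMAC stands in for a real distributed ledger and signature scheme:

```python
import hashlib
import hmac

# Hypothetical registry: maps verified phone numbers to signing keys
# registered by the enterprises that own them. In a blockchain design
# this mapping would live on a shared, tamper-evident ledger.
ledger = {"+15551230000": b"enterprise-secret-key"}  # invented entry

def sign(number, message, key):
    """Produce an authentication tag binding the number to the message."""
    return hmac.new(key, number.encode() + message, hashlib.sha256).hexdigest()

def verify(number, message, signature):
    """Accept only messages signed with the key registered to the number."""
    key = ledger.get(number)
    if key is None:          # unregistered numbers are rejected outright
        return False
    expected = sign(number, message, key)
    return hmac.compare_digest(expected, signature)

tag = sign("+15551230000", b"Your package shipped", ledger["+15551230000"])
print(verify("+15551230000", b"Your package shipped", tag))  # True
print(verify("+15559999999", b"You won a prize!", "bogus"))  # False
```

The point of the pattern is that a spoofed number fails the registry lookup and a forged message fails the signature check, so unverified traffic can be dropped before it reaches the consumer.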

In a world where consumers are already doing whatever they can to self-manage their identity, it isn’t a large leap of faith to imagine adding a certified, digital ID to our telephone numbers.

Consumers freely use their telephone numbers to attest and manage their identity – even more than they use their Social Security numbers, birthdays, mother’s maiden name and secret questions. In our current digital universe, consumers use their phone numbers to register for store discounts, receive health and safety alerts and even transfer money to others.

And in their effort to stop spam and robocalls, consumers willingly add apps such as Hiya, paying over $300 million a year to these intermediaries.

The FCC needs to evolve and embrace the technology that lets consumers and mobile carriers, who have a shared stake, attest their identities. It needs to recognize that blockchain technology offers an elegant, all-encompassing solution to the $40 billion in fraud that consumers fall victim to every year.

It’s time we leveraged a solution that’s already being used in other countries such as India, where blockchain technology helps protect over 600 million citizens from spam and robocalls.

Back in 2004, when the future of telecommunications was being written, the FCC was challenged with laying down rules governing Voice over Internet Protocol (VoIP). At that time, we hosted brown-bag lunches for Congress, and held open demonstration days at the FCC as well as a mini-trade show on the Hill in our effort to inform and educate Congress, staffers and other government employees on the latest and greatest innovations in Internet communications technology.

The FCC would be wise to revisit this practice of show and tell where they hear from the innovators of new game-changing technologies that can solve their biggest concerns. It certainly is wiser than simply taking advice handed down from lobbyists and relying on legislation that’s severely limited and unenforceable.

When the FCC uses its influence to investigate and embrace new and innovative technologies, they can finally make significant headway in restoring trust in the quality of service associated with our communications.

Jeff Pulver is an innovator in the field of Voice over Internet Protocol (VoIP). He was instrumental in changing how the FCC classified VoIP in 2004, paving the way for the development of video and voice internet communications. The co-founder of Vonage, Jeff has invested in over 400 start-ups. 

Noah Rafalko is a pioneer in TNID (Telephone Number ID), a blockchain solution that restores trust in communications. Noah is founder and CEO of TSG Global, Inc., which provides voice, messaging and identity management services for SaaS companies and large enterprises. This piece is exclusive to Broadband Breakfast.

Broadband Breakfast accepts commentary from informed observers of the broadband scene. Please send pieces to commentary@breakfast.media. The views reflected in Expert Opinion pieces do not necessarily reflect the views of Broadband Breakfast and Breakfast Media LLC.
