Big Tech

The Future of Privacy in a Social Media and Networking World

WASHINGTON, April 19, 2012 – First search, then social media, and now privacy concerns? The digital world has transformed the way consumers access content: one can search for an article of interest, subscribe to Google Reader, follow a friend’s recommendation from Twitter or read an article a co-worker read via the Washington Post Social Reader on Facebook. With content companies and social media companies competing for ad dollars and the eyes of consumers, where do the privacy concerns come in, and how are government and industry dealing with them?

Event Highlights

“The Future of Privacy in a Social Media and Networking World” from BroadbandBreakfast.com

Complete Program

“The Future of Privacy in a Social Media and Networking World” from BroadbandBreakfast.com

At Tuesday’s Broadband Breakfast “Social Networking, the End of Media and Future of Privacy,” government officials, industry representatives and advertising experts all weighed in on this question and many more.

Julie Brill, Commissioner of the Federal Trade Commission, gave the opening keynote remarks, summarizing the FTC’s recent report setting out a new industry privacy framework for a nation that loves to share.

“Before our children can walk or talk, we teach them to share,” said Brill. “So it is no wonder that we have flocked to social media, a platform based on sharing, to share everything from our birth dates to films of our child’s birth.”

Social media has changed the model for news and the way businesses interact with consumers. Brill cited an Ad Age report showing that businesses will spend 27% of their digital budgets on social media advertising in the upcoming year, 20% more than in the previous 12 months.

While these numbers are no surprise, concerns about privacy in the social media landscape continue to escalate. Brill explained this with a simple analogy: “taking is not sharing…Many privacy problems online arise when companies forget the basic principles of the playroom.”

Brill used the recent Facebook, Google Buzz and Twitter settlements to highlight the need for industry-wide, comprehensive privacy programs that disclose changes to privacy policies and information-sharing practices and protect against breaches of private user data. These, as well as “cases we’ve brought involving new platforms like mobile apps, children’s online services, and data brokers – led us to realize it was time to update our approach to protecting consumers’ privacy.” Brill continued, “We had to take account of the vast changes in technology, the myriad new ways that consumers’ information is collected and used, and the need to better communicate these new practices to consumers.”

The framework laid out by the FTC a couple of weeks ago articulates best practices for companies that collect consumer data and will help companies develop comprehensive privacy and data security practices.

There are three components of the framework:

Privacy by Design – A call for companies to integrate privacy and security protections into new products.

Simplified Choices for Businesses and Consumers – Consumers should be given clear and simple choices that they can implement at convenient and relevant times.

Greater Transparency – Companies should provide more information about how they collect and use personal data.

One way companies can simplify choice, according to Brill, is through an industry-developed Do Not Track mechanism. By developing browser tools and icon-and-cookie-based mechanisms, promising to make these mechanisms interoperable and creating technical standards, Do Not Track “has the potential to provide consumers with simple and clear information about online data and use practices, and to allow consumers to make choices in connection with those practices.”
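
Mechanically, the browser-side piece of such a mechanism is simple: the Do Not Track proposal signals the user’s preference through an HTTP request header, `DNT: 1`. As a rough sketch of how a site could honor that signal (the header name matches the draft standard, but the handler function here is purely illustrative):

```python
# Illustrative sketch: honoring the proposed Do Not Track signal.
# A browser with DNT enabled sends the request header "DNT: 1";
# a compliant site would skip behavioral tracking for that request.

def should_track(headers: dict) -> bool:
    """Return False when the visitor's browser sends the DNT opt-out."""
    return headers.get("DNT") != "1"

# A browser with Do Not Track enabled:
print(should_track({"DNT": "1"}))  # False: skip behavioral tracking
# A browser that expresses no preference:
print(should_track({}))            # True: default behavior applies
```

The hard parts Brill describes are not in the header itself but in getting ad networks to agree on what “tracking” covers and to respond to the signal interoperably.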

Brill does not believe that Do Not Track will automatically cause users to opt out of tracking. She added, “Google offers its users the ability to refine the types of ads they see through its ‘Ad Preferences’ dashboard, and it also offers its users the ability to opt out of tracking entirely. Consumers seem to appreciate knowing how Google has sized up their interests, and they overwhelmingly exercise more granular choices to adjust the ads they will see, rather than opt out.”

The Commissioner hopes that with a Do Not Track mechanism in place by the end of the year, companies will see improvements in user experience that will lead to greater consumer trust.

Do Not Track is the first action item for implementing the privacy report’s recommendations; the FTC lays out four others.

Second, the Commission is calling on mobile companies to work towards improved privacy protections, including the development of short and meaningful disclosures. On May 30, FTC staff will hold a workshop to address the issue of making mobile privacy disclosures short and effective.

Third, the Commission supports legislation to provide consumers with access to information about them held by so-called data brokers. The Commission asks data brokers that compile data for marketing purposes to explore creating a centralized database where they could identify themselves to consumers and explain how they collect data. Additionally, these brokers should detail the access rights and choices they provide with respect to the consumer data they hold.

The fourth action item calls on the Commission to hold workshops addressing the concerns that arise from comprehensive tracking of online consumers by ISPs and large social networks.

Finally, the last item asks the FTC to work with the Department of Commerce to help develop the sector-specific codes of conduct articulated in the Administration’s White Paper on Privacy.

In addressing a reporter’s question about whether the FTC and industry were on the same page regarding what Do Not Track really means for data collection, Brill acknowledged that an important issue is how much collection will be part of the choice consumers are given through Do Not Track. Brill explained that the issue of collection is what prompted the FTC to change its language and propose that the choices given to consumers through Do Not Track depend on the context of the transaction rather than on the industry’s commonly accepted practices. Context of the transaction is a much more objective test and is designed not to be a fixed list but rather to allow for innovation and growth in what industry can do and what will be appropriate for collection.

“There is consensus around the notion that there needs to be collection limitation; the question is where the boundaries are going to be, but there is sure but slow progress being made in this area,” said Brill.

Finally, when asked about the FTC’s role in determining who is an actual data broker, Brill noted that there is a lot the FTC still does not know, which is why it needs to hear more from industry. Brill wants to focus on the low-hanging fruit and those companies that recognize themselves as data brokers. “Let’s get them to engage in more transparent activities, to have some sort of website or portal where they inform consumers who they are…there’s one place where consumers can go to find them and then consumers can find out about what access and collection rights that entity offers. Then I think we can start to talk about drawing the lines, where the grey areas are and who is on each side of that line.”

Drew Clark, founder and publisher of BroadbandBreakfast.com, then turned the discussion of privacy, social networks and the media over to a panel of industry experts to get their take. The panel included Bruce Gottlieb, general counsel of Atlantic Media Company; Sarah Hudgins, director of public policy at the Interactive Advertising Bureau; and Jules Polonetsky, director and co-chair of the Future of Privacy Forum.

Gottlieb believes that social networking and media have been a plus for Atlantic because they create a powerful tool to increase traffic and build a new economic foundation. Small companies can expand rapidly and grow their audiences and revenue. He dismissed the notion that behavioral advertising is bad for publishers of premium content because it takes viewers away from the content.

There is $32 billion in online ad revenue that did not exist five years ago, noted Gottlieb: “the issue is that 68% of that pie goes to five players (Google, Yahoo, Facebook …) and that becomes a threat.”

Clark asked Hudgins to go into further detail about how interactive advertising has changed with the advent of search and social.

“Organic search,” explained Hudgins, “still drives a lot of viewers. Just as search was disruptive for publishers, social media will be disruptive for publishers. One constant is that it is still just a new front page. In this instance, one advantage for publishers is the engagement factor – content is still king; it is just that engagement is the new queen.”

“Social media now drives users to content and content can become advertising for a brand.” The next step, noted Hudgins, is how you keep users there, and that is where the site’s inventory comes into play.

At the same time, the drivers of ad revenue are becoming more and more complex, focusing on time on site, unique viewers and much more. Images and displays, consumer behavior, social engagement, gamification and mobile platforms are all key components of keeping viewers on sites and driving up ad revenue.

Clark moved the discussion towards privacy by asking Polonetsky how he views privacy from a consumer standpoint versus a publisher standpoint.

Polonetsky, who represents an industry-supported think tank, believes that people are always looking for the perfect solution to a problem. “Companies are trying to do useful things,” said Polonetsky, but the ecosystem tends to hold them back. His organization put forward the idea of Do Not Track for behavioral advertisements over a year ago, but at that time the industry and ecosystem did not accept it. Some people do not want any tracking; on the other hand, tracking is essential to return on investment.

Hudgins believes that companies now have to compete based on privacy controls. She suggested that consumers are becoming savvier about privacy settings, and transparency about privacy settings and application data sharing is therefore increasingly essential.

Polonetsky was wary of giving consumers too much choice when it comes to data collection and privacy. If certain information is useful for sites to have, like data on when a browser crashes, “Why ask?” he said. “People will say no just for the hell of it.”

People need to understand that some things that happen by default are OK. “Too much choice on micro issues leads to uninformed decisions,” said Polonetsky.

Gottlieb chimed in on a personal note about choices and privacy settings: he would like to make a set of decisions once and have them transfer across all the platforms and devices he uses. “Developing common languages and letting that flow through the technologies would be very useful.”

Moving on to the discussion of data brokers and data aggregators, Hudgins saw a real challenge in drawing the line between the two. While many companies have data about consumers, most of it is in aggregated formats, and most of these companies may have no way to parse that data out for consumers in identifiable forms. “The term data brokers can sweep in a lot of companies for practices that may or may not be a big part of what they do.”

Before we start to legislate on privacy, Hudgins believes we need to wait and see what industry can do. She also believes the most effective regulator is the media: if there is a problematic change, there is consumer outcry, and media reporting can create or destroy brand trust.

Gottlieb is skeptical that consumers choose between services based on privacy rather than price and added features. “Privacy is more of a 0 or 1, you don’t notice it till it pisses you off – what pisses you off is what shows up on the front page of the Times.” He believes the bully pulpit has led to more reform than anything else.

As Deputy Editor, Chris Naoum is curating expert opinions, and writing and editing articles on Broadband Breakfast issue areas. Chris served as Policy Counsel for Future of Music Coalition, Legal Research Fellow for the Benton Foundation and law clerk for a media company, and previously worked as a legal clerk in the office of Federal Communications Commissioner Jonathan Adelstein. He received his B.A. from Emory University and his J.D. and M.A. in Television, Radio and Film Policy from Syracuse University.

Antitrust

Federal Trade Commission Will Likely Not Be Able to Implement Competition Rules, Panelists Say

Panelists at TechFreedom event said judiciary will prevent the FTC from developing proposed antitrust policies.

Photo of Peter Wallison from C-SPAN

WASHINGTON, October 22, 2021 – The Federal Trade Commission’s attempts to use rulemaking authority to issue antitrust policy governing technology companies will be struck down in federal courts, said panelists at a TechFreedom event on Thursday.

Recently formed conservative majorities on the Supreme Court and other courts have expressed opposition to the idea that the FTC possesses such rulemaking authority, these panelists said.

Hence, unlike past Supreme Courts, the current bench is likely to strike down FTC-issued binding rules.

Panelists highlighted former President Donald Trump appointees Brett Kavanaugh and Neil Gorsuch as justices who have opposed legal reasoning often used to permit FTC rulemaking.

Indeed, some panelists said the early 20th-century legislation governing the FTC makes the case that the agency was created as an investigative body rather than a regulatory one.

Peter Wallison, senior fellow emeritus at the American Enterprise Institute, said that five or six Supreme Court justices would ultimately vote to weaken precedents that allow for FTC rulemaking.

The Judiciary Committee of the House of Representatives recently advanced six antitrust bills that attempt to regulate the tech industry and foster greater competition, including the Ending Platform Monopolies Act and the Platform Competition and Opportunity Act.

FTC rules have taken on increased importance in economic regulation because partisan gridlock frequently prevents Congress from passing major legislation. The FTC has proposed new procedures to ensure competition since Lina Khan was appointed chair.

However, NERA Economic Consulting on Wednesday concluded that legislative proposals to regulate competition would impose costs of around $300 billion while impacting 13 additional American companies in the near term and more than 100 companies in the next decade.

Study author Christian Dippon contends that the legislation would limit American startup growth and international competitiveness while at the same time increasing costs for Americans.

Section 230

Democrats Use Whistleblower Testimony to Launch New Effort at Changing Section 230

The Justice Against Malicious Algorithms Act seeks to target large online platforms that push harmful content.

Rep. Anna Eshoo, D-California

WASHINGTON, October 14, 2021 – House Democrats are preparing to introduce legislation Friday that would remove legal immunities for companies that knowingly allow content that is physically or emotionally damaging to their users, following testimony last week from a Facebook whistleblower who claimed the company is able to push harmful content because of such legal protections.

The Justice Against Malicious Algorithms Act would amend Section 230 of the Communications Decency Act – which provides legal liability protections to companies for the content their users post on their platform – to remove that shield when the platform “knowingly or recklessly uses an algorithm or other technology to recommend content that materially contributes to physical or severe emotional injury,” according to a Thursday press release, which noted that the legislation will not apply to small online platforms with fewer than five million unique monthly visitors or users.

The legislation is relatively narrow in its target: algorithms that rely on a user’s personal history to recommend content. It won’t apply to search features or algorithms that do not rely on that personalization, and it won’t apply to web hosting or data storage and transfer.

Reps. Anna Eshoo, D-California, Frank Pallone Jr., D-New Jersey, Mike Doyle, D-Pennsylvania, and Jan Schakowsky, D-Illinois, plan to introduce the legislation a little over a week after Facebook whistleblower Frances Haugen alleged that the company misrepresents how much offending content it terminates.

Citing Haugen’s testimony before the Senate on October 5, Eshoo said in the release that “Facebook is knowingly amplifying harmful content and abusing the immunity of Section 230 well beyond congressional intent.

“The Justice Against Malicious Algorithms Act ensures courts can hold platforms accountable when they knowingly or recklessly recommend content that materially contributes to harm. This approach builds on my bill, the Protecting Americans from Dangerous Algorithms Act, and I’m proud to partner with my colleagues on this important legislation.”

The Protecting Americans from Dangerous Algorithms Act was introduced with Rep. Tom Malinowski, D-New Jersey, last October to hold companies responsible for “algorithmic amplification of harmful, radicalizing content that leads to offline violence.”

From Haugen testimony to legislation

Haugen claimed in her Senate testimony that according to internal research estimates, Facebook acts against just three to five percent of hate speech and 0.6 percent of violence incitement.

“The reality is that we’ve seen from repeated documents in my disclosures is that Facebook’s AI systems only catch a very tiny minority of offending content and best content scenario in the case of something like hate speech at most they will ever get 10 to 20 percent,” Haugen testified.

Haugen was catapulted into the national spotlight after she revealed herself on the television program 60 Minutes to be the person who leaked documents to the Wall Street Journal and the Securities and Exchange Commission that reportedly showed Facebook knew about the mental health harm its photo-sharing app Instagram has on teens but allegedly ignored it because it inconvenienced the company’s profit motive.

Earlier this year, Facebook CEO Mark Zuckerberg said the company was developing an Instagram version for kids under 13. But following the Journal story and calls by lawmakers to back down from pursuing the app, Facebook suspended its development and said it was making changes to its apps to “nudge” users away from content that may be harmful to them.

Haugen’s testimony versus Zuckerberg’s Section 230 vision

In his testimony before the House Energy and Commerce Committee in March, Zuckerberg claimed that the company’s hate speech removal policy “has long been the broadest and most aggressive in the industry.”

This claim has been the basis for the CEO’s suggestion that Section 230 be amended to punish companies for not creating removal systems for violent and hateful content proportional to the company’s or platform’s size. In other words, larger sites would face more regulation and smaller sites less.

Or in Zuckerberg’s words to Congress, “platforms’ intermediary liability protection for certain types of unlawful content [should be made] conditional on companies’ ability to meet best practices to combat the spread of harmful content.”

Facebook has previously pushed for FOSTA-SESTA, a controversial 2018 law which created an exception to Section 230 for advertisements related to prostitution. Lawmakers have proposed other modifications to the liability provision, including removing protections for content the platform is paid for and for allowing the spread of vaccine misinformation.

Zuckerberg said companies shouldn’t be held responsible for individual pieces of content which could or would evade the systems in place so long as the company has demonstrated the ability and procedure of “adequate systems to address unlawful content.” That, he said, is predicated on transparency.

But according to Haugen, “Facebook’s closed design means it has no oversight — even from its own Oversight Board, which is as blind as the public. Only Facebook knows how it personalizes your feed for you. It hides behind walls that keep the eyes of researchers and regulators from understanding the true dynamics of the system.” She also alleges that Facebook’s leadership hides “vital information” from the public and global governments.

An Electronic Frontier Foundation study found that Facebook lags behind competitors on issues of transparency.

Where the parties agree

Zuckerberg and Haugen do agree that Section 230 should be amended. Haugen would amend Section 230 “to make Facebook responsible for the consequences of their intentional ranking decisions,” meaning that practices such as engagement-based ranking would be evaluated for the incendiary or violent content they promote above more mundane content. If Facebook is choosing to promote content which damages mental health or incites violence, Haugen’s vision of Section 230 would hold them accountable. This change would not hold Facebook responsible for user-generated content, only the promotion of harmful content.

Both have also called for a third-party body to be created by the legislature which provides oversight on platforms like Facebook.

Haugen asks that this body be able to conduct independent audits of Facebook’s data, algorithms and research, and that the information be made available to the public, scholars and researchers to interpret, with adequate privacy protection and anonymization in place. Besides asking that the body take into account the size and scope of the platforms it regulates, Zuckerberg asks that its practices be “fair and clear” and that unrelated issues “like encryption or privacy changes” be dealt with separately.

With reporting from Riley Steward

Big Tech

OECD Ratifies Global 15% Digital Tax Rate, Aims For 2023 Implementation

The OECD finalized an earlier agreement that would impose a 15% tax on companies operating in 136 member nations.

US Treasury Secretary Janet Yellen.

WASHINGTON, October 11, 2021 – The Organization for Economic Cooperation and Development on Friday finalized an agreement to levy a 15 percent tax rate on digital multinational businesses, like Amazon, Apple, Google, and Facebook, starting in 2023.

The ratification of the tax rate comes after years of negotiations, and after individual countries proposed their own tax systems to keep up with internet businesses that have long skirted the tax laws of the nations they operate in because they don’t necessarily have a physical presence inside those borders. The Liberal Party in Canada, for example, had proposed a 3 percent tax on revenues obtained inside the country, while Britain, France, Italy and Spain had been contemplating digital sales taxes of their own.

The 15 percent tax rate has been signed onto by 136 of 140 negotiating states (Kenya, Nigeria, Sri Lanka and Pakistan did not join), including all OECD and G20 countries, and finalizes a July political agreement to reform international tax rules. The United States had proposed the 15 percent global corporate tax rate earlier this year.

Hungary and Ireland, the latter of which is a corporate tax haven for companies like Apple and Google, were two of the last holdouts. Hungary agreed to join Friday after it was guaranteed a ten-year rollout period for the regulation, and Ireland agreed Thursday after guarantees that the rate would not be subsequently increased.

The new tax rate is expected to generate US $150 billion annually for the countries involved and targets companies with revenues of over 750 million Euros. “The global minimum tax agreement does not seek to eliminate tax competition, but puts multilaterally agreed limitations on it,” the OECD said, adding the tax will not only stabilize the international tax system but also provide companies with more certainty as to their obligations.
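
As a simplified illustration of how such a minimum rate is commonly described as working (this sketch assumes a basic “top-up” mechanism and is not a rendering of the OECD’s actual model rules), a company taxed below 15 percent in some jurisdiction would owe additional tax to bring its effective rate up to the floor, provided its group revenue exceeds the 750 million euro threshold:

```python
# Simplified, assumed mechanics of a 15% global minimum tax:
# a group over the revenue threshold pays a "top-up" equal to the
# shortfall between the minimum rate and its effective rate,
# applied to its profits.

MIN_RATE = 0.15
REVENUE_THRESHOLD_EUR = 750_000_000

def top_up_tax(revenue_eur: float, profit_eur: float, effective_rate: float) -> float:
    """Extra tax owed under the minimum-rate rule; zero if below threshold."""
    if revenue_eur < REVENUE_THRESHOLD_EUR:
        return 0.0
    return max(0.0, MIN_RATE - effective_rate) * profit_eur

# A group with 1bn EUR revenue and 200m EUR profit, taxed at 5% locally,
# would owe roughly 20m EUR more (10 points of shortfall on 200m profit):
print(top_up_tax(1_000_000_000, 200_000_000, 0.05))
# A group under the revenue threshold owes nothing extra:
print(top_up_tax(100_000_000, 50_000_000, 0.05))
```

This is why the OECD describes the agreement as limiting, rather than eliminating, tax competition: jurisdictions can still set lower headline rates, but the arithmetic removes the benefit below the 15 percent floor.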

The regulation would be the first foundational cross-border corporate tax rate regulatory change in over a century. Some are skeptical of President Joe Biden’s and Congress’s ability to ratify the agreement. The OECD hopes to sign a multilateral convention by 2022 and implement the reform by 2023.

The final agreement will be delivered to the G20 finance ministers meeting in Washington, D.C., on Wednesday, and will then go to the G20 Leaders’ Summit in Rome at the end of this month, according to an OECD press release.

Under former President Donald Trump, the United States was in a defensive posture, threatening tariffs if European nations, particularly France, decided to tax its big homegrown corporations.

French Finance Minister Bruno Le Maire said that the agreement, “opens the path to a true fiscal revolution.” US Treasury Secretary Janet Yellen said that the OECD has “decided to end the race to the bottom on corporate taxation,” referring to the practice of attracting large companies to headquarter in one’s country through purposefully incentivized lower tax rates.
