Facebook Faces A Skeptical Audience on Data Privacy and Terror Transparency

WASHINGTON, June 7, 2018 – A top Facebook policy official on Wednesday defended the social media giant’s new policies about safeguarding privacy and data transparency against the doubts of an audience at the New America Foundation, a think tank generally friendly to Facebook.

In recent months, Facebook has faced heavy scrutiny from Congress for potential data privacy violations, as well as its role in spreading disinformation during the 2016 elections.

Speaking at New America Foundation, Monika Bickert said that Facebook’s deals with numerous companies – including a recently disclosed data-sharing arrangement with phone manufacturer Huawei – are “completely different” from the deal struck with Cambridge Analytica.

That is because the data is stored on the Huawei phone held by the consumer, not on outside servers as in the Cambridge Analytica case, said Bickert, the company’s vice president of global policy management.

She stressed that, unlike in the freewheeling days of Facebook’s earlier years, new policies now govern the sharing of user data.

Finding the balance between data privacy and new research initiatives

However, new measures to protect users’ data privacy may prove difficult to balance with Facebook’s research initiatives aimed at strengthening its counterterrorism efforts.

One question Facebook is examining is how it can conduct research transparently without threatening user privacy.

In the past quarter, Facebook removed 1.9 million pieces of content for violating its policies against terrorist-related speech, she said.

Due to the sheer volume of posts, Facebook’s content reviewers do not look at every post that goes live. Rather than relying on users to flag content, the company relies on technical tools to do much of the work.

“We use technical tools to find content likely to violate policies,” she said.
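Bickert did not describe those tools in detail. As a rough illustration only, the machine-first, human-second workflow she describes might resemble the sketch below; the score_content classifier, the thresholds, and the queue names are hypothetical stand-ins, not Facebook’s actual systems.

```python
# Hypothetical sketch of a machine-first content triage pipeline.
# The classifier, thresholds, and queues are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    text: str

def score_content(post: Post) -> float:
    """Placeholder classifier: returns a 0.0-1.0 likelihood that the
    post violates policy. A real system would use trained models."""
    flagged_terms = {"propaganda", "recruitment"}
    hits = sum(term in post.text.lower() for term in flagged_terms)
    return min(1.0, hits / len(flagged_terms))

def triage(post: Post, review_queue: list, block_queue: list) -> None:
    """Route posts by score: near-certain violations are removed outright,
    borderline cases go to human reviewers, the rest are left alone."""
    score = score_content(post)
    if score >= 0.9:
        block_queue.append(post.post_id)
    elif score >= 0.5:
        review_queue.append(post.post_id)

review, blocked = [], []
triage(Post("p1", "join the recruitment propaganda channel"), review, blocked)
triage(Post("p2", "photos from my vacation"), review, blocked)
print(blocked, review)  # ['p1'] []
```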

Facebook and others use the hash-sharing database

One of these tools is a “hash” sharing database that Facebook launched in 2016 along with Microsoft, Twitter, and YouTube. This allows companies to share the “hash,” a unique digital fingerprint or signature, of terrorist images or videos with one another, so that social media websites can prevent the content from being uploaded.
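The article does not go into the mechanics, but the basic idea of hash sharing can be shown in a short sketch. The example below uses a plain SHA-256 digest as the “fingerprint” for simplicity, and the shared_hashes set stands in for the cross-company database; a production system would need far more robust matching for re-encoded or altered media.

```python
# Minimal sketch of hash-based blocking against a shared database.
# SHA-256 and the shared_hashes set are simplifying assumptions.

import hashlib

# Fingerprints of media already identified as terrorist content by any
# participating company.
shared_hashes = set()

def fingerprint(media_bytes: bytes) -> str:
    """Compute the 'hash' (digital fingerprint) of an image or video file."""
    return hashlib.sha256(media_bytes).hexdigest()

def report_content(media_bytes: bytes) -> None:
    """A company adds the fingerprint of removed content to the shared set."""
    shared_hashes.add(fingerprint(media_bytes))

def allow_upload(media_bytes: bytes) -> bool:
    """Another platform checks uploads against the shared fingerprints."""
    return fingerprint(media_bytes) not in shared_hashes

known_bad = b"\x00fake-video-bytes"
report_content(known_bad)          # one platform shares the fingerprint
print(allow_upload(known_bad))     # False: the upload is blocked elsewhere
print(allow_upload(b"cat photo"))  # True: unknown content passes through
```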

But it is much more difficult to stop hate speech on the platform, she said, because such speech is heavily dependent on context.

While the social media giant faces criticism for its potentially monopolistic position, there may be advantages to Facebook’s scale and authority in the industry. “It cannot be a one company approach,” said Bickert, responding to concerns about the spread of terrorist propaganda on social media.

The benefits of bigness in rapidly identifying and removing terrorist propaganda

With ISIS, she said, Facebook observed that the better big companies such as Facebook become at rapidly finding and taking down terrorist propaganda, the more those malicious users move toward and target smaller social media companies, which may lack the technology and manpower to combat those groups.

Companies must work together on counterterrorism. “The sophistication and coordination of the terror groups really brought that lesson home,” she said.

More than 99 percent of the terrorist propaganda Facebook removes is flagged by technical tools, Bickert said.

Changes in the disclosure and display of political ads

Facebook has also recently launched new policies governing how political ads are displayed on the platform. Political ads will be clearly labeled with information about the ad’s sponsor. Viewers can also click an icon to find more information, such as the campaign’s budget for the ad and statistics about other people who have viewed it.

When asked about how Facebook intends to deal with the disinformation that may increase during the 2018 midterm elections, Bickert said, “We are focused on midterm elections, but there are so many elections around the world where this is a problem.”

In the past German and French elections, she said, Facebook focused on removing fake accounts beforehand in order to prevent those accounts from spreading disinformation.

(Photo of Monika Bickert at SXSW in 2017 by nrkbeta used with permission.)