EU’s Digital Services Act May Be a Model for the United States

The Digital Services Act imposes transparency requirements and other accountability measures for tech platforms.

Photo of Mathias Vermeulen, public policy director at the AWO Agency, obtained from Flickr.

September 16, 2022 – The European Union’s Digital Services Act, particularly its data-sharing requirements, may become the model for future American tech policy, said Mathias Vermeulen, public policy director at the AWO Agency, at a German Marshall Fund web panel Monday.

Now in the final stages of becoming law, the DSA aims to create a safer internet by introducing transparency requirements and other accountability measures for covered platforms. Of note to the German Marshall Fund panelists was the DSA’s provision that, when cleared by regulators, “very large online platforms” – e.g., Facebook and Twitter – must provide data to third-party researchers for the purpose of ensuring DSA compliance.

In addition, the EU’s voluntary Code of Practice on Disinformation was unveiled in June, requiring opted-in platforms to combat disinformation by introducing bot-elimination schemes, demonetizing sources of alleged disinformation, and labeling political advertisements, among other measures. Signatories of the Code of Practice – including American tech giants Google Search, LinkedIn, Meta, Microsoft Bing, and Twitter – also agreed to proactively share data with researchers.

Vermeulen said that he expects the EU will soon draft new legislation to address the privacy concerns raised by the Digital Services Act’s data-sharing requirements.

The risks of large-scale data sharing

To protect user privacy, the DSA requires that data handed over to researchers be anonymized. However, many experts believe that “anonymous” data can often be traced back to its source. Even the EU’s recommendations on data-anonymization best practices acknowledge the inherent privacy risks:

“Data controllers should consider that an anonymised dataset can still present residual risks to data subjects. Indeed, on the one hand, anonymisation and re-identification are active fields of research and new discoveries are regularly published, and on the other hand even anonymised data, like statistics, may be used to enrich existing profiles of individuals, thus creating new data protection issues.”

An essay from the Brookings Institution – generally supportive of the DSA’s data-sharing provisions – argues that many private researchers lack the experience necessary to securely store sensitive data, and recommends that the EU Commission establish or subsidize secure centralized databases.
