European Lessons in Self-Experimentation: From the GDPR to European Platform Regulation

June 20, 2022

This essay is part of The Four Domains of Global Platform Governance, an essay series that examines platform governance from four distinct policy angles: content, data, competition and infrastructure.

One of the most pointed commentaries on Brexit the author has read was a comparison to medical self-experimentation. Significant progress in medicine has come from pioneers choosing themselves as subjects, and some important discoveries have been made this way.1 The United Kingdom, with its Brexit strategy to leave the European Union, is doing us a similar favour, wrote Simon Kuper in 2017, as “advanced societies rarely do anything so reckless, which is why the Brexit experiment is so valuable” (Kuper 2017).

Considering the stakes and challenges in platform regulation and technology law and policy more generally, similar self-experimentation in this area may be needed. Although the European Union may not like being compared to a reckless self-experimenter for its regulatory appetite in this area, it may be worth considering its most recent endeavours precisely in this light.

The EU project2 on platform regulation builds on the successes and failures of EU data protection law, centrally the General Data Protection Regulation (GDPR).3 The GDPR was a major demonstration of the possibility of setting global standards and striving for the so-called Brussels effect (Bradford 2020; Mahieu et al. 2021). Among the main innovations in the GDPR was an emphasis on accountability and risk management, sidestepping more classical legal approaches rooted in European legal traditions in favour of EU harmonization and alignment with emerging corporate compliance practices.

With the GDPR, the European Union updated its data protection rules in a heavily politicized process in the mid-2010s. Instead of paving the way for more streamlined global personal data flows and service ecosystems, the updated rules, due to a combination of factors including the Snowden revelations and a US-dominated tech industry, are strengthened in several respects: the rights of individuals in relation to the processing of their data, the obligations on market players when processing personal data and the enforcement powers of independent data protection authorities. The experiment Europe is offering the world is as follows: Considering the pervasive individualized information flows generated by current technological infrastructures and the reliance on these infrastructures for innovations in commerce and public administration, is regulation that favours privacy still possible, and what would be needed to uphold such rules?

Even if the glass is half-empty, perhaps we should take a sip first and appreciate what the GDPR has to offer. First, the GDPR doubles down on the central consideration of fundamental rights in relation to data-driven power dynamics. On a related note, the GDPR requires that as soon as personal data is being collected (van Hoboken 2016), those in charge (the controllers) must have good reasons for doing so, and for any analysis and use of the relevant personal data. A variety of data subject rights gives individuals the possibility to engage controllers directly with requests for access, correction, deletion, portability and (some) explanation. Access to justice is broadly supported, at least in theory: individual and collective access to independent enforcement authorities through complaint procedures4 as well as through the courts (Mildebrath 2020) is part of the design (La Quadrature du Net 2021; noyb 2020; Ryan 2021).

Although the resulting enforcement culture is still developing and it is too early to have a final say, it is apparent that the European Union’s data protection project is running into some walls. A first significant problem is the GDPR’s wide (Purtova 2018) applicability5 to the processing of personal data in all but a few areas, in combination with its relatively abstract rules. The result is that much of the privacy work done by those wanting or needing to comply with European rules amounts to the interpretation and concretization of such general rules in a specific (typically dynamic) context. Many smaller organizations, or organizations without advanced law and technology expertise, lack the skills to do this in a meaningful and creative way. Many of the organizations that do have the skills tend to do this in ways that are not fully in line (Waldman 2021) with underlying principles (Solove 2021), using a lack of legal certainty around basic concepts as a shield against robust enforcement. Data protection authorities (DPAs) that should be leading on enforcement, the Irish DPA (Bertuzzi 2021a) in particular, lack the incentives for strict enforcement. As a result of mounting complexity and a dearth of enforcement capacity and leadership in the DPA community, enforcement of the GDPR is starting to show signs of a significant dumbing down in focus. While some interesting cases are proceeding through the courts, DPAs are fining companies (Walter 2021) for not having privacy policies in local languages.

The European Union appears determined to learn from some of its past successes and failures in its materializing ambitions for platform regulation. Setting a global standard is a main political driver for the process of regulating content moderation practices in the Digital Services Act (DSA). And the protection of fundamental rights (freedom of expression, privacy, non-discrimination) is again one of the main anchor points for the regulation. The enforcement chapter in the DSA will likely steer clear of the predictable incentives of national enforcement authorities in dominant platform companies’ favourite EU nations. For these platform companies, enforcement is likely to be centralized at the EU level (Bertuzzi 2021b), paving the way for more centralized enforcement in relation to big tech in other areas. Moreover, the significant political momentum (Zuboff 2019) to address the excesses of advertisement-based business models, which have continued partly as a result of a lack of GDPR enforcement, is translating into proposals6 for bans on personalized advertising and restrictions on the use of user data for ad targeting (Armitage, Ryan and Buri 2021).

But setting a standard is one thing; enforcing it meaningfully is another. The DSA puts a lot of its eggs in the transparency basket. As information asymmetries in relation to online platforms are staggering, well-designed transparency and accountability mechanisms are easy to support. But who will make effective use of the explosion of transparency reporting and notices on content moderation practices? The DSA combines continued (bad?) faith in consumer choice with independent regulatory oversight and a special access regime (Dutkiewicz 2021) to platform data for researchers and, perhaps, civil society and journalists. Here it seems to underestimate the challenge of ensuring meaningful accountability in practice. First, there is the challenge of sufficiently funding and staffing public enforcement authorities in this new, complex area of law. Second, the EU legislature appears to assume there is an army of well-funded journalists,7 civil society organizations8 (York 2022) and academic researchers (Rieder and Hofmann 2020; Armitage, Ryan and Buri 2021) standing ready to make effective use of the information that will become available. Without significant and structural public funding (Cavazzini 2021) to accompany the DSA’s framework, EU regulation may remain an ambitious experiment for the world to watch, rather than one to profit from or aspire to.

  1. See https://en.wikipedia.org/wiki/Self-experimentation_in_medicine.
  2. See https://digital-strategy.ec.europa.eu/en/policies/digital-services-act-package.
  3. See EC, Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation), [2016] OJ, L 119, online: <https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:02016R0679-20160504&from=EN>.
  4. See https://theprivacycollective.eu/en/.
  5. See Reference for a preliminary ruling — Protection of natural persons with regard to the processing of personal data — Regulation (EU) 2016/679 (2021), Opinion of Advocate General Bobek, No C-245/20, online: <https://curia.europa.eu/juris/document/document.jsf?text=&docid=247105&pageIndex=0&doclang=EN&mode=req&dir=&occ=first&part=1&cid=5763643>.
  6. See https://trackingfreeads.eu.
  7. See https://themarkup.org.
  8. See https://algorithmwatch.org/en/.

Works Cited

Armitage, Catherine, Johnny Ryan and Ilaria Buri. 2021. “Online advertising: These three policy ideas could stop tech amplifying hate.” DSA Observatory, July 5. https://dsa-observatory.eu/2021/07/05/online-advertising-these-three-policy-ideas-could-stop-tech-amplifying-hate/.

Bertuzzi, Luca. 2021a. “MEPs call for infringement procedure against Ireland.” Euractiv, May 20. www.euractiv.com/section/data-protection/news/european-parliament-calls-for-infringement-procedure-against-ireland/.

———. 2021b. “DSA: enforcement for very large online platforms moves toward EU Commission.” Euractiv, November 3. www.euractiv.com/section/digital/news/dsa-enforcement-for-very-large-online-platforms-moves-toward-eu-commission/.

Bradford, Anu. 2020. The Brussels Effect: How the European Union Rules the World. New York, NY: Oxford University Press.

Cavazzini, Anna. 2021. “Internal Market and Consumer Protection (IMCO).” Background paper presented at “The Digital Services Act and the Digital Markets Act: A forward-looking and consumer-centred perspective” workshop, May 26. www.europarl.europa.eu/cmsdata/234761/21-05-19%20Background%20note%20REV%20final.pdf.

Dutkiewicz, Lidia. 2021. “From the DSA to Media Data Space: the possible solutions for the access to platforms’ data to tackle disinformation.” European Law Blog, October 19. https://europeanlawblog.eu/2021/10/19/from-the-dsa-to-media-data-space-the-possible-solutions-for-the-access-to-platforms-data-to-tackle-disinformation/.

Kuper, Simon. 2017. “Brexit is Britain’s gift to the world.” Financial Times, September 21. www.ft.com/content/a6b1f948-9d8e-11e7-9a86-4d5a475ba4c5.

La Quadrature du Net. 2021. “Amazon fined 746 million euros following our collective legal action.” La Quadrature du Net, July 30. www.laquadrature.net/en/2021/07/30/amazon-fined-746-million-euros-following-our-collective-legal-action/.

Mahieu, René, Hadi Asghari, Christopher Parsons, Joris van Hoboken, Masashi Crete-Nishihata, Andrew Hilts and Siena Anstis. 2021. “Measuring the Brussels Effect through Access Requests: Has the European General Data Protection Regulation Influenced the Data Protection Rights of Canadian Citizens?” Journal of Information Policy 11: 301–49. www.jstor.org/stable/10.5325/jinfopoli.11.2021.0301#metadata_info_tab_contents.

Mildebrath, Hendrik. 2020. “The CJEU judgment in the Schrems II case.” European Parliamentary Research Service, September. www.europarl.europa.eu/RegData/etudes/ATAG/2020/652073/EPRS_ATA(2020)652073_EN.pdf.

noyb. 2020. “First decision on noyb’s streaming complaints.” noyb, September 16. https://noyb.eu/en/first-decision-noybs-streaming-complaints.

Purtova, Nadezhda. 2018. “The law of everything. Broad concept of personal data and future of EU data protection law.” Law, Innovation and Technology 10 (1): 40–81.

Rieder, Bernhard and Jeanette Hofmann. 2020. “Towards platform observability.” Internet Policy Review: Journal on Internet Regulation 9 (4): 1–28. https://policyreview.info/pdf/policyreview-2020-4-1535.pdf.

Ryan, Johnny. 2021. “Big news. Google and the entire tracking industry relies on IAB Europe’s consent system, which will now be found to be illegal.” (Twitter thread). Twitter, November 5, 10:32 a.m. https://twitter.com/johnnyryan/status/1456630527721680902.

Solove, Daniel. 2021. “A Provocative Critique of Privacy Law: An Interview with Ari Waldman.” Privacy & Security Blog, November 7. https://teachprivacy.com/a-provocative-critique-of-privacy-law-an-interview-with-ari-waldman/.

van Hoboken, Joris. 2016. “From Collection to Use in Privacy Regulation? A Forward-Looking Comparison of European and US Frameworks for Personal Data Processing.” In Exploring the Boundaries of Big Data, edited by Bart van der Sloot, Dennis Broeders and Erik Schrijvers, 231–52. Amsterdam, the Netherlands: Amsterdam University Press.

Waldman, Ari Ezra. 2021. Industry Unbound: The Inside Story of Privacy, Data, and Corporate Power. Cambridge, UK: Cambridge University Press.

Walter, Andre. 2021. “English privacy notice leads to Dutch data protection fine.” Pinsent Masons, July 28. www.pinsentmasons.com/out-law/news/english-privacy-notice-leads-to-dutch-data-protection-fine.

York, Jillian C. 2022. Silicon Values: The Future of Free Speech Under Surveillance Capitalism. London, UK: Verso.

Zuboff, Shoshana. 2019. The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. New York, NY: PublicAffairs.

The opinions expressed in this article/multimedia are those of the author(s) and do not necessarily reflect the views of CIGI or its Board of Directors.