One of the most pointed commentaries on Brexit the author has read compared it to medical self-experimentation. Significant progress in medicine has come from pioneers experimenting on themselves, and some important discoveries have been made that way.1 The United Kingdom, with its Brexit strategy to leave the European Union, is doing us a similar favour, wrote Simon Kuper in 2017, as “advanced societies rarely do anything so reckless, which is why the Brexit experiment is so valuable” (Kuper 2017).
Considering the stakes and challenges in platform regulation and technology law and policy more generally, similar self-experimentation in this area may be needed. Although the European Union may not like being compared to a reckless self-experimenter for its regulatory appetite in this area, it may be worth considering its most recent endeavours precisely in this light.
The EU project2 on platform regulation is building on the successes and failures of EU data protection law, centrally the General Data Protection Regulation (GDPR).3 The GDPR was a major example of the European Union’s capacity to set global standards and strive for the so-called Brussels effect (Bradford 2020; Mahieu et al. 2021). Among the main innovations in the GDPR was an emphasis on accountability and risk management, sidestepping more classical legal approaches rooted in European legal traditions in favour of EU harmonization and alignment with emerging corporate compliance practices.
With the GDPR, the European Union updated its data protection rules in a heavily politicized process in the mid-2010s. Instead of paving the way for more streamlined global personal data flows and service ecosystems, the updated rules, shaped by a combination of factors including the Snowden revelations and a US-dominated tech industry, were strengthened in several respects: the rights of individuals in relation to the processing of their data, the obligations on market players when processing personal data and the enforcement powers of independent data protection authorities. The experiment Europe is offering the world is as follows: considering the pervasive individualized information flows generated by current technological infrastructures, and the reliance on these infrastructures for innovations in commerce and public administration, is regulation that favours privacy still possible, and what would be needed to uphold such rules?
Although the resulting enforcement culture is still developing and it is too early to have a final say, it is apparent that the European Union’s data protection project is running into some walls.
Even if the glass is half-empty, perhaps we should take a sip first and appreciate what the GDPR has to offer. First, the GDPR doubles down on the central consideration of fundamental rights in relation to data-driven power dynamics. Relatedly, the GDPR requires that as soon as personal data is collected (van Hoboken 2016), those in charge (the controllers) have good reasons for doing so, and for any analysis and use of the relevant personal data. A variety of data subject rights gives individuals the possibility to engage controllers directly with requests for access, correction, deletion, portability and (some) explanation. Access to justice is broadly supported, at least in theory: individual and collective access to independent enforcement authorities through complaint procedures,4 as well as through the courts (Mildebrath 2020), is part of the design (La Quadrature du Net 2021; noyb 2020; Ryan 2021).
Still, the data protection project is running into walls. A first significant problem is the GDPR’s wide (Purtova 2018) applicability5 to the processing of personal data in all but a few areas, in combination with its relatively abstract rules. As a result, much of the privacy work done by those wanting or needing to comply with European rules amounts to interpreting and concretizing such general rules in a specific (typically dynamic) context. Many smaller organizations, or organizations without advanced law and technology expertise, lack the skills to do this in a meaningful and creative way. Many of the organizations that do have the skills tend to apply them in ways that are not fully in line (Waldman 2021) with underlying principles (Solove 2021), using a lack of legal certainty around basic concepts as a shield against robust enforcement. Data protection authorities (DPAs) that should be leading on enforcement, the Irish Data Protection Commission (Bertuzzi 2021a) in particular, lack the incentives for strict enforcement. As a result of mounting complexity and a dearth of enforcement capacity and leadership in the DPA community, GDPR enforcement is starting to show signs of a significant dumbing down in focus. Whereas some interesting cases are proceeding through the courts, DPAs are fining companies (Walter 2021) for not having privacy policies in local languages.
The European Union appears determined to learn from some of its past successes and failures in its materializing ambitions for platform regulation. Setting a global standard is a main political driver for the process of regulating content moderation practices in the Digital Services Act (DSA). And the protection of fundamental rights (freedom of expression, privacy, non-discrimination) is again one of the main anchor points for the regulation. The enforcement chapter in the DSA will likely steer clear of the predictable incentives of national enforcement authorities in dominant platform companies’ favourite EU nations. For these platform companies, enforcement is likely to be centralized at the EU level (Bertuzzi 2021b), paving the way for more centralized enforcement in relation to big tech in other areas. Moreover, the significant political momentum (Zuboff 2019) to address the excesses of advertisement-based business models, which have continued partly as a result of a lack of GDPR enforcement, is translating into proposals6 for bans on personalized advertising and restrictions on the use of user data for ad targeting (Armitage, Ryan and Buri 2021).
But setting a standard is one thing; enforcing it meaningfully is another. The DSA puts a lot of its eggs in the transparency basket. Because information asymmetries in relation to online platforms are staggering, well-designed transparency and accountability mechanisms are easy to support. But who will make effective use of the explosion of transparency reporting and notices on content moderation practices? The DSA combines continued (bad?) faith in consumer choice with independent regulatory oversight and a special access regime (Dutkiewicz 2021) to data by researchers and, perhaps, civil society and journalists. Here it seems to underestimate the challenge of ensuring meaningful accountability in practice. First, there is the challenge of sufficiently funding and staffing public enforcement authorities in this new and complex area of law. Second, the EU legislature appears to assume there is an army of well-funded journalists,7 civil society8 (York 2022) and academic researchers (Rieder and Hofmann 2020; Armitage, Ryan and Buri 2021) standing ready to make effective use of the information that will become available. Without significant and structural public funding (Cavazzini 2021) to accompany the DSA’s framework, EU regulation may remain an ambitious experiment for the world to watch, rather than one to profit from or aspire to.