What Is Stalling Better Data Governance?

There is a flurry of action around developing data governance frameworks, but the approach is dangerously scattered

Published: June 7, 2019

Author: Sean Martin McDonald

Data governance is quickly becoming a global policy priority, and rightly so. Established interests and powerful companies are framing and reframing the public debate over data governance at a dizzying pace, and to an extent, defining the models used to govern data. While it’s important that meaningful action is (finally) underway for digital regulation, the approach is scattered, and the scope is narrow. A more cohesive understanding of — and approach to — effective, ethical and international data governance is needed, and fast. Without it, we risk digitally replicating the power asymmetries at the root of many of today's governance challenges. 

At present, it’s hard to imagine there's enough consensus on the best-fit governance mechanism, or even the definition of data, to base global rules upon. There are well-placed data advocates calling for a universal declaration of digital human rights, a digital Geneva convention, a digital stability board, a digital bill of rights, and, by implication, a digital international atomic energy agency, among others. There are also dozens of competing analogies for data — oil, water, plutonium, labour, currency or the world’s most valuable commodity, to name a few.

Tech companies are feeling the cost of that chaos. Amid ongoing investigations into a number of platforms' operations, they are spending record amounts lobbying for their interests and are staffing up on policy expertise. Just this week, The Washington Post reported that the US Justice Department is initiating antitrust investigations into Apple, Google, Amazon and Facebook; the news caused the technology giants’ combined stocks to drop by US$133 billion in one day. The European Union has already handed out significant fines to big tech platforms, with more promised in 2019. Technology companies have become so desperate for continuity in regulatory regimes that they are advocating for regulatory frameworks — perhaps a self-serving action, but action nonetheless.

Lost in the rush to act in an unregulated space, however, are public participation and informed debate about the best way forward. Without some legitimate pathway to public participation, the debate around data governance design will remain contested and controlled by entrenched political and market interests.

Defining Success for Data Governance

In the race for dominance and advantage in the politics around data, markets and infrastructure, we are at the beginning of defining digital political science. Governments are actively developing a digitalpolitik — the realpolitik of the digital world. Building a discipline around intentional data governance design is a vital, missing link between the international order that exists and whatever we build to govern data and digital platforms.

In these uncharted waters, governments and lawmakers don’t have effective ways to figure out which data governance models are “good.” And, as any political scientist or sociologist or lawyer can tell you, collectively defining good is a non-trivial endeavour.

For all the energy and investment channelled into discussing the need for data governance rules, there is far less going into the development of an evidence base to support the selection or adaptation of a model that will work. Research in this area has been notoriously hard because platforms are reluctant to share data, and because there are significant legal issues involved, including data protection, privacy and the potential for liability.

Thankfully, there are signs that platform companies are taking an interest — both because the liabilities for harmful governance decisions are increasing, and because data governance for the public good is quickly becoming an industry in itself. At a structural level, platforms are beginning to develop partnerships with charitable foundations, academic institutions and international organizations to manage data sharing. While these structures represent progress, they are usually heavily influenced by the platform donating the data, and aren’t particularly focused on the governance of those platforms.

The Challenge of Experimenting with Data Governance Models

Practical experimentation with data governance models is difficult for a number of reasons. First, the platforms that could benefit from improved regulation operate at a global level, while digital governance frameworks are typically developed at the national level. Second, corporate processes for data management, privacy or security, for example, can’t be tested very well without being deployed to the public. Platform companies routinely experiment with users and their data — often without being noticed — and yet aren’t subject to the types of legal restrictions we place on human experimentation. As ethicists Kate Crawford and Jacob Metcalf wrote in “Where Are the Human Subjects in Big Data Research?”, big data experimentation has enormous impacts, but we haven’t built any of the same separations between experimentation and market access, or the kinds of protections usually afforded to research subjects. While there are good, emerging models across a range of disciplines, major policy questions remain when it comes to conducting ethical data governance research.

As a result, there aren’t a huge number of case studies that empirically demonstrate the impact of different data governance models. While most recognize the value of data governance from a public equity and representation perspective, there is no strong evidence base to prove that governance, in itself, mitigates platforms’ complex social problems — like discrimination, misinformation and surveillance. That’s especially important now because a number of powerful public and commercial policy interests are suggesting that data governance will achieve social policy outcomes. Ideally, in order to validate any of those claims, there would be an independent and applied research arm to produce case studies and comparative analysis, subject to ethical obligations and appropriate oversight. Instead, the research environment largely mirrors the reactive nature of the digital policy environment, focusing on punishing companies for easy-to-describe harms instead of building power structures that enable people to take meaningful action. There is impressive progress from content moderation scholars like Daphne Keller, Tarleton Gillespie and Eric Goldman on comparative approaches and applied platform governance.

One promising sign of the maturity of the debate is that the analogies are moving from declarations and treaties to multi-stakeholder bodies — and it’s no mistake that the first set of proposals focuses on institutional models that manage collective harms, investments and shared stability. While these are incredibly important policy priorities, they are also reactive and focus on the priorities of those with the capacity and platform to define data governance. They’re understandably driven by market realities, escalating security concerns and ballooning digital power asymmetries; the priorities of vulnerable communities, however, are too rarely considered. In other words, data governance proposals themselves are explicitly competitive, political and complex, and are largely moving forward without the meaningful participation of the people they impact.

The Need for Digital Rights Scholarship

Significant investment is flowing into ethical technology non-profits, relevant university departments and industry initiatives. Digitization has led to a resurgence in fields like philosophy, corporate governance and fiduciary law, but most digital scholarship is based on analogies, pre-existing frameworks and well-researched opinions.

The American scholar Kate Klonick recently wrote in Lawfare that we need to raise the bar on digital rights and governance scholarship, from anecdote and opinion to empirical research. As Klonick highlights, there are promising early signs; journalists, such as Julia Angwin, and scholars, such as Danielle Citron and Mary Anne Franks, have produced great legal surveys. In a more applied sense, the United Kingdom’s Information Commissioner’s Office has created a regulatory sandbox — an experimental, safe space to test applied approaches. Similarly, the work required to implement the European Union’s General Data Protection Regulation has created a wide range of market approaches to implementing data governance. Even more promising, the 2009 Nobel Prize in Economics celebrated Elinor Ostrom's empirical approach to researching governance models, starting with community-led commons. The politics of digital spaces may need to become a science, one that builds the institutional architecture necessary to ensure our values survive its practice.

There are promising signs, from the recent update of the US government’s Common Rule to the growing movement around humanitarian data ethics and preventing the digital exploitation of disaster victims. Both examples also illustrate, however, the challenge of extrapolating regulatory approaches across industries with different professional management structures. Medicine, for example, has some of the most developed professional institutions in the world, with special certification, ethical review processes, fiduciary responsibilities, governmental sponsorship and oversight, data privacy and portability laws, and dispute resolution mechanisms, among others. It is a significant challenge to adapt the same level of intellectual and procedural rigour to fields with less institutional machinery.

Conclusion

The limits of experimentation and early-stage scholarship are certainly stalling the development of effective data governance models, but nothing is a greater threat than politics. Already, data is understood to be one of the highest-stakes arenas for political contests. Data protection, security and adequacy measures shape access to the most valuable markets in the world — and disagreements over those systems are proving to be extraordinarily damaging to the global economy.

While there’s an inevitable element of realpolitik in the making of any international order, let alone one intended to be as far-reaching or operational as required by data, we shouldn’t squander thousands of years of political progress by letting force or wealth drive the process. The best way to avoid this is to invest in apolitical, multilateral data governance research, with the aim of building as objective a science as possible.

Data governance, like most forms of modern governance, will need checks and balances — and these are best built as a collective. That’s not to suggest that political science has removed the realpolitik from government; rather, political science has created an independent view into the motivations, tactics and infrastructure that powerful interests use to control governments. Without a field, and applied experimentation, around data governance, it will take decades — if not centuries — to define, agree on and implement public interest systems. Data governance is already happening; it's just being done behind closed doors in the interest of a small minority of people. We are on the cusp of transitioning that power, which will be complicated and political on its own. Rather than relearn the lessons of the first few millennia of governance, data governance advocates and policymakers should create more than rules — they should build the foundations for digital political science.

The opinions expressed in this article/multimedia are those of the author(s) and do not necessarily reflect the views of CIGI or its Board of Directors.

About the Author

Sean Martin McDonald is the co-founder of Digital Public, which builds legal trusts to protect and govern digital assets.