Key Points
- Canada is in a governance vacuum regarding the management of its data and digital infrastructure.
- Policy to proactively manage data and technology is urgently needed.
- This policy could be imagined as a set of three planks: a national data and digital infrastructure policy; the self-regulation of software engineers; and procurement reform for government technology.
- Without these types of reforms, Canada is vulnerable and in danger of democratic erosion and the commercialization of its public service.
In the year 2000, Lawrence Lessig, a lawyer and technologist, wrote an essay entitled “Code Is Law.” In it, he warned of the governance vacuum we find ourselves in today — a place where technology has hurtled ahead of governance, making software code created for commercial ends part of our de facto law (Lessig 2000).
This rapid technological development of the internet era has created immense vulnerability within our Western democracy. To protect our democracy, and to ethically advance technology’s vast potential, we need governance that both embraces the opportunities of this moment and manages a mounting number of technology-related challenges.
Technology is fundamentally shifting the way our society functions. The time is now to fill the gaps in existing policies and laws on the management of technology — in particular, in the areas of digital infrastructure and data.
Several issues related to data capture and usage have informed public debate of late, including the erosion of privacy, state surveillance, political interference, the decline of journalism, network effects and big tech monopolies. There are increased calls for new thinking on consumer protection and updated antitrust laws. But there is an emerging phenomenon in governance that is of a different magnitude in terms of impact. In the absence of policy and law to manage data and digital infrastructure, tech firms are building themselves up as parallel government structures.
A new range of products and services is coming to market — solutions to support or supplant government operations with analytics, machine learning and artificial intelligence (AI). Data is the main ingredient in all of these products.
This trend necessitates a re-examination of how the public and private sectors function together in a liberal democracy, and a proactive evolution in public service delivery. Government technology must support democratically informed policies and procedures, not override them. This work is starting late. As such, flexible policy is required that can strike the right balance of speed and rigour.
At its core, a digital infrastructure and data policy must define four things: who can own data (personal, government, aggregate, environmental and more); how it can be collected; who can use it; and under what terms. This framework must be organized nationally and developed at both the provincial and municipal levels. The rise of smart city technology, coupled with the Internet of Things coming online, is creating urgency. But the issue extends well beyond smart city technology. Every industry sector and public service is affected, as are fundamentals such as labour and commerce. All of them are rapidly changing.
There are two additional policy measures that can be explored to augment a digital infrastructure and data policy: the self-regulation of software engineering and procurement reform for government technology purchases. The three planks of this policy suite can begin to manage the change Canada is facing. Policy must be created to protect our government and democracy from erosion and the commercialization of the public service while enabling a thriving innovation ecosystem, one that is intentional about maximizing public good.
Plank One: A Policy Approach to Manage Digital Public Infrastructure and Data
Within this policy framework, one basic tenet to consider regarding data ownership relates to infrastructure. Hardware in public spaces that collects environmental or human data, such as sensors, must be either owned by, or wholly accessible to, government. Hardware that collects this data must be understood as critical state infrastructure.
According to Kurtis McBride, CEO of Miovision, it is important to get the architecture right when talking about public digital infrastructure — it must be open (McBride, quoted in Pender 2017). This approach can add immense capital value to the public sector’s ledger rather than handing it over to private markets.
Building on an open architecture that is owned by government, the ways in which data is collected and shared can be debated and refined. In the case of personal data, “Residents can co-design the terms and conditions for the use of their data,” explains Pamela Robinson, professor of urban planning at Ryerson University (Robinson, quoted in Wylie 2018).
This conversation will include important questions of whether personal data should be collected at all in certain scenarios. Not collecting personal data is a policy option, too. This will also open up a much-needed public discussion about revisions and updates required of both the Privacy Act and the Personal Information Protection and Electronic Documents Act, in particular around the notion of consent.
Data ownership can and must sit with the government and its people. As global adoption of open data policies continues, there will be a growing set of case studies to help define how much of our data should be made open, with a default toward openness, and an evolving set of requirements for cases where data should not be published. Proceeding with anything less than this approach is the equivalent of enabling private ownership of critical government infrastructure, civic intellectual property and our civic census. The arguments for openness related to digital infrastructure and data are numerous.
As Gavin Starks (2016), entrepreneur and open data pioneer, has long argued, a commitment to the openness of data, both by the government and the private sector, is a way to level the playing field for many data users. It is a way to unlock value and capacity for innovation, in particular in the face of big technology, AI and monopolistic data powers.
The current approach taken by governments as they slowly move toward “open by default” data publishing stands in severe contrast to the ever-increasing market privatization of raw data — data that is captured, held and sold by the private sector.
Digital infrastructure and data policy can level the terms that define the data arena, including which types of high-value core raw data must be public. Raw trip data held by private transportation companies is one example that comes to mind. These businesses exist through the use of roads, and they significantly affect the delivery of public transportation services. The rationale is there to require that their raw trip data be made publicly available to support planning and service delivery by both private and public sector actors.
It is not economically sensible to allow high-value unprocessed data such as this to be locked away in proprietary models. According to Starks (2016), data is not the new oil: it is not scarce, it can be duplicated at little to no cost, and it increases in value as it is linked together. These are all special qualities that spur innovation.
Intentional management of data to preserve public ownership and access will support the creation of data with high public value. Without it, we risk veering toward the privatization of policy development and public service delivery through the purchase of proprietary products and services that the government, as consumer, neither understands nor can build itself.
Consider transportation planning again. Using a mix of private and public data as training data, tech companies are able to offer transportation planning services and modelling products that governments cannot match, and few firms can compete with. It would be counterproductive to public service delivery to reject the best product on the market because it is not government-produced. The first related problem, and downstream outcome, is vendor lock-in. The second is government purchase of proprietary products that are closed in terms of their methodology and handling of data.
Creating policy for openness in algorithms, as New York City has begun to do, is one option for management, although the approach is rife with challenges. Beyond algorithms is AI, where the rationale for decision making can become incomprehensible. As these types of products expand and are used as inputs to public service planning and delivery, vendor dependency, product opaqueness and a range of unknown social impacts, including the future role, size and shape of the public service, loom large. These issues will continue to emerge in every public service delivery context, from health care to housing and from education to criminal justice.
This is an opportunity to create policies and laws to support broad open data sets that would enable more competition and more transparent products. In addition, governments would be able to create their own comparable products and services. Part of this work will be to define the granularity and nature of data that cannot be held privately because it is fact, not property.
Plank Two: From Civil Engineering to Software Engineering
As Lessig (2000) wrote, there is power that sits with the people who write software code, code that uses data and makes rule-based decisions. Historically, when individuals understood their professional impact on public safety, they found ways to attach a duty-of-care principle to their work. Well-known examples include the Hippocratic oath in medicine and the self-regulation of civil engineering. Given the implications of applying data and decision-making software to public service delivery, training in the humanities — ethics, anthropology and sociology, among others — should be required for individuals to work on certain types of software.
Rather than tending toward the historical norm of self-regulation in engineering, as Ian Bogost (2015) writes in The Atlantic, “software development has become institutionally hermetic. And that’s the opposite of what ‘engineering’ ought to mean: a collaboration with the world, rather than a separate domain bent on overtaking it.”
Plank Three: Procurement Reform — Buy versus Build and Other Considerations
The final plank of this proposed policy trifecta is procurement reform for government technology. As the workforce evolves and matures, there will be numerous digital natives joining the public service. Space should be protected for current and future public service technologists to design and develop the next generation of public sector tech, in particular in critical areas of government operations.
This will involve revisiting buy versus build conversations. Some solutions should be purchased, others should be built in-house, and some cases will call for a mix of the two. Different licensing agreements and open source software should be explored to enable efficiencies of scale and shared code among governments.
There has been severe underinvestment in technical capacity within government over the past two decades. Government tech debt and the state of legacy information technology are troubling. Beyond the varied impacts of not building some tech solutions in-house, a lack of technology capacity is also impeding the government’s ability to properly manage technology procurement as a customer.
The new software products for sale in every public sector vertical market will increasingly leverage automated decision making, machine learning and AI. As such, this is the right time to put a moratorium on the purchase of non-critical software related to public service delivery.
Borrowing from bioethics, consider the idea of primum non nocere (first, do no harm). The idea that sometimes doing nothing is better or safer than doing something is appropriate for our time. The stakes are too high to be making purchasing decisions without thoughtful guidance.
A related theme to be considered in this work is the growing, troubling and unchecked global consensus around the merits of technocratic governance and data-driven decision making, an approach that informs the creation of government software.
This consensus threatens to normalize an efficiency obsession and entrench governance that dilutes and misunderstands the power of political decision making. Some processes and policies are inherently inefficient. Values-based leadership and decision making must be protected.
Regulating to Safeguard Democracy
The regulation of data and digital infrastructure will not stall economic development and growth. On the contrary, it will enable them. By using regulation to manage social and democratic risks and inadvertent outcomes, the private sector can participate in the data and digital infrastructure economy in an organized and productive way. Regulation saves businesses from being caught up in unintentional consumer protection disasters and allows research and development to be focused in a targeted way, bringing the full power of innovation to bear on a broad range of public sector needs.
End Game: Uphold Democracy and Its Institutions or Drift to Code as Law
The tone of late has been one of awakening — a cultural realization that technology may be going too far, too fast, and that we are unclear on how to address it. It is critical to understand this current context and act fast to address the governance void. As Starks has called for (Gorynski 2017), we need public debate about the social contract between residents and the state, between residents and companies, and between companies and the state.
Consultation among the government, the citizenry and the private sector is key. The government answers to its people through legal mechanisms in a way that corporations do not, making it the preferred steward of data. This is not to downplay the dangers of the state’s use of data and the need to safeguard against the many nefarious and abusive practices it can enable. Nor is it to underestimate the power of lobbyists to exert market will on government, which has been the rule rather than the exception, both historically and today.
Individual ownership and control of personal data is a space to watch. The mechanisms this model can use to assert power are currently too underdeveloped to make individuals the lead actors in this policy work, particularly given the urgency of the situation. The model is also limited in that it speaks primarily to personal data; it falls short of managing the much larger sets of data that are not personal, such as aggregate data, data about government assets, environmental data and more. Regardless, there is a growing movement to enable individuals’ control over their data, and its influence in the policy space can be expected to grow.
For now, so long as robust mechanisms exist for public input on policy and politics, government ownership of digital infrastructure and data, as well as strong guidance on related policy, is the most democratically informed approach possible. Now we must come together as a nation to discuss what we want to protect in our democracy given these new technological forces at play, how to best do so and how to enable our society and economy to thrive using technology and data, not despite them.