At a May 30 press briefing on the ongoing protests in Minneapolis over the police killing of George Floyd, Minnesota Public Safety Commissioner John Harrington described how authorities would track the protesters they had arrested. “Who are they associated with? What platforms are they advocating for? ... Is this organized crime? ... We are in the process right now of building that information network.” And, just like that, the language of epidemiological contact tracing leapt into overmilitarized US domestic policing.
Months after the outbreak of COVID-19, as the world begins to consider reopening, it has become conventional wisdom that, until a vaccine is developed, the safest way to resume economic and social activity is through a rigorous process of testing the population and then tracing and notifying contacts of those who test positive.
Contact tracing is a standard disease control measure that has been used for decades. What is new is the potential for technology to turbocharge the process. Instead of a public health worker conducting a lengthy interview with someone who has tested positive for COVID-19 to reconstruct where they have been, and with whom, over the past two weeks, either that individual or a public health department could draw on data from a tracing app installed on the person’s mobile device. Using this data, they would know exactly where the device’s user had been and with whom they had been in contact. The same data (stored either only on the individual’s device or centrally) could also be used to notify the user if they had been near someone who has tested positive, an approach called exposure notification. The platform managing the application could also upload the data from many individual users to a central authority, creating a society-wide data set. Such a system could collect not only location data but also other data about users, to build sophisticated epidemiological models. And, the better to understand this vast wealth of public health data, authorities could deploy artificial intelligence (AI) to study patterns of spread and potentially better understand the virus.
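To make the decentralized, on-device variant concrete, here is a minimal sketch in Python of how such a scheme can work. It is loosely modeled on Bluetooth-based exposure notification designs; the class, method names and parameters are illustrative assumptions, not any shipping protocol.

```python
import secrets

class Device:
    """A sketch of one phone in a decentralized exposure notification scheme."""

    def __init__(self):
        self.my_ids = []        # random rolling IDs this device has broadcast
        self.seen_ids = set()   # IDs observed from nearby devices

    def new_rolling_id(self):
        # Rotate to a fresh random identifier (every few minutes in real
        # designs, so a device cannot be tracked over time).
        rid = secrets.token_hex(16)
        self.my_ids.append(rid)
        return rid

    def observe(self, rolling_id):
        # Record an identifier heard over short-range radio from a nearby device.
        self.seen_ids.add(rolling_id)

    def check_exposure(self, published_positive_ids):
        # Match locally stored contacts against IDs voluntarily published by
        # users who tested positive. The match happens on the device itself;
        # no central authority learns who met whom.
        return bool(self.seen_ids & set(published_positive_ids))

# Two devices come within range of each other:
alice, bob = Device(), Device()
bob.observe(alice.new_rolling_id())

# Alice later tests positive and consents to publishing her rolling IDs:
print(bob.check_exposure(alice.my_ids))  # True -> Bob gets a notification
```

The privacy-relevant design choice is visible in `check_exposure`: in the decentralized model the matching happens on each user’s device, whereas in the centralized model described above, the observed IDs (or location trails) would instead be uploaded to an authority that performs the matching and, in the process, assembles a society-wide contact graph.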
If this all sounds familiar, it’s no surprise. “Solutionism” — the idea that every problem has a technological solution — is increasingly prevalent. But it’s not that simple.
The challenge with this solutionism is not that digital technologies are inherently problematic, or that big data and AI aren’t tremendously useful and powerful. Rather, it’s that embedded within these technologies are the biases, subjectivities and politics of those who design and build them. And the flaws in the algorithms very often mirror the flaws that underpin the social, political and institutional problems these technologies are being deployed to solve. They can even cause new problems, as complex technological systems interact with human, social and political realities. What’s more, because of the sheer scale and reach of the companies offering these technological solutions, each new use of a biased program can further embed its flawed assumptions into the fabric of our societies. Solving one problem can lead to others.
But we are in a global pandemic. These are exceptional times, and the potential of digital contact tracing or exposure notification is immensely enticing. There is no doubt that the computer in our pocket, collecting endless data about our movements and social lives, seems purpose-built to supply the very knowledge needed to help us out of this truly wicked public health emergency.
How should governments think about deploying these technologies? Even granting their tremendous potential, I would urge four further cautions.
First, not all the technologies available for fighting COVID-19 are equal or interchangeable. There is a big difference between centralized contact tracing, whereby data about a society is collected en masse and used by public health authorities to control the epidemic, and exposure notification apps, which tell an individual user whether they may have been near someone who has tested positive. Beyond these, another category of surveillance technology is being offered by companies such as Palantir and Clearview AI, which promise to use AI and facial recognition to make sense of these new types of data. As always, the promise is that with more data comes greater insight. We must evaluate each use case and proposed application on its own merits, and be very careful not to enable a period of rapid and broad adoption of surveillance technology.
Second, governments must look closely at the challenges of implementing digital contact tracing. If either the system’s design or its operation fails, any trade-off between civil liberties and public health, or between data privacy and collective good, is moot. In terms of design, contact tracing is not simply an app that can be quickly developed and rolled out (although there are many such products on offer), but rather something more akin to a platform: a system that must reliably collect, store and manage very large data sets and sustain a vast network infrastructure. How this system is designed will have widespread downstream effects on the utility of the whole exercise. As for implementation, governments are proposing national rollouts of a very intrusive and complex technological platform that demands a 60 percent adoption rate among the population to be effective. It is worth noting that no country running a voluntary digital contact-tracing system has yet hit this target. We must also keep in mind the track record of governments implementing complex technology platforms. It doesn’t always end well.
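The adoption threshold matters even more than it might appear, because a contact is only traceable if both people in it are running the app. A back-of-the-envelope illustration (my own simplification, not from any cited model, and ignoring real-world factors such as Bluetooth reliability and whether the app is actually running):

```python
def detectable_contact_share(adoption_rate: float) -> float:
    # Both parties to a contact must have the app installed for that
    # contact to be traceable, so coverage scales with the square of adoption.
    return adoption_rate ** 2

for p in (0.2, 0.4, 0.6, 0.8):
    print(f"{p:.0%} adoption -> {detectable_contact_share(p):.0%} of contacts covered")
```

Even at the 60 percent target, only about 36 percent of contacts between people would be detectable; at the adoption rates voluntary apps have actually achieved so far, the covered share falls into the single digits.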
Third, consider how power will be reflected in the choices made. There are economic and political factors underlying our decisions around these technologies. On the last episode of Big Tech, Joseph Stiglitz drew a parallel between the concentration of economic and political power seen during the Great Depression and the trustbusting that followed. Big tech is getting bigger through the pandemic, and we know that these companies have long lusted for the data troves of the financial sector and the health system. Are we entrenching this trend? All of this is occurring in a moment when the tensions in the geopolitics of technology infrastructure are coming to a head. Fractures between US and Chinese technologies, and their respective countries’ governance systems, are emerging. The choices nation-states make about technology infrastructures and how or whether to govern them are inseparable from this power struggle.
Fourth, we need to ask how the deployment of these technologies could embed new norms in our governance systems and exacerbate existing inequities and abuses. Which brings us back to the Minnesota public safety commissioner. Contact tracing might seem like a good idea to a wealthy computer programmer in Palo Alto, or to a comfortable civil servant in Ottawa. It may look very different to communities that have long experienced the costs of data being weaponized against them. It may look different again to the communities experiencing disproportionate harm from COVID-19. When the language and technologies of surveillance become normalized in a public health emergency, will they also be deployed in other domains, such as policing?
These concerns have led our guest on this week’s Big Tech podcast, Carly Kind, to conclude that there is an absence of evidence to support an immediate national take-up of the technical solutions under consideration. Governments will have to make decisions based on often opaque trade-offs. It may be that the deployment of some technologies being offered is simply not worthwhile.
In the end, the answer may come down to governance. For too long, about too many issues, we have simply left the governance of tech to those who design it. Instead, we need to think critically about the ways in which the deployment of digital technology in our society bumps up against our existing democratic laws, norms and regulations and how it could change them. Until we are willing and able to have that conversation, we should be cautious about the “solutions” we adopt.