Episode 1

CTRL+ALT+Revolt: The Tech Coup, with Marietje Schaake

Uncovering Schaake's strategies to push back against Silicon Valley's unrelenting expansion.


Episode Description

Marietje Schaake joins the hosts to discuss her book The Tech Coup: How to Save Democracy from Silicon Valley (Princeton University Press, 2024). Informed by Marietje’s experience working at the forefront of tech governance, the conversation explores strategies for effective government regulation and ways citizens can counterbalance the immense power wielded by today’s tech giants, to promote a more democratic digital landscape.


59 Minutes
Published September 16, 2024

Chapters

1 0:00:00

Welcome to CIGI’s Policy Prompt

2 0:01:20

Introduction to The Tech Coup and guest Marietje Schaake

3 0:02:49

Why is the show Suits in a book about the astounding power of technology firms?

4 0:05:07

Exploring public-private sector dynamics

5 0:07:29

What made Schaake decide to write the book?

6 0:08:40

What Schaake hopes to accomplish with The Tech Coup

7 0:09:49

Book description

8 0:10:34

Is being pro-tech a generational thing?

9 0:13:23

How do we address the “benign neglect” of regulators?

10 0:16:35

Evolving views of politicians on tech regulation over time — was there a crossroads in the past?

11 0:21:47

Where did big tech’s superiority complex come from?

12 0:24:42

How do you decide who to engage with in this space, and when?

13 0:27:55

What is “The Real Facebook Oversight Board”? What does it aim to accomplish?

14 0:31:35

Next steps following recommendations outlined in The Tech Coup?

15 0:36:17

How Schaake determined which policy recommendations make the cut

16 0:41:54

US court decision relating to TikTok

17 0:45:28

The landscape of public-private sector roles in tech governance abroad

18 0:49:56

State-level versus global-level approaches

19 0:55:12

Debrief with Paul and Vass


Vass Bednar (host)

Welcome to Policy Prompt. I'm Vass Bednar and this is CIGI's new podcast for long-form conversations with leading thinkers on transformative technologies.

Paul Samson (host)

And I'm Paul Samson. Vass and I will be speaking with writers, policymakers, business leaders and technologists working across technology, society and public policy. We'll be prospecting for good ideas and for action. What are you really excited about with this new podcast, Vass?

Vass Bednar (host)

Good ideas only. I'm excited about this gap that you and I have both observed out there, where we think we can do a little more to create really nice opportunities for awesome authors and thinkers to dig into their work and bring it to life a little bit before us and with us. Now, you and I are both a little bit, just a little bit, obsessed with policy action and improving governance and how the right rules just make everything better and those are some of the dots we're going to link up together as we're speaking to people.

Paul Samson (host)

Yeah, definitely. So CIGI is looking to fill a gap here, and so we got the two of us together here, two of the busiest people that I can tell, as hosts and we're working at the coalface on a lot of different aspects of this issue and it's going to be fun.

Vass Bednar (host)

Absolutely.

To that end, Paul, I wanted to read you a little something from a new book called The Tech Coup. Do you mind?

Paul Samson (host)

No, go for it.

Vass Bednar (host)

Okay. "The digital technologies that once promised to liberate people worldwide have instead contributed to declining standards and freedoms as well as weakened institutions. And unless democracies begin to claw back their power from such companies, they'll continue to experience the erosion of their sovereign power."

Our guest today, that's the end of the quote actually, our guest today was first elected to the European Parliament at age 30 and has watched tech seep into every corner of our lives. She was crowned Europe's most wired politician and called a liberal stalwart. Marietje Schaake is the international policy director at Stanford University's Cyber Policy Center and an international policy fellow at Stanford's Institute for Human-Centered Artificial Intelligence, and she's the author of The Tech Coup: How to Save Democracy from Silicon Valley.

Paul Samson (host)

Thanks Vass. Marietje also writes a monthly column for the Financial Times and serves on the UN AI Advisory Body. She's worked with CIGI before on the Global Commission on Internet Governance a few years back-

Vass Bednar (host)

That's right.

Paul Samson (host)

... but it's great to have her back. As always, there's a ton to talk about, so let's dive right into it.

Vass Bednar (host)

Marietje, welcome to Policy Prompt.

I was wondering, could you tell us: why is the show Suits in a book about the astounding power of technology firms?

Marietje Schaake (guest)

Oh, that's a great question. The anecdote about the choices that a Netflix or another big platform makes, and how they ripple through cultures and societies, is just one illustration. And the example of Suits, which had been languishing in reruns until Netflix made it cool again, is a tongue-in-cheek example of how this power plays out. Because in countries like India, or other parts of the world where the government doesn't want to see certain content, these platforms also have to make decisions about whether they want to abide by those orders, and if they do, they actually contribute to censorship, ironically, because that's often what they push back against.

So it's just to illustrate the power of algorithmic settings. There's also a story about Instagram in there, which can influence purchasing behavior very easily by promoting influencers' products on the home pages of users when they start looking in the app. And I asked, "Do your engineers and designers also think about not only moving markets but moving masses?" What if an influencer says, "Ugh, don't go voting, it's so uncool"? Or what if an influencer says, "Go vote on Monday," but the vote is actually on Saturday, and if you come on Monday you will have missed your opportunity? The civic impact, not just the commercial impact, was something that I was interested in, and those were questions that were just not discussed by the people designing those products. So I just used the examples. The Tech Coup is really written with real-life examples as much as possible, to illustrate the deeper challenges of power that has really been grabbed from civic, democratic and societal leaders by tech companies, large and small.

Vass Bednar (host)

I loved reading all those stories. It really brings all the issues, thorny issues and discomfort associated with that to life in a super tangible way.

Paul Samson (host)

So the book, The Tech Coup, has lots of stories and examples in there, and it's great. One that struck me early on was when you talked about the Global Commission on Internet Governance, where you were working with CIGI. You described a daytime set of meetings with academics and researchers who were saying, "Yeah, we can do something here. We've got some ideas, let's get a handle on these issues as they're emerging," and you seemed to be fairly optimistic at that point. Then in the evening you had a dinner conversation with the private sector, and it seemed to be a much different tone, a completely different planet, and a light bulb went off for you. Is that a good description of a light bulb moment for you?

Marietje Schaake (guest)

It was just so striking to me. Whereas I was known as being tech-savvy as a politician, Vass just mentioned being named by The Wall Street Journal as Europe's Most Wired Politician, here I was sitting among really the hotshots of Silicon Valley, I mean, people that still have really powerful jobs at tech companies, and I was the punching bag of the evening; I can't say it otherwise. They were just like, "Oh my gosh, here's this crazy European," and all the cliches were coming my way. And what made it interesting in retrospect, I mean, it was eye-opening to me, was just the arrogance, for lack of a better term. I was just like, "Okay," because I'm comfortable with disagreement. When you're in a parliament, you disagree with people all the time; it's totally fine. But these were just fairly superficial arguments, fairly cliched approaches to why Europeans might be interested in regulating tech.

And it's more interesting in retrospect because actually that dinner was probably at one of the peak moments of when Silicon Valley was still seen in a positive light, when a lot of the problems had not yet surfaced, certainly not to a broader audience. So I feel like it was almost like their Titanic moment, that the orchestra was still playing and everything was still lavish and wonderful, and that the downturn would be coming quickly. So it was interesting at the time, it was maybe even more interesting in retrospect, which is why I shared that story in the book as well.

Paul Samson (host)

Right. Convenient guest at the right time for sure, it sounds like.

Vass Bednar (host)

It's fascinating that you'd be both invited and then addressed with some hostility, right? And observing that dynamic, is that around the time, that peak, when you knew you had to write this book?

Marietje Schaake (guest)

Well, the book had probably been percolating for a long time, especially after I stepped down from politics in 2019. Since then, and that's been really one of the wonderful things I've been able to do since stepping out of active political office, I've been approached by local governments, multilateral organizations, presidents, ministers and regulatory bodies to think with them: What does AI mean? What does technology mean? How can we regulate it? And as much as I love contributing to those discussions, I also often found myself starting over again with analyzing what I felt the problem was, in order to come up with the best recommendations I could make to these people in terms of what the solutions are.

And I'm not necessarily interested in hearing myself over and over again. So I thought, why don't I just write it down and try to shift the starting point of conversations, also reaching a larger audience, because not everybody's in a government. So I hope that by taking a sharp lens, the lens of power, and by looking at this dynamic between the private sector, the tech companies, and the public interest and democracy, it will become clear that this is a systemic problem. It's not an incident, or another example of one CEO or the other; there's a lot of talk about Elon Musk these days, but the way in which The Tech Coup is written really goes beyond these personalities, beyond these incidents, and tries to identify this as a systemic problem of power seeping away from democratically legitimate and democratically accountable leaders and institutions into private hands. And that's a problem, but it doesn't stop there. I also offer solutions, and so hopefully I can contribute to a shift in the conversation, with a sharper focus on the fact that this is a power and a democracy problem.

Vass Bednar (host)

We are sitting down with Marietje Schaake to talk about her gripping new book, The Tech Coup: How to Save Democracy from Silicon Valley. Marietje takes us behind the scenes with tech moguls, politicians, human rights defenders and more, to show how big technology firms in the US have gone from innovative startups to all-pervasive power brokers with very little, if any, accountability. Marietje exposes the very real danger of the wild-west environment that tech companies have grown into over the last few decades, how that has eroded our democracy and, most importantly, what we can do about it. You can find The Tech Coup: How to Save Democracy from Silicon Valley at your local bookstore.

Paul Samson (host)

So you were elected as a parliamentarian in the EU Parliament at 30, and at that point you were perceived by many, you've already mentioned it, as pro-tech in a way. Is there a generational thing here that always pits the new generation against the older generation? Is that still playing out? Is there something about that that we're always stuck with?

Marietje Schaake (guest)

If you mean in terms of being used to engaging with technologies, I think maybe there's some truth to that, but I believe that at Facebook, for example, the majority of users are now on the older side, and when I was just elected...

Paul Samson (host)

Definitely. Grandparents, my kids say. Only grandparents are on Facebook according to my kids.

Marietje Schaake (guest)

Well, I'm off so that says a lot. No, just kidding. But I also think on the regulatory side, you can't really say that. I mean, Neelie Kroes was then Commissioner for Digital Affairs on the EU level, and she's now well in her eighties, so she would've been well in her seventies at the time, and she had key responsibility for the digital agenda for Europe at the time. I also don't think it's necessarily helpful to make this a generational thing. I also don't like the argument that I often hear, which is politicians don't understand technology.

Everybody has a stake in how tech impacts our society. You don't have to be an expert to voice your opinion. And the argument we don't hear so much about is whether a lot of engineers and tech executives actually are versed in the rule of law and democracy. So if we really want to go down that hole of you need to be an expert in order to have agency over things, well then lots of tech leaders are completely disqualified. So I would prefer to have a much more broad tent approach where everybody has a voice, everybody has a stake, and we're not going to dismiss people's concerns just because they're not experts.

Vass Bednar (host)

Paul, do you see that divide as generational or does it persist? Where do you see a more aggressive techno optimism?

Paul Samson (host)

Yeah, I think there is a bit of a thing there with younger kids knowing how to use more of the tools and applications and things like that; that's inevitable. But the generational divide, I would agree, is not really there. There potentially is a challenge between countries, though, where the demographics are very different, and there's a lot of data showing that techno-optimism runs pretty strong in many emerging economies. What the reason for that is, whether it's generational or something else, I think people are still unpacking.

Vass Bednar (host)

Something foundational that you emphasize in The Tech Coup is how technology companies, big and small, but mostly the largest, the giants that we're maybe most familiar with, have successfully resisted capital-R Regulation for decades. And through that, they've begun to seize power from governments after a period of what you called "benign neglect," which I thought captured it perfectly. Here's a quick quote:

"The impact of regulatory abstention puts important norm-setting powers in the hands of engineers and corporate strategists." Totally freaky. Were the companies just lucky, or did regulators really prefer a free market at that time? And I ask you because I think it's such a fascinating circle to square because we're partially pointing to the failure of regulators while we call for smart regulation, right? The state failed us, but now we also need a better role for the state.

Marietje Schaake (guest)

Yes, I fully agree. So I think tech companies took the space that was given to them essentially.

(sound bite - Yahoo Finance: "CEO talks Big Tech and rights of users")

And what we've done so far in the US has been less than fair and basically means self-regulation to date. You're right, it's changing now, but I would characterize it to date as being tech companies saying, "We got it, don't worry. It's okay."

Marietje Schaake (guest)

And I think sometimes they may even struggle with the enormous responsibility that they have, being involved in geopolitical conflicts, being threatened by dictators. It's not necessarily the dream that people with laptops in their garages may have thought about. So it's a two-way problem, and I really do not let politicians off the hook in The Tech Coup. They are really scrutinized for not adopting laws, for not being more clear-eyed on the fact that democracy is not a self-fulfilling prophecy, that it needs work, and that with growing power should come growing countervailing powers. That whole mechanism hasn't worked for the tech sector to a large extent. The main responsible parties here are US politicians, just because on the one hand Silicon Valley is in the US and is so powerful, and on the other hand it is Democratic and Republican leaders that have abdicated their responsibilities and have actually chosen to give so much power to the market.

They really thought that a hands-off approach by government would lead to the best results, not only economically but also politically and geopolitically, and it has simply taken too long for the realization to hit home that that was a misguided approach. I think Americans have paid a high price. A lot of the problems and harms that we see coming from unaccountable tech are also hitting home in the United States.

Paul Samson (host)

I liked how the book got into different presidents in the US. You don't let the politicians off the hook by any means, and you are very systematic in talking about how they weighed in and sometimes how their views evolved. Was there a specific moment in the past when we should have regulated? Was there a crossroads moment that we somehow missed, or is this really the frog in slowly boiling water? Everyone was optimistic at the beginning, because these were new toys, but it's just slowly crept in, and now everyone's realizing. There wasn't a crossroads at some point that we missed, was there?

Marietje Schaake (guest)

Well, one thing I would say, and I mention it briefly in the book, although I could have written a lot more about it, is that there has been a lack of willingness to take seriously the human rights defenders, journalists, opposition figures and civil society leaders who were faced with the harms of tech, whether it was social media platforms or spyware or other kinds of problems, like Cambridge Analytica manipulating elections in other parts of the world. A lot of the lessons were there. Look at Myanmar, where genocidal language really was stirred up on Facebook and led to violence.

(sound bite - Al Jazeera English: “Social media blamed for Myanmar’s tribal dispute”)

According to the Shan Youth Group, Kordai Foundation, social media has been used to stoke hatred between the ethnic Shan and Palong Hill tribe.

Marietje Schaake (guest)

Manipulation of elections, and Cambridge Analytica types of services being used in other parts of the world, Kenya for example: the people who were raising red flags about this for a long time were certainly not heard in the United States, not getting the attention that they deserved. And I think there was some perceived immunity from these problems in the United States, which was naive. So, on the one hand, not listening to examples from the rest of the world; on the other hand, having economic interests trump everything else. And lastly, what I think is also a mismatch that has caused us to be at the point where we are: a lot of the currently big tech companies started as relatively small disruptors of other established giants.

Paul Samson (host)

Yeah, that's right.

Marietje Schaake (guest)

And that identity and that narrative have been sustained while these companies grew to be incredibly large and incredibly powerful, and their behavior as the incumbents actually became as bullyish to newcomers, and as anti-innovation, as the old powers that be they used to criticize, the publishers, the taxi industry and what have you. So the growth has not really been appreciated for what it would lead to. And I think now, with AI, a new wave of technologies, or applications of AI, being rolled out, it's the latest tip of the pyramid: a bunch of under-regulated companies have been able to amass incredible amounts of data, capital, compute, talent, resources and power, and on top of that very powerful, under-regulated position, they could build the next power position. So it's like an engine that keeps turning faster and faster. It accelerates to the point where those that were already very powerful are becoming more powerful, and therefore the urgency that I feel to break that excessive power on the part of tech companies, some big, some smaller, is really growing too.

Paul Samson (host)

A digital engine that actually grows exponentially, so you don't see it. We're not used to exponential change and growth, so in this context something that's small one year is big pretty quickly.

Marietje Schaake (guest)

I also think that the companies and incidents may have been seen individually, but the notion that this is an ecosystem, in which they really influence and support each other, has not been appreciated either.

Vass Bednar (host)

Policy Prompt is produced by the Centre for International Governance Innovation. CIGI is a nonpartisan think tank based in Waterloo, Canada, with an international network of fellows, experts and contributors. CIGI tackles the governance challenges and opportunities of data and digital technologies, including AI, and their impact on the economy, security, democracy and, ultimately, our societies. Learn more at CIGIonline.org.

Picking up on that engine and that flywheel, you're very careful in the book, and very direct, to frame corporate leaders as, and this is another direct quote, "Believing deeply that they can serve their users even better than governments can serve their citizens." Why do you think big tech became anti-democratic? Where does this undemocratic audacity or superiority come from, from people who fundamentally are making and coding software?

Marietje Schaake (guest)

It seems to me that a number of pioneers, but also entrepreneurs, in Silicon Valley actually believed that what they were doing was democratic from a principles point of view. If you look at the whole encryption debate, obviously the idea is, we are here to protect your privacy, the cypherpunks and all those, because the government's not going to protect you. They are going to use technology for surveillance. And unfortunately, that's true.

(sound bite - CNBC: "Why The U.S. Government And Big Tech Disagree On Encryption")

We did not expect to be in this position, at odds with our own government. But we believe strongly that we have a responsibility to help you protect your data and protect your privacy.

Marietje Schaake (guest)

But the disrespect and the disdain have really been directed towards democratic governance and the idea that governments had any say at all. It's a hot topic again now with the arrest of Telegram's founder and CEO. I think we often hear the best examples used as a justification for this anti-government behaviour, but we don't zoom out sufficiently to see what the consequences are, ultimately. Look at the steady decline of democracy in the world over the past 20 years; democracy is very fragile. And yes, it is great that we have phenomenal end-to-end encryption in apps like Signal, but that doesn't justify a hands-off approach by governments in everything and anything related to tech. So it's also an opportunistic use of arguments on the part of some of these tech leaders, where they're happy to use the best of examples, discard the status quo of abuse of power by tech companies, and point to the flaws of government.

So it's also easier to hold those in government to account, those who are actually accountable, and to point to their flaws, because yes, we can do something about it. Whereas when it comes to the tech leaders, a lot of their decisions are not to be scrutinized; there are no proceedings to hold them to account. So I think it's a growing mismatch in that sense.

Paul Samson (host)

In several parts of the book, you talk about that challenge that many of us face about when to engage and when not to engage, whether it's an elite dinner of exclusive people or something like Facebook's oversight board. How do you decide when to engage and when not to engage on those kinds of things? It's always a judgment call.

Marietje Schaake (guest)

It is a judgment call, but it has really helped me to step back and really try to look at all these challenges from the level of principles and questions of legitimacy, of agency, and of accountability. So let me try to explain. Some people have said about the Facebook oversight board, "Well, it's a step in the right direction. There was nothing and at least now there's something." I'm simplifying, but that's really an argument that I've heard quite a bit.

(sound bite - CNBC Television: “Facebook lays out details for content oversight board”)

Kayla, Facebook unveiling details of its independent oversight board, a supreme court of sorts for Facebook content decisions. The board, in a 46-page document, sharing its bylaws, including that anyone who disagrees with Facebook's decision to take down their content will have 15 days to submit an appeal.

Marietje Schaake (guest)

Or there are those who say, "Well, I would rather have someone like Mark Zuckerberg decide rather than the government." But I think it's important to step back and ask yourself, what is a rule-of-law-based system based on? What do checks and balances look like? And not just to look at, "Do I like an Elon Musk or not? Do I like a Mark Zuckerberg or not? Does he serve, or do they serve, my club this time? Are they fighting back against the people I dislike?" I think it's really clear with Donald Trump how quickly the tides can turn. He used to go off about social media allegedly being against him, and now that it's actually working out quite well for him-

Paul Samson (host)

TikTok.

Marietje Schaake (guest)

Yeah, TikTok, but also X. He changed his mind. This is a good example of a non-principled view, whereas I've always said we have to worry about the outsized power of social media companies and their leaders. What if someone comes along with a very strong political agenda? It doesn't matter whether they go to the far left or the far right or anything in between; we just shouldn't want to give them this power at all.

I try to approach a number of questions around tech governance not so much from, is the outcome of this incident in my favor or not, or something I agree with or not. I'd rather look at: Is this legitimate? Is this appropriate in a democratic system? Are there independent oversight, transparency and accountability mechanisms? And if not, what should they look like? So looking at it from a different level, I guess, is what has guided me in approaching those questions of whether or not to engage, or whether or not to support an idea.

Vass Bednar (host)

I mean, I'm curious what it even feels like to receive that invitation. I'm assuming it's an email that you have to click twice on. But soon after you declined that invitation, you joyfully trolled that whole configuration, with significant effort, through what's known as The Real Facebook Oversight Board. Could you tell us a little bit more about that work and what you and others hoped it could accomplish?

Marietje Schaake (guest)

Yes. So we need to step back a little bit. When Facebook announced that it would set up an oversight board, one of the key moments coming up was the 2020 election, right? People were worried about how that would go, how disinformation would play out this time, with all the lessons learned from 2016, but this oversight board was certainly not going to be able to hear or deliberate cases on content moderation by the time the elections happened.

So a number of people came together, and I was actually asked to join later, when they had already decided on this initiative. They said, what we need is an independent oversight body, something that's not at all linked to Facebook the way its own oversight board is, and we need to scrutinize what's going to happen before the elections, not start months after, because the elections were such an important flashpoint. So that's when The Real Facebook Oversight Board came about. It's actually a group of well-respected journalists, a lot of civil society leaders in the civil rights and civil liberties space, and some academics, like Shoshana Zuboff, for example. And what we try to do is point to these accountability gaps, and also to where accountability is missing in a structural sense.

The Facebook oversight board that the company runs itself has a very limited mandate. It can look, in the second instance, at decisions about content moderation that Facebook has made, but Facebook makes many more decisions than those about content moderation: it has groups, it decides about its algorithmic settings, it decides about data use, it decides about advertising. So ultimately, its oversight board has a very limited scope. The Real Facebook Oversight Board may have been launched as a bit of a humorous antidote, but it's actually dead serious in pointing out where money goes from a company like Facebook during campaigns, and also how disinformation continues to spread, not just about elections, and it really looks at mental health issues of young people and things like that. So it tries to point to the areas where there is a lack of oversight. We don't pretend that we can solve everything, but we shine a light where there's not enough of that light being shone.

Paul Samson (host)

So one critical thing in your book is that as you transition from the events, the way things unfolded and your observations, you start to get into, "Okay, so where do we go from here? What are the policy recommendations?" You're very clear in not discounting things due to political feasibility, which I think is totally bang on. You don't want to leave anything off the table or dilute advice before you even give it, right? So that makes sense, but it does favor the researcher over the policy maker, right? So, in reading the book, you spend a lot of time on the recommendations, but do you think there's a next step, as we unpack a few of them in a second here? Is it your next book? Should somebody else do this and weave it together? How does that play out? How do people follow up on your book, I guess?

Marietje Schaake (guest)

So I have sought to strike a balance, and this was not easy, between having been a policymaker, so really being familiar with the nitty-gritty, which a lot of people either don't care about or just won't follow; they always check out at some point. So I tried to stay somewhat high-level and also to reduce the number of solutions, I could have offered a hundred more, but to highlight a couple of approaches that could work to break the dependencies on tech, to increase transparency, oversight and accountability, and to do so directly from the perspective of: okay, this dependence, this power grab, hurts democracy, so how can we strengthen democracy? And this is really a different approach from what I also see happening, which is that people hope a side effect of other actions, like economic policies or antitrust policies, might be that democracy improves, or they hope that data protection will have a broader impact on protecting democracy.

And don't get me wrong, I think antitrust is incredibly important. I think data protection is incredibly important. But I simply think we must address the threats to democracy head-on. We must identify that as the key problem and solve it as the key problem, and not see it as a ripple effect of other challenges or solutions, and so that's what I try to do. Now, what needs to happen next is that I would love to spend more time, with others or myself, deepening and unpacking some of these solutions, because there's a wide variety. Some of them are very applied and practical. I recommend that parliaments have an independent tech service where parliamentarians and their staff can ask for advice that is not lobbied but is well informed. Now, that's something that doesn't take a whole lot of extra work. A parliament could do this tomorrow. If the budget is available, they can say, "Yep, great idea. We're going to bring in independent technology expertise to improve our information position and also to improve the legislation that comes out of here." It doesn't require a lot of research.

I have another solution which is much more philosophical, I guess, and would require more developing, which is what I call the public accountability extension. That is to say, if a government or public entity uses technology in its name, so for example a police service uses technology, a tax authority uses technology, et cetera, et cetera, then the accountability and the transparency that apply to that government agency should apply equally to the part that's tech-related.

So it should no longer be possible, and this happens on a daily basis, for a police service to say, "Oh, no, we did not hack the phone of this criminal. It was the tech company that did it." Or, "Oh, we had no idea that the tax authority discriminated, because it was the algorithm." That deflection, that divorce between the analog and the digital, should simply not fly, whether for freedom-of-information requests from journalists or for accountability to parliaments.

If a government wants to go to war, it usually needs a mandate from parliament. When there's a cyber attack or cyber operations happening, no such mandate exists. So it's about closing that gap between the analog and the digital in terms of accountability and transparency. That whole idea would have so many reverberations; that's why I think it's very powerful. But it would also benefit from a lot of unpacking, making case studies of how it would apply here and here and here, and bringing it to life in that way.

Paul Samson (host)

Yeah, I have to say the risks seem to be increasing as well. As you note, the power of AI is increasing the potential inability to understand the algorithm, and that puts a premium on this. Like "the dog ate my homework," it becomes "the algorithm did it, and we don't really know how or why," and sorry, that's not going to fly.

Vass Bednar (host)

I wondered if there was one policy prescription or idea in particular that you think is the most radical or feels like your wild card, maybe you considered not sneaking it into the book or you're most curious how people will receive it?

Marietje Schaake (guest)

Well, what I just mentioned, the public accountability extension, is, I think, the most far-reaching, but I was also wondering whether I should include bans. It's radical to say: ban a certain technology. People would probably say, "It's impossible," or, "She's crazy," but I actually think the case can be made to ban spyware. Spyware is designed, and actually also marketed and sold, to violate people's human rights, like the right to privacy, and it has big implications for journalists and their sources, opposition figures and their networks, and the safety of people. And yet the steps that have been taken to curb spyware have been piecemeal. They've been modest. It's a growing industry. We just saw in recent days reports that spyware of the kind that NSO Group offers with its Pegasus program has been used by Russia. No surprises there.

This technology does not stay in a box. It is not used only to find terror suspects or criminals; it proliferates, and it gets abused. In Europe, we've seen spyware used in Poland against judges, against opposition figures, against journalists. This is a real, sophisticated tool for intimidation and for silencing and censoring people.

(sound bite - Euronews: “Poland to investigate alleged use of Pegasus spyware by last government”)

On Monday, the Pegasus and illegal surveillance committee will convene in Polish parliament. Opponents of the previous government claim they were spied on. One is Bartosz Kramek, who claims he was illegally wiretapped due to his opposition activities.

Marietje Schaake (guest)

I don't think any legitimate uses, like crime fighting, weigh up against the way it's abused and the way it proliferates. It puts us all in danger. And it's great to see steps being taken by the US government, which now bans the use of commercial spyware by government. If you are a business tycoon, you can probably still buy it legally. The EU has export controls, which I helped work on, so that there's an assessment of human rights before a license is given. But there are still many, many ways in which spyware is used legally and is not bound by any guardrails. Now, I've come to the point where I can say, "Yes, it is legitimate to ban it," but I'm sure people will be like, "Oh gosh, you can't ban technology." So those are things that I was deliberating when I was writing the book and doing the research.

Vass Bednar (host)

That's fascinating, thank you. Paul, I wondered if there was a policy proposal that similarly stood out for you.

Paul Samson (host)

Yeah, there was. I think I had two takeaways from those sections that I really liked. One was that you talked about the geopolitics of it in a way that I thought was very important and interesting. This is my interpretation, but you said that the EU on its own is not big enough, ultimately, to do this; it's going to have to be transatlantic, plus probably more, to have enough geopolitical oomph to do something that would have a global impact, right? China will probably do its own thing. India may do its own thing. But you need that transatlantic alliance. That was one takeaway.

The other one was that, as you just said, you take a little bit of that framework of high risk to low risk, a little bit like the EU AI Act, and say there are some things that you actually might want to ban. I read it as: spyware is a no-brainer, even if you didn't use the word ban. And then there are some things that are so minimal-risk that they're just not an issue. Now, they can change too; things can evolve. But you've got that middle space where you apply the precautionary principle. So I thought when you put your pieces together, there was a framework there that was interesting, and that was, I thought, a very useful contribution.

Marietje Schaake (guest)

Well, that's great. I hope there are tools in there for many people to pick up and do something with. I don't want the reader to feel disempowered. There's so much that can be done, and if people have more ideas, or if they say, "Well, this idea is mediocre, but I have a better idea," that's great. That's exactly the response that I hope The Tech Coup will get.

Vass Bednar (host)

I don't think anyone's going to read that chapter and think that anything there is mediocre. For me, the element of building a public stack really resonates. It strikes me that here in Canada, with some of the public elements, like the public markets that we create, whether it's in health or our expanding child-care system, we don't have public digital architecture underpinning these systems. So electronic health, electronic medical records: it's a private duopoly. And then who pays for that? Well, big surprise, it's us. So having that as less of an afterthought and more at the forefront is definitely something I hope a lot of people and practitioners grab onto.

Paul Samson (host)

As always in the tech space, you have to check the news before you start a podcast to see what happened overnight or in the last couple of hours. And there have been some developments recently, and you're probably still absorbing them, but one of the big ones was the US court decision relating to TikTok and cases of children being manipulated into harming themselves, finding that the company couldn't claim immunity from liability under Section 230 of the US Communications Decency Act when content harms users.

(sound bite - CBS News: “Appeals court allows TikTok lawsuit over girl’s death in viral challenge”)

Morning, this could be groundbreaking. For years, laws have protected social media companies from liability for user-generated content, the stuff that other people put on the platform. This lawsuit centres around TikTok's algorithm and holding the company accountable for what it promotes to users. The outcome of this case could expose TikTok and other social media companies to a wave of new legal challenges.

Paul Samson (host)

And you talk about Section 230 a lot in your book. Where do you think this impact might be going? Is this a big deal? What just happened? Or is it too early to tell? What's your reaction to that latest news, if you've thought about it?

Marietje Schaake (guest)

Well, I haven't seen the details, but my general response would be that it would be a big deal if this immunity did not apply, from a legal perspective. And it also points to something bigger, which is, I think, that the sense of urgency to stop this sort of unaccountable power grab is really beginning to gain in popularity. Now, I'm not sure that in this case the perspective was that these very powerful decisions shouldn't be made by TikTok; this is a very narrow interpretation, as legal cases often are, of whether in this case the immunity applied or not. But I think it points to a broader trend: people are sick of the tech coup, and if they aren't already, I hope they will be. And also that something can be done; it's not like we have to undergo this erosion of democracy at the hands of tech companies. And of course, the well-being of children online is not a democracy issue, but it's certainly a societal issue that should really matter to a lot of people.

So I just hope that this is part of the signaling that people are finding ways to push back against this unaccountable and outsized power of tech companies.

Paul Samson (host)

And the courts are often at the forefront in doing so, right? Or at least starting that motion, which sometimes becomes policy-driven afterwards, once people understand what the court was doing.

Marietje Schaake (guest)

In the US, certainly, and I think it's something that people in Europe often miss, because the courts play a very different role in the United States. They are also a space to watch, particularly as the politics are so frozen and so polarized. So while Congress is not acting the way that I think it should, that doesn't mean nothing is happening. And similarly at the state level, individual states are adopting really interesting laws, like opt-out options in California for having your data used to train AI. There are really interesting spaces to watch, like the big antitrust cases against Google. So it's a very dynamic field. I just hope the different parts will also add up to a full picture of what is at stake here.

Vass Bednar (host)

Part of that full picture has been your work looking at other jurisdictions and reminding us of what's happening there, as you said, not only the best ideas but also the challenges. You do spend a fair amount of time in the book looking at the Chinese model for internet and data governance. I wondered what you may want to convey to listeners about that, and also about the Indian model for digital public infrastructure. Is that where we're seeing a model, or a sweet spot, for public-private roles? Or is this approach maybe creating too much new risk?

Marietje Schaake (guest)

To start with China, what is important to see is, on the one hand, of course, how unaccountable power on the part of governments using tech can put these harms on steroids, and that's what the book touches on. But I also think it's important to see that states, like China but also all of our states, can still be very powerful if they want to be. And what I worry about deeply is that authoritarian states have very much claimed this role in governing tech, but democratic states have not. That has an effect on the harms that we face in our own societies, but it also has an effect on the ability of, let's say, a United States to negotiate with others in the world, because they're not even putting a model on the table. They're not saying, "Hey, we propose to hold tech companies accountable this way, or to protect speech online that way, or to have oversight over offensive operations by companies this way," because the United States currently basically has no model, and it's really hard to say, "Hey, let's build a coalition around our non-model."

So those are the two lenses through which I look at this phenomenon in the book.

India is unique in many ways because, one, it has its own approach to governing technology with India Stack, where there's a lot of concern about how that could be abused and how it's vulnerable as a system. But the thinking about what digital public infrastructure is certainly originated there to a large extent. India as a country, I think, is really the space to watch because, clearly, western democracies would love to count India as an ally in the big club of democracies. India is not playing ball, and it's very unfortunate, but that's the reality. And so it's a big question mark where India will position itself. Will it become a bigger power broker when it comes to tech governance? Will it stake out a position for itself in the world, or will its more state-heavy, authoritarian behaviour make it drift away from a lot of global efforts? Certainly because of its size, it's very interesting for the global democratic balance to see where India falls, and people in India are very concerned about the rise of China. So will it become a bipolar or a tripolar tech governance dynamic there? We'll have to see. But when we talk about tech developments and tech policy at all, there is, broadly speaking, really too little focus on India.

Paul Samson (host)

Especially given the demographics and the growing share of the global economy that's generated there right now, so absolutely.

You are listening to Policy Prompt, a podcast from the Centre for International Governance Innovation. Policy Prompt goes deep, with extensive interviews with prominent international scholars, writers, policymakers, business leaders and technologists, as we examine what transformative technologies mean for our public policies and society as a whole. Our goal at Policy Prompt is to explore effective policy solutions for pressing global challenges. Tune in to Policy Prompt wherever you listen to podcasts.

So this segues to the big final topic, really, which is the global level. A lot of the pieces can be done at the national level, and need to be, right? Or even at the state level, as you were describing. But there's always that question: is there something needed at the global level? I think our view clearly is that there are certain things, whether it's autonomous weapons or certain risks to society, that would come from somewhere but still be a global risk. You're on the UN Secretary-General's AI Advisory Body, and an interim report came out. I know there's a final report planned. Can you tell us anything about where the framing might be going, or just anything on the global context for AI agreements?

Marietje Schaake (guest)

Well, I believe a lot of people see the need for global agreements. I personally think we would already be helped a great deal if we could see an authoritative interpretation of universal human rights as they apply, or are at stake, in digital contexts. Not just AI, but broadly speaking, because in today's divided world it will be hard to get agreement around new things. It's easier to build on established norms, such as the Universal Declaration of Human Rights, for example. And as basic as that sounds, that effort has not been made. I think it would be hugely helpful if it were.

There's a lot of focus in our advisory body on the Global South, for good reason: the fact that the majority of the world doesn't have supercomputers, and that, for a lot of people, it's very theoretical to talk about AI if you don't have access to electricity or the internet to begin with.

So we must acknowledge that there are real differences in the lived realities that inform how people think about artificial intelligence, and that simply rolling out AI without a framework of rights protections, or without education or equal participation of people, is going to exacerbate a lot of the inequalities that are already there. AI is not going to be some magic wand that makes all those inequalities disappear. So governance matters, and ideally it's global, but with a focus on the unique situation in the Global South, and within the Global South, really with an extra effort to make sure that some of these historic inequalities are corrected and not exacerbated by this latest wave of technology. So there's thinking about that.

There's, of course, thinking about how to create trusted resources when it comes to understanding AI. That's where the idea of some sort of scientific panel, similar to the one in the climate world, comes from, because there's so much disagreement within AI expert communities about which level of risk is most urgent, within which time frames, and what can be done practically to mitigate those risks. And so if there could be a trusted group of people that actually assesses the development of AI and speaks to the consequences, then maybe that could provide a resource that many, many people in the world can benefit from to make their own political decisions. Even with the same research body, people will draw different conclusions about what needs to be done next, but at least there's a starting point of facts, and I think that would be most helpful. So those are some of the directions the body is thinking about.

Paul Samson (host)

Yeah, thank you. And I mean, of course, it doesn't have to be a UN body either, even though it's difficult for the UN to say that. It could be a group of like-minded countries that get things started because the UN is a supertanker and is a tough one to navigate.

Marietje Schaake (guest)

This is true, but the advisory body is independent, so the advisory body can say anything and everything. We don't speak on behalf of the UN; we make recommendations to the UN, and I think that makes it different from the process of the Global Digital Compact that's also going on at the UN level. We are not states; we are members as individuals, and hopefully that will lead to more bold and out-of-the-box thinking that is not constrained by political realities here or there.

Paul Samson (host)

Excellent.

Vass Bednar (host)

We are so excited for people to read The Tech Coup, even people working in and on technology. I think that's also been a big shift over the past 5 or 10 years in terms of how attitudes are changing and the shared vision for a better future with technology. So we're so appreciative of the work, and we're looking forward to continuing to follow it. Thank you so much for joining us.

Paul Samson (host)

Good to see you.

Marietje Schaake (guest)

Thank you too.

Vass Bednar (host)

Something I really appreciated about the framing of The Tech Coup is that the title isn't just about how we've been overthrown, in terms of that shift in democracy and power. Typically, a coup is very sudden, but this coup has been incremental and relentless in our lives. So that's definitely something I'd been thinking about throughout reading the book.

I loved when she said these two words and put them together: "think with." Framing thinking as a collaborative exercise, thinking with parliamentarians, thinking with academics, was important to hear, right? The idea space doesn't have to be a competition where it's highly personalized. Also, my policy brain really appreciated the first-principles approach: What is the core problem we're really trying to solve? To what extent is it related to business models? And to what extent is it related to the practices of these firms, which could be amended or curbed or, I wanted to use the word tamed, but maybe that's too animalistic. What was sticking out for you, Paul?

Paul Samson (host)

Yeah, I think you're right that she really did get her arms around pretty much everything, which is what's nice about the book. It seems to me that she's such a well-informed and credible voice on these issues, having been a parliamentarian. She knows the political side of it, the realities of, "Okay, well, what are you doing on legislation and regulations?" So she understands that side. She's an academic, she's in an academic institution, so she understands the rigour of the research there. And she's a European living in Silicon Valley, which is interesting. So she's got a lot of things going on that make for a very interesting journey, where she's been and where she's come to. And then on the issues, she does look at the big picture. She looks at the origins and then how it all fits together, whether it's the geopolitics or the fairly long list of prescriptive policy directions that she has. It gives a nice menu to unpack, and there is more unpacking needed, as she said. Right now, what would we really drill into? But I really enjoyed the book and the discussion.

Vass Bednar (host)

And hey, maybe a future edition of The Tech Coup will have some of Canada's spicy policy ideas. We're so often, too often, overlooked in those jurisdictional scans of major activities. And maybe our policy proposals don't always completely shake things up, or we're seen as falling in between California and Europe. Not a criticism, just an observation from me.

Paul Samson (host)

Yeah, it's a super interesting one because Canada doesn't come up in this analysis very often, but when I'm out there talking to countries, wherever, they're like, we need Canada in this debate. We need Canada to actually be one of the leaders because people want to listen to you. You've got some credibility. We have to make sure we actually are credible. These can't be just superficial proposals and things, but I think there actually is a strong play for Canada in these spaces. So hopefully we'll be stepping up.

Vass Bednar (host)

Policy Prompt is produced by me, Vass Bednar, and Paul Samson. Tim Lewis and Mel Wiersma are our technical producers. Background research is contributed by Reanne Cayenne, marketing by Kahlan Thompson, brand design by Abhilasha Dewan and creative direction by Som Tsoi. The original theme music is by Josh Snethlage, and sound mixing is by François Goudreault. Special thanks to creative consultant Ken Ogasawara.

Please subscribe and rate Policy Prompt wherever you listen to podcasts and stay tuned for future episodes.

*For any transcript errors, please contact [[email protected]](mailto:[email protected]).