Nicole Perlroth: Our elections have been hacked, our power plants, our nuclear plants, our hospitals. And so, where is ransomware going, because we're just digitizing our lives. At what point is it going to be your self-driving car, or your insulin pump, or your pacemaker? And that's a very real possibility. That's not just stoking fear for fear's sake. That's a possibility.
Taylor Owen: Hi, I'm Taylor Owen, and this is Big Tech.
Last week, Google discovered a group of hackers had been breaking into some of the most widely used tech in the world: iPhones, Androids, and computers running Windows. As with many security breaches, Google shut it down, but there was a problem: the hacking was actually part of a counterterrorism mission conducted by Western governments. Now, unless you're intimately familiar with the world of cybersecurity, this might have you scratching your head. Why are Western governments hacking iPhones, and how does Google have the power to shut down counterterrorism operations? But this story actually gets at a core tension in cybersecurity. Government agencies like the NSA will pay millions of dollars to hackers who find bugs in hardware and software. But they don't buy these bugs, which are called zero-days, so that they can fix them. They buy them so that they can spy on terrorists, or drug smugglers, or child pornographers. But here's the catch: almost everyone on the planet uses the same suite of technologies. So, that iPhone bug that allows the NSA to snoop on terrorists can also be used by Saudi Arabia to threaten its dissidents, or by China to spy on the Uighurs. And those zero-days can be used against us, too. If the NSA leaves a bug in Windows, our adversaries could find it and exploit it as well, which could mean access to our banks, our industrial secrets, or even our nuclear power plants. In other words, in the world of cyber warfare, an offensive advantage is also a glaring defensive vulnerability. This is the world that Nicole Perlroth has been immersed in for nearly a decade. Nicole is a cybersecurity reporter for the New York Times, and the author of a fascinating new book called This Is How They Tell Me the World Ends. A lot of the conversations that I have on this podcast are scary in a big-picture, existential way, but this conversation is scary in a far more visceral, immediate way, because as we move closer and closer to a world where everything is online, Nicole makes it pretty clear that we haven't done nearly enough to protect ourselves.
The one thing I kept thinking, reading this book, and being peripherally aware of a lot of the issues and topics you're talking about, was just how crazy it must have been being that close to it all. I mean, I look at it from a fair amount of distance as an academic, and there's that scene at the end where you're talking, I think, to the Moscow Bureau Chief, and he kind of says, "Were you scared?"
Nicole Perlroth: Yeah.
Taylor Owen: And I was kind of thinking that the whole book, and you address it at the end. But I'm just wondering what it was like being in the middle of that for seven or eight years of your life, and still are.
Nicole Perlroth: It's just been ten years of running around with my head cut off, from one attack to the next. Usually they're not that similar. You're sort of just jumping back and forth, and each new attack, you have to learn a whole new field, and a whole new terminology, and then something crazy happens over here, and you have to run back and handle that. And so, I didn't get a lot of time to process what the hell just happened until I sat down to write this book, and what had just happened was just this convergence of all kinds of attacks, by nation-states, by cyber criminals, by some kind of convergence of the two in a lot of cases, on American companies, on our intellectual property, on our government agencies, on our elections, on our psyche. And it's just become such a mess, and I feel so dizzy.
Taylor Owen: I can only imagine.
Nicole Perlroth: Yeah.
Taylor Owen: And I wonder if, do you think you were able to see those connections, and draw those connections, and tell the story as you did because you were kind of an outsider to that world, and didn't come from a cybersecurity background, and aren't a white male in your 20s living in Silicon Valley? Or, I mean, in many ways, you were an outsider to a lot of the communities you were talking to, right?
Nicole Perlroth: Yes, and I ... Yes, I do think it helped to be an outsider. It also is very difficult to not be an outsider, you know? I constantly have this heartburn because I write for a lay audience. The translation is always hard, and I always hear about it from the technical community, that I didn't describe things with utmost technical detail. That said, I got into business journalism in particular because, in journalism school, I actually took a class at Stanford Business School that was about bioethics, and I just remember, this guy came in and he talked about how they were testing a new birth control, and he had to decide what they would do in cases where the birth control didn't work. Would they pay for the abortion? Would they pay for that kid? If the mother wanted to have the kid, would they pay for their schooling, or their college education? There were just these fascinating ethical decisions at play, and I remember thinking, huh, that could be a really cool niche, to go cover business, but really be keeping an eye on sort of the moral hazards, and these ethical decisions that business leaders have to make all the time. So, that's why I got into business journalism in the first place, and what I saw in cybersecurity wasn't just that the problem was getting a lot worse, and the attacks were happening with more frequency, and that businesses were suddenly being expected to defend themselves from nation-states. I saw that here in the United States, all of our incentive structures and models were leading us down this path of further vulnerability. Here in Silicon Valley, we all bought into this promise from these companies of a frictionless society, where we could order up an Uber, order our groceries, get restaurant delivery through our phones. And businesses' incentive was just to get things to market in the most efficient, low-cost way possible, not necessarily to test all that code and lock it up before they rolled it out. And all that code wasn't just getting into our phones and our computers, it was getting into the grid, and our airplanes, and air traffic control. And then, government was supposed to keep us safe, but I was fascinated by these murmurings that I was hearing of this underground market for vulnerabilities, where we would actually pay hackers, with taxpayer dollars, to hand over holes in this software, not so we could fix them, but so that we could exploit them for espionage and for battlefield preparations. So, all of these incentives were leading us down this dangerous path, and the thing I always just found fascinating was this moral hazard, particularly at the government level, of when do we decide to leave Americans more vulnerable to preserve our espionage advantages and our battlefield advantages, and at what point are we going to decide, wait a minute, we are the most targeted nation on Earth. We need to recalibrate and focus on defence. So, a lot of these issues begged for more transparency, and broader discussions, and I don't think that these communities were really solving for them on their own.
Taylor Owen: And certainly weren't speaking to a wider public, right? I mean, it struck me all the way through that, at the same time as you're writing front page stories for the New York Times, I don't feel that this discussion really broke through to the zeitgeist about how we understood tech and the state. And was that part of the motivation for writing this for a broad, public audience?
Nicole Perlroth: Yes. I mean, I can barely keep track, and I was covering these, and I had a front row seat to a lot of these attacks. So, I knew ... Even though the New York Times was putting these on the front page, they were putting them in the Sunday paper, but on their own, they weren't landing with people, and I felt immense frustration with that, because I was the one translating this and putting it in the paper. And so, I knew it was going to have to take a narrative to get this into ... real storytelling, to tie this together and to make it accessible, and so that's why there's a book.
Taylor Owen: Right, so, I have a few broad questions about the implications of all this, but first, I was just hoping we could reset a little bit, and maybe just talk through what the core narrative is here that you're laying out. And so, the whole book revolves around zero-days, or vulnerabilities in our tech infrastructure. Can you explain quickly what those are?
Nicole Perlroth: Yes, and I apologize to listeners. I promise that this is the most technical part of the conversation in the book. But it is important to know what a zero-day is, because so much of the narrative is based on it. So, a zero-day is just a hole in hardware or software that the software manufacturer or hardware manufacturer doesn't know about. So, let's just take the easiest example: I find a flaw in iOS, your iPhone's mobile software. I, hacker, can develop the code to exploit it, and that zero-day exploit allows me to get into your phone remotely. It could let me read your text messages, turn on your microphone or your camera without you knowing about it, or track your location. And so, it has immense value for cyber criminals, but also for nation-states, who want to spy on terrorists, or law enforcement agencies, who want to get into a criminal's, a drug cartel's, or a child predator's phone. And so, there's this market that has popped up around zero-days. Governments buy zero-day exploits from hackers or brokers. The going rate for a remote zero-day exploit that can get me into your iPhone is $2.5 million in the US, and it's even higher if you want to sell that code to a broker in the Middle East. Saudi Arabia or the United Arab Emirates will pay $3 million. That's the going rate there. And these prices just keep going up, up, up.
Taylor Owen: But that market, it emerged over time, and the hackers in this world seem to have a pretty complicated relationship, both with tech companies and with governments. Can you sort of lay out that landscape of how that emerged, and where this hacking community fits within all of those actors who might want to buy these?
Nicole Perlroth: Yes. So, I started off the book talking about a dinner I went to with these two Italians, very colorful guys, who had started this startup called [inaudible 00:12:40], which was actually developing zero-day exploits, not for your iPhone or Android phone, but for critical infrastructure, for industrial systems. They were selling the ability to break into a factory floor, or a power plant, or the grid. And I asked them at that dinner, "Who will you not sell this to? Iran, China, Russia?" And they wouldn't answer my question, and I also knew that they could not answer my question, because so many of these deals around zero-day sales are wrapped in non-disclosure agreements, and in a lot of cases, the clients are governments that roll them into these classified programs. And the reason that everyone is so secretive about this is not just because of how these tools get used, but what the tool is. And the tool is essentially an invisible way to spy on or break into systems. So, the minute that Apple learns about that zero-day in your phone, in their iOS software, they will patch it. They will release a software update. You will get an annoying prompt on your phone, and you'll have to click it, and then you're fixed. So, governments don't want to tell anyone about their zero-day exploits, or how they're using them, because the minute they do, that $3 million investment they just made turns to mud. So, I knew that there was a market for this. I knew that the market was basically trading in our vulnerabilities. I knew that these vulnerabilities were showing up in the attacks I was covering for the New York Times all day. And I knew that no one wanted to talk about the market. And so, I basically went and interviewed anyone who would talk to me about this market, and what I learned was that there is a long, complicated, bitter history here. In the early '90s, hackers would find these holes, and there was no 1-800 number for them to call at Microsoft, or Sun Microsystems, and say, "Hey, I just found this hole in your software, and I think I can use it to break into NASA." That didn't exist, and in fact, when they would reach someone at the company, the typical reaction was, "Don't poke around our systems." Or worse, "We'll sue you if you keep poking around our systems." So, hackers started releasing these tools on hacking forums like Bugtraq, and they did it for street cred, they did it as a hobby. Sometimes they did it to shame vendors like Microsoft into patching these systems. And over that same time period, as these hackers were basically getting sort of penalized or threatened for discovering these flaws, I learned that US government agencies saw immense espionage value in them. If they wanted to break into the computers at the Russian embassy in Kiev, the best, most reliable way to do that was to use a zero-day exploit to get in, and then drop a payload, or plant backdoors to stay a while, and make sure that you maintained that access. And so, they would start going on these hacker forums, and say, "Hey, could you develop something special for me?" So, at the same time these hackers were kind of getting beaten over the head by the tech companies, these defence contractors, on behalf of US government agencies, were offering them something like $150,000 for a reliable way to get into Microsoft Windows software.
And so, the market took off, and US government agencies started paying defence contractors, and brokers, and hackers directly for these zero-days, and we had this unquenchable thirst for as much intelligence as we could possibly get our hands on, and zero-day exploits and digital exploitation programs were turning over some of the best intelligence we could get. But a big game-changer was what we did in Iran.
Taylor Owen: Stuxnet?
Nicole Perlroth: Stuxnet.
[CLIP]
News Anchor: There's a report out about a new computer virus that may be aimed at destroying a bricks and mortar facility. The virus is called Stuxnet, and according to The Financial Times, it may be aimed at Iran's controversial nuclear facility.
SOURCE: Bloomberg Quicktake YouTube Channel https://youtu.be/H6VipR0xBGo
“Falkenrath Says Stuxnet Virus May Have Origin in Israel: Video”
March 23, 2012
Nicole Perlroth: And what we did was, with Israel, we broke into an Iranian nuclear facility, Natanz, sometime around 2007, and we used a series of zero-day exploits to get from their Microsoft Windows systems into the industrial systems that control, or monitor, the speed at which the rotors in their uranium centrifuges spin to enrich their uranium. And by the time that attack was discovered, we had destroyed something like 1,000 of Iran's centrifuges, and set their nuclear ambitions back a few years. But the problem was that that attack got out. It got out in 2010. It was discovered on networks all over Asia. It got into Chevron. And eventually, some security researchers and experts tied it back to this joint operation by the United States and Israel on Iran. But it opened up the world's eyes to the power of a zero-day exploit, not just for digital espionage and surveillance, but for destruction. Suddenly, we set a new bar, where it was okay to break into another country's nuclear facility and take out their centrifuges, so long as you did so with code. And so, since then, every other country on earth, with the exception of Antarctica and maybe a handful of other governments, has invested in zero-day exploits and hacking tools, and an entire market has cropped up to meet that demand for these click-and-shoot spy tools, and in some cases, the tools that could be used to take out another country's grid.
Taylor Owen: I was really struck by how you described the ways potential adversaries were using these tools, with Russia, North Korea, Iran, and China in particular being the four big ones, but they all seem to be using them for markedly different things, in different ways, for different purposes. So, just looking at those countries, what were they each trying to do with these tools, and how were they doing it differently?
Nicole Perlroth: Well, let's just start with Iran, because they were the target of Stuxnet, and I think one of my big takeaways over the last decade is that in cyber, the enemy is a very good teacher. We watched Iran, just two years after Stuxnet, wipe out the data at Saudi Aramco, the world's largest oil company, and replace it with an image of a burning American flag. They had just seen what the US had done to their centrifuges, and no, they could not pull off a Stuxnet-style attack, but they learned that they could use rudimentary wiping code to exact similar damage in a lot of cases, and that was a stunner for us. That was a real shocker for US government officials, who said, "Wait a minute, we thought ... We knew that eventually they might catch up, but we underestimated the damage they could do with really basic code." And so, for years we saw Iran kind of exact as much destruction as they could with these lower-level tools. It wasn't just the attack on Aramco. They wiped out data at Sands Casino because Sheldon Adelson insinuated that we should go ahead and bomb Tehran.
[CLIP]
Brian Todd: A cascading attack. Servers shut down. Screens go blank. A rush to unplug computers. This attack hit the world's largest casino operation, including the Venetian Hotel in Las Vegas, 10 months ago, and this also may have been the work of a rogue nation.
SOURCE: CNN YouTube Channel https://youtu.be/d_wCXDgEXX4
“Iran suspected in cyberattacks on U.S. casinos”
December 18, 2014
Nicole Perlroth: They were knocking online banking offline with denial of service attacks, which are really low-level attacks, but they were doing it in a really powerful way, and had clearly invested a lot in the tools they used in those attacks. And so, for a long period there, I was just covering these attacks, as one bank after another in the US just went offline, because they were being bombarded by traffic from Iran. So, their intent was really destruction. China was a different case. With China, for a long time, it was intellectual property, and they were able to do that with these really kind of misspelled spearphishing emails.
Taylor Owen: And just to build up their own domestic industrial capacity, right? I mean, really, that's what that was at the time.
Nicole Perlroth: Yes. I mean, they didn't want to be the world's manufacturing hub forever. They wanted to be innovators, and what easier way to catch up on innovation than to steal the intellectual property from some of the biggest innovators in the world?
Taylor Owen: You have that amazing line, that they have the design of the F-35, Google source code, the Coca-Cola formula, and Benjamin Moore paint formula. I mean, so, they have everything, right?
Nicole Perlroth: Yeah, I guess. The Benjamin Moore paint formula, I don't know why, I mention it all the time. It just stuck with me, because it just gets to, we'll take anything, we'll take anything. So, yeah, but when we did see China use zero-day exploits, it was very telling how they were using them. They were using them on the Uighurs. We have seen them use these tools, for the most part, for surveillance of those they see as their biggest threats, which are their own people. The five poisons, they call them. Tibetan and Taiwanese activists, the Uighurs, and I'm blanking on the other two. But Russia, we've seen them use these tools for destruction.
Taylor Owen: Right, and causing chaos, right?
Nicole Perlroth: And causing chaos, exactly. We saw them turn the lights off in Ukraine a couple of times. We've seen them break into US power plants. The Department of Homeland Security published this screenshot a few years ago, literally showing Russian hackers with their fingers on the switches at our power plant. They did also break into one of, or possibly two of our nuclear plants.
Taylor Owen: Yeah, I mean, that nuclear line is the most scary sentence in the book, I think.
Nicole Perlroth: Yeah. That was the scariest thing. I was driving through the mountains, headed for my 4th of July weekend vacation, and I just told my husband, basically, let me out here on the side of the road, while someone relayed to me that they'd gotten into the Wolf Creek Nuclear Plant. And who have we not covered? I mean, North Korea ...
[CLIP]
Brian Todd: A devastating hack, crippling one of the world's most powerful entertainment studios. Sony Pictures Entertainment tells CNN it's still investigating what it calls a very sophisticated cyber attack.
SOURCE: CNN YouTube Channel https://youtu.be/k6vtFLfPSMM
“North Korea behind Sony Pictures hack?”
December 5, 2014
Taylor Owen: I love that they did Sony, and then decided just to make money by hacking bitcoin.
Nicole Perlroth: Yeah, they're like, well, that wasn't worth it. We're just going to use this to hack cryptocurrency exchanges, so we can pay for some of our nuclear weapons.
Taylor Owen: And make some money.
Nicole Perlroth: Yeah.
Taylor Owen: So, what's amazing is, all these different countries are using these for clearly damaging use cases against the United States, which is kind of the core thesis of your book here: for whatever gain the US government might get from using these vulnerabilities, the blowback is unknowable, uncontrollable, and uncontainable. And I guess I'm wondering, did people actually make a decision that that was worth it, or did it just kind of happen? Like, one part of government thought this was a good idea, because it furthered their intelligence gathering, and nobody was really thinking about the whole picture, the potential for collateral damage.
Nicole Perlroth: I think we worried about it. I think it's in there, sort of Barack Obama's fears about what would happen with Stuxnet if it got out, the bar that they were setting. I just think that there were no good options at that time for slowing down Iran's nuclear plans. So, I think, yes, we worried about it, but we made this decision that it was worth it. And I don't even know if that decision was wrong for the time, because arguably, that code saved lives, and kept Israeli jets on the ground, and kept us out of World War III. But there has been a big trade-off, and that trade-off has been what we're witnessing right now. All of these countries are coming for us in different ways, and the governments we didn't even touch on, in sort of going through each country's motivation for acquiring these tools, are the sort of "others," as I group them: the UAE, Saudi Arabia, and even Mexico, which have turned these tools on their own people, on journalists, on human rights lawyers and activists, on Jamal Khashoggi. And it's so far out of our control. Their appetite for keeping close tabs on any form of dissent in those countries, so they can avoid another Arab Spring, is so strong that they started spending more energy hacking people like Ahmed Mansoor, who wanted to expand the right to vote, than they did tracking terrorists. And that is a really complicated case study, because these are our allies.
Taylor Owen: The other big player, or set of actors, that emerges in your book is the US tech companies, and the way they grew and transformed over that eight-year period. And I was really struck by that moment in the Aurora hack ...
[CLIP]
Tom Merritt: Google shocked the tech community this past week by not only announcing that they were considering pulling out of China, but the reason why was targeted attacks against them from within China. … Let's start with what happened. What did Google know, and when did they know it?
Elinor Mills: Google this week said that in mid-December, it noticed that there had been a network intrusion.
SOURCE: Kyle Dawson YouTube Channel https://youtu.be/8Y5Vbp6qQRI
“Operation Aurora (Google vs. China) Explained”
January 25, 2010
Taylor Owen: Google, it seems to me, almost emerges as a nation-state-like entity, in that they discover the problem, they don't go to the US government to solve it. They issue a declaration against the Chinese government. They kind of blame the American government for leaving the space open and vulnerable. And I wonder if that almost represents a bit of a turning point in how these companies started to see themselves as state-like, almost, in the international system.
Nicole Perlroth: Yes. And you know, it's interesting. I sent my original manuscript, I don't think I've told anyone this, to one of my former editors at the New York Times to read, and his feedback was, "You've made Google out to be too big of a saint." Like, you have to look at what they've been doing the last couple of years, and the fact that they're reentering China, and that kind of thing. But it was hard not to have empathy for them, because what happened when they were hacked by China was, this wasn't the first time a company had been hacked by China, but it was the first time a company came out and said it. So many companies had been hacked by China, but everyone had tried to sweep it under the rug, and not only did Google come and say, "We were just hacked by China, and this is what they did," they actually invested the resources necessary to keep them out. Just circling back to something we were talking about earlier, about coming at this as an outsider: in college, I don't know how I landed on this topic, but I ended up majoring basically in the Kurds, and US foreign policy toward the Kurds, and Turkey, and Iraq, and just the differences there. And so, I just come at this from a totally different perspective, and I think all of this has so many geopolitical implications, and implications for surveillance, and minorities, that people weren't aware of when they were designing this code. No one at Google was thinking, 'I have to make this secure, because this is going to keep China from torturing someone in Tibet, or a Uighur minority.' But suddenly, that attack woke them up to the fact that if they were really going to not be evil, they were going to have to start investing in these resources. And they did, and that movement drew in Microsoft and Apple and Facebook, eventually, which all started their own bug bounty programs, and started really locking up their systems from nation-states, including ...
Taylor Owen: Including the US.
Nicole Perlroth: In some cases, our own.
Taylor Owen: Right. And I mean, it's interesting, your editor, or that person you sent it to, said that, because that was partly my reaction, as well. I mean, I spend a lot of time working on platforms now, and how they should be regulated, and all the harms that we know exist inside that system, the platform system. But I kept wondering if that narrative that's emerged in the zeitgeist about platforms as these harmful actors missed this whole story that you tell. Actually, in some ways, they were the ones picking up the pieces for the state, which was just dropping the ball left, right, and centre here, and causing these real vulnerabilities that the public didn't see or know about.
Nicole Perlroth: Yeah. The privacy incursions are very real. The disinformation is very real. But the security teams, when you meet the people on these security teams at Facebook and Google, these are people who would never in a million years join Facebook as a user, because of the privacy issues, but they feel so strongly about the fact that they need to secure people's data from nation-states, including our own. They're super paranoid people, and they are just not who you would expect when you read the headlines these days about Facebook and Google.
Taylor Owen: And one of the ways I guess that tension is really flaring up now, again, is around encryption, which is obviously a very old debate. And, I mean, you have that amazing story of Tim Cook kind of confronting Obama and saying, "Look, we're going this route. We are going to end-to-end encrypt more and more things."
Nicole Perlroth: Yeah.
Taylor Owen: And you see Facebook now, saying all messaging, across all of Facebook's platforms, is going to be end-to-end encrypted. I wonder how you think about that debate, and how that's going to play out.
Nicole Perlroth: I think the argument that Tim Cook made at the time, the one that I think really landed with me and a lot of people in the intelligence community, is if we open up a backdoor for you, which other governments should we not open up a backdoor for? Because we are a global company now. They have more business outside the United States than they do inside the United States now. So, if they open up a door, first of all, how is the FBI planning on keeping that safe? And conveniently, this all happened at the same time that the Office of Personnel Management, which handles all the personal data and records for everyone applying for a security clearance, was hacked by China, and everything was stolen, including fingerprints. So, Tim Cook said, how are you planning on securing this magical backdoor key from China, from our adversaries, when you clearly can't lock up your own data? And two, this is just going to invite every other country on the planet to demand that we give them a backdoor, so where does this end? And I think those arguments alone are sort of case closed. But we did this great series of stories in the New York Times this year about the problem of tracking child traffickers and pornographers, and encryption has made it a lot harder for law enforcement to find those people, and those people have migrated a lot of their systems to Tor and encryption. So, I do empathize with the problem, and it's just really tricky. I think it's just one of those things. Like so much of this book, there's no silver bullet. These are really hard questions, and even when I find myself drifting towards one answer, something will pop up, and you realize, huh, I was way too certain about how I felt about that. You know, there are no easy answers here, which is partly why I wanted to write a really accessible book, because I actually think at this point, it would be really helpful to open up these conversations to people who think about things like bioterrorism. How have we handled these problems in other sciences and other industries? Because for too long, we've sort of just left these conversations to this really insular information security community, and to classified government corridors, and it's not working. This is not working. So, I think it's time to be very creative about how we handle these issues.
Taylor Owen: You make it so clear all the way through how exclusionary that discourse is, too, and when you have a conversation framed in the language of national intelligence, it creates a very limited set of considerations, and probably makes certain risks much easier to take. And maybe with the encryption debate, government should be learning from the broader implications of these strategic decisions: some might be good in the short term, and for very particular, targeted reasons, but those broader consequences are massive, particularly for people in less democratic societies.
Nicole Perlroth: Yes, and the visuals that will really stick with me from reporting out this book are Ahmed Mansoor still stuck in solitary confinement, the guy we call the Million-Dollar Dissident, in the United Arab Emirates, who has been hacked and spied on with every new spyware that hits the market. That will stick with me. The other one is this story that this former NSA hacker told me about being recruited outside the agency by a Beltway contractor, with promises to double, quadruple his salary, being sent to Abu Dhabi, told he was going to do one thing, but then he ended up being hired to do another, and of course, he wasn't allowed to tell anyone what he was actually doing. And at first, that thing was, you're going to hack foreign networks and terrorists on behalf of the United Arab Emirates, but very quickly, it became, hey, we heard reports that Qatar is hacking the Muslim Brotherhood, can you hack them? But Qatar is also a US ally. And suddenly, an NSA hacker is sitting there, hacking another US ally. And then, they invited Michelle Obama, who was the First Lady at the time, to Qatar, and suddenly he's capturing Michelle Obama's emails and security details in this dragnet. And so, that visual sticks with me, which is where is the oversight on this? Where are the rules? And I don't think it's acceptable that a former NSA hacker, whose training comes from taxpayer dollars, is sitting there reading the First Lady's emails. I think that most of us could agree that's not acceptable. I don't think it's acceptable that we're selling ... we, the US hackers, but also hackers all over the world, are selling these tools to governments that over and over again, we know will abuse them. So, where does this all end?
Taylor Owen: I mean, the title of your book is that it ...
Nicole Perlroth: Yeah, right.
Taylor Owen: It doesn't end well. This is How the World Ends. But I mean, I think I was originally most concerned about the nuclear war issue, or nuclear meltdown, or whatever might be done by a hack, but it seems to me, in the way you articulated it there, that the bigger problem might just be this death by a thousand cuts. That these tools just cause so much disruption to our institutions, and to our political systems, and to human rights norms. Maybe that's how the world ends, through a thousand cuts, rather than a reactor meltdown.
Nicole Perlroth: Yeah. I mean, it's not so straightforward, you know? With the title, I think people thought they were going to be reading this book about mushroom clouds, and yes, we are getting dangerously closer to that sort of cyber-induced kinetic boom moment. We keep having these close calls, and yes, we are getting closer to it, and we're fortunate that it hasn't happened over the last 10 years, and there are legitimate questions to be asked about why it hasn't happened. I do think this sort of talk of a cyber-Pearl Harbour is a distraction from where we already are, which is: our elections have been hacked, our psyches are hacked every day with these conspiracy theories and disinformation campaigns, our power plants, our nuclear plants, our hospitals. I mean, the ransomware attacks on our hospitals are just getting worse and worse, and I just heard a healthcare organization paid $10 million in ransom, and their insurance encouraged them to pay that price, because that's still cheaper than what it would cost to remediate and build back up from scratch. And so, where is ransomware going? Because we're just digitizing our lives. At what point is it going to be your self-driving car, or your insulin pump, or your pacemaker, you know? There's no reason to go there yet, because there's still so much money to be made with these corporate ransomware attacks. But we're just continuing to connect everything without thinking about that possibility, and that's a very real possibility. That's not just stoking fear for fear's sake. That's a possibility.
Taylor Owen: Yeah, and I mean, this is directionally where we're heading, it feels. I mean, look, I think that's why it's just so unbelievably important and valuable that you've written this book in the way you have, that we start having this debate, because I just don't think we have been. So, thank you for doing it.
Nicole Perlroth: Yes. Thank you.
Taylor Owen: Really, and thanks for talking to us about it. I really, really appreciate it.
Nicole Perlroth: Thank you. And just because I hate ending on an everything-is-terrible note: when I wrote this book, I was running to get it to press before the election, because we were all worried about the 2020 election here in the US, and what would happen, in terms of foreign interference. And lo and behold, my book goes to the printer, and then we discover the SolarWinds attack, and now we're discovering another Chinese attack on our software supply chain. And those are terrible. It's going to take a while to unwind these things. But the good news is, we've hit rock bottom on that part. We have no choice but to ask ourselves the hard questions about what is in our network, and who's securing it, and where is this code being built and maintained and tested, and are they investing enough in security? And should there be liabilities for SolarWinds for the password to their software update mechanism being SolarWinds123? And should we create incentives, like tax credits, for companies that do subject themselves to serious penetration testing? These are not things we have been talking about, and suddenly we really have to talk about them. We have no choice. "Build back better," as Biden calls it, is being applied to cyber-domain thinking as well. So, those are good. Those are good things. They're not enough, but we're headed in a better direction, because we know it has to be more than "should we have a data breach notification law?" It has to be more than that.
Taylor Owen: And it needs to be a public debate, too. I mean, as you make very clear, there are some real trade-offs involved with how we deal in this space, and that needs public engagement, and buy-in for those trade-offs.
Nicole Perlroth: Yes, yes.
Taylor Owen: But I hope this book helps to do that. So, thank you.
Nicole Perlroth: Thank you. Thank you.
Taylor Owen: Thank you so much. That was really fun.
Nicole Perlroth: Yeah. I felt like I was talking to my therapist. It was so helpful for me to get this out. So, I appreciate it.
Taylor Owen: That was my conversation with Nicole Perlroth.
Big Tech is presented by the Centre for International Governance Innovation and produced by Antica Productions. Please consider subscribing on Apple Podcasts, Spotify, or wherever you get your podcasts. We release new episodes on Thursdays every other week.