Jameel Jaffer: I don't see this as free speech versus other things. It's a question of coming up with the best vision of free speech, period, and building a public sphere that reflects that vision.
Taylor Owen: Hi, I'm Taylor Owen and this is Big Tech.
So last week, the Canadian government unveiled a proposal for a new online harms bill. If it passes, platforms will be responsible for taking down five categories of already illegal speech within 24 hours, and the public will be able to report content to a new regulator, who can order the platforms to take down or even repost content. The bill has, unsurprisingly, led to a backlash from conservatives, civil libertarians and some internet rights activists. Because while the intentions here are good, there are some serious problems with the legislation as it's currently drafted. Twenty-four-hour takedowns have, in other countries such as Germany, led to over-censoring by platforms. A provision mandating the sharing of content between platforms and CSIS poses serious privacy risks. And forcing telecoms to block the service of non-compliant platforms challenges long-held notions of net neutrality. This isn't to say that new regulations are not urgently needed. They are. The design of the internet has changed the nature of speech itself, which warrants new safeguards, and parts of this legislation are a step in the right direction. But the challenge is that rather than focusing on the root causes of the problem -- the data collection and the business model of platforms -- the government has focused on the symptoms instead. And by doing that, they've stepped into the perilous space of governing speech and reignited the free speech debate. But what's often left out of this debate is that there's always a tension in democratic societies between the right to free speech and the right to be protected from harmful speech. Every democracy balances these rights differently.
All of this makes the perspective of our guest this week even more critical. Jameel Jaffer is the executive director of the Knight First Amendment Institute at Columbia University and the former deputy legal director of the ACLU. Jameel is Canadian, but he's spent most of his career in the US, where he's been involved in some of the most important free speech litigation of the past two decades, including a successful challenge to the PATRIOT Act, a lawsuit against the NSA, and freedom of information requests to force government disclosure relating to secret torture and drone programs. As the Canadian government gets closer to passing some kind of online harms legislation, I think it's really important that we have this conversation, because we're at a point where the discourse around this issue seems almost irreparably polarized. You're either for free speech or for safe speech. But as Jameel and I talk about in this interview, that's a false dichotomy. We need thoughtful regulation that protects both. Here's Jameel Jaffer.
Taylor Owen: I wanted to start by talking a bit about the First Amendment, in part because you now run a centre that focuses on it, and it's something that has run through a lot of your work over the last 20 or 30 years. But I'm not sure it's particularly well understood. We keep seeing references to the First Amendment in our current debates about regulation and the internet, and that seems very ill-placed in many ways. So I want to start off really broadly: it seems, as a non-legal outsider, that the First Amendment covers a huge number of rights -- freedom of religion, freedom of speech, freedom of the press, freedom to assemble. I'm wondering, from your perspective, what binds all those together?
Jameel Jaffer: Well, let me just go back to something you said a minute ago, which was that maybe the First Amendment isn't well understood, and that may well be part of the problem. It's also just that the First Amendment is deeply contested, right? All those rights you just mentioned -- almost everybody agrees in principle that those are important things, but there's a lot of disagreement about what they actually mean in any particular context. And that's true of all of those rights, including press freedom and assembly and petition, but maybe especially speech. Every democratic society recognizes the freedom of speech in some way, but the American Supreme Court -- starting in the early 20th century, but especially in the 1960s, 1970s and 1980s -- really built a whole body of jurisprudence around free speech that gives much more protection to speech than most other democracies do. And that is in two different respects. One is that more things count as speech in the United States, and the other is that once something counts as speech, it's much more difficult to regulate it in the United States than it is elsewhere in the world. We have this very black-and-white understanding of rights -- when I say we, in this particular context, I'm talking about the United States, although I will also use a we to talk about Canadians. Once a right attaches, it becomes very difficult for the government to regulate it. And that's not equally true in every democratic society.
Taylor Owen: And it's evolved even in the United States, right? I mean, the Pentagon Papers and New York Times v. Sullivan, some of these really signature cases, changed what the First Amendment meant.
Jameel Jaffer: That's definitely true. Yeah, there was no real First Amendment case law until Oliver Wendell Holmes and Justice Brandeis -- in the first instance in dissent, and then later on in majority opinions -- built a contemporary free speech doctrine in the early 20th century, and then throughout the 20th century other jurists built on what they wrote. But it's really those cases you just mentioned, decided in the 1960s and 1970s, that define the concept for most Americans when they think about free speech today. Those cases were decided 50 years ago, and they're still the defining ones, the ones that defined the First Amendment as we understand it today.
Taylor Owen: Yeah. In many ways those are American-specific debates, but they've had much broader impact or repercussions internationally. I mean, in Canada we don't have a First Amendment, but we have a Charter protection for free expression. So do you see that American discourse that you're sort of central to as having a global reach?
Jameel Jaffer: Definitely a global reach, although it hasn't always been convincing to other countries or other courts. The Canadian Supreme Court has not adopted American free speech jurisprudence hook, line and sinker, nor have the high courts of other democracies. I do think that the US Supreme Court is very influential, but to say that they're influential doesn't mean that what they say is decisive elsewhere. And more recently, over the last 20 or 30 years, the US Supreme Court has gone even further. Relatively speaking, Pentagon Papers and New York Times v. Sullivan are cases that are broadly respected outside the United States, even if people ultimately think maybe they weren't correctly decided. But more recent Supreme Court cases in the United States have attached the label 'free speech' to a lot of activities that many Americans -- even before you get beyond American borders -- are uncomfortable with. So, campaign finance donations, or corporate commercial speech, or a case decided in the last decade involving data miners and their access to certain information related to prescription data. There too, the Supreme Court took a very broad view of what counts as free speech. And maybe somebody who is a First Amendment enthusiast or a free speech enthusiast would at least instinctively celebrate those kinds of decisions -- how can it be a bad thing? The First Amendment is winning in those cases, right? But the problem is, once the First Amendment gets attached to an activity -- once you say that giving money to a political candidate is activity protected by the First Amendment -- it again becomes very difficult for the government to regulate that activity. And that's the whole point of the First Amendment.
Taylor Owen: Yeah, once a right is given, it's hard to take it back.
Jameel Jaffer: Yeah. And sometimes the regulation that you might want is regulation that would protect free speech in some way, or protect other rights that are closely entangled with free speech, like privacy rights. I mean, we can talk more about the Clearview case -- there's a whole conversation there -- but the Clearview case is one in which Clearview AI is asserting the right to scrape publicly posted photographs from the web in order to feed its facial recognition app, which it then sells. And Clearview is arguing in US court that its activities are protected by the First Amendment -- that what it's doing is speech, no different from what reporters do every day: collecting information and analyzing it and then publishing it. Clearview says, what we're doing is speech, and therefore, essentially, the government can't regulate it. And if Clearview wins, then in a sense the First Amendment will win, but the result will be that the government won't be able to pass laws that protect privacy against those kinds of new technologies. And whether that's a win for free speech, even if it's a win for the First Amendment, I think is at the very least debatable.
Taylor Owen: Particularly if privacy protections actually do protect the speech of those whose voice is protected by those privacy provisions.
Jameel Jaffer: That's exactly right. Yeah. You have free speech interests on both sides. Even if you were very generous to Clearview and you say, well, yeah, you have a point, what you're doing does resemble what journalists do every day and...
Taylor Owen: Or academics, frankly. I mean, NYU has been using similar arguments, right, to say they should be able to scrape Facebook pages.
Jameel Jaffer: Yeah. I think it's important to draw distinctions: what is the purpose of the activity? What kinds of safeguards are in place to protect individual privacy? What kind of contribution does the activity make to public discourse? That's something that I think has to be central to the question of whether the First Amendment applies or not -- does it contribute to public discourse? And so I would draw a distinction between Clearview's facial recognition app and something like what digital journalists do. You mentioned these researchers at NYU, but there are also journalists at The Markup, the New York Times, all over the place, and they do it for completely different reasons. But all of that is just to say that if you keep attaching the First Amendment to all of this activity without thinking very much about what you're attaching it to, what you might be doing is undermining the government's ability to protect free speech or protect privacy -- the rights that we were setting out to protect in the first place. So it's a kind of bizarre situation where you sometimes have the First Amendment on one side and free speech on the other side of the 'V'.
Taylor Owen: This has evolved quite rapidly over the past decade. Much of the First Amendment work you were doing was about state power and abuses of state control of speech or information, whether it was over the PATRIOT Act or Guantanamo Bay or disclosures of the drone program -- all of those things were about abuse of state power. And we now seem to be in a place where much of the debate is around corporate power, whether it's platform companies or, in the case of Clearview AI, a very specific corporate actor. Has this changed the way we think about the First Amendment? It doesn't even seem purpose-built to talk about corporate power anyway.
Jameel Jaffer: Yeah, I think that's a real question: whether the First Amendment is up to the task of protecting free speech in the digital age, when so many of the threats to free speech are coming not, as you say, from state actors, but from private ones. And it didn't have to be this way, but the First Amendment has been interpreted by US courts to bind -- except in very rare circumstances -- only government actors. So there's a relatively robust set of restrictions on what kinds of activities the government can engage in when it comes to monitoring political protest or censoring political speech. Even in those areas, I would say there's a lot of work to be done, but at least, relatively speaking, there's a body of constitutional law that limits government action in those spheres. And there is no similar body of constitutional law that limits private actors, because even to propose that the constitution should bind private actors is, in the United States, in most circles, a sign that you're not a serious person. That's how far outside the political debate it is here.
Taylor Owen: Which shows how different the American debate is from the European or even Canadian one in many ways.
Jameel Jaffer: Yeah.
Taylor Owen: So let's talk a bit about how this debate is unfolding, then. One of the central challenges, it seems to me, is that the nature of speech itself and how it's distributed is changing, and some of that change is happening via technology. These platforms facilitate more speech than we've ever had -- they make speech easier, in a way. But they also shape that speech and its character. They decide what's allowed to be said and what isn't, and in many ways they've become public actors in that way. So do you agree with that? And do you think that should change how we think about free speech and who has an obligation to protect free speech? It's not just publishers, it's not just the government anymore. We have a new entity that controls speech in a meaningful way.
Jameel Jaffer: I agree with that completely. I'm not sure that people fully appreciate -- I'm sure your listeners do, Taylor, but more generally I'm not sure people fully appreciate -- how aggressively the companies shape the speech on their platforms. There is now, I think, an awareness that the companies play a very important role when they decide who can be on the platform and who can't. When Facebook decides Milo's out, or Alex Jones is out, that is a reflection of Facebook's power over the new public square. Facebook has that very important power, and I think there's now a general awareness that that is real power. But I think there's less awareness of how Facebook is shaping the speech on its platform in other ways: through its algorithmic decisions, its design decisions, engineering decisions that help determine what speech gets traction on the platform and what speech doesn't, which voices get amplified and which voices get marginalized. All of that is relatively invisible, or almost entirely invisible. And so I think people don't understand how significant that power is. There's been a lot of attention to content moderation, which, at least narrowly understood, is about what happens at the gates. I'm not suggesting that's not important. I've been mildly surprised by how well the oversight board has worked out with respect to content moderation. But content moderation, in my view, is just a small piece of the puzzle here, and the much larger piece of it has to do with decision-making within the gates.
Taylor Owen: So, you mentioned algorithmic amplification, or recommendation engines. Do you think those are acts of speech themselves?
Jameel Jaffer: Yeah, I mean, I think that is a really hard question. Some of the decisions that platforms make are plainly speech under contemporary First Amendment doctrine. So, for example, when Twitter decides we're going to put a label on this tweet from President Trump, we're going to explain to our users that this is misinformation -- that is not just speech, but political speech about not just a public official, but the most powerful public official in the country. And it is speech that is coming from human beings: some decision has been made within the company to express the company's collective values through that speech. So if we were talking about that kind of speech, the question would be easy to answer, at least for me. At the other end of the spectrum, you have these kinds of algorithmic decisions that take place in what are black boxes, not only to the public but even to people within the company, who can't really understand why one thing is being amplified and something else isn't. I mean, it's really...
Taylor Owen: That's the nature of machine learning -- it operates without human direction.
Jameel Jaffer: Yeah, that's what it's intended to do, right? And I think that's a harder question. The companies have been arguing for a long time, when it comes to liability issues, that they're very different from the newspapers: we are posting the content of third parties, we don't review all of it, we can't -- it's not consistent with the technology or the business model -- we're very different, and therefore we need immunity that the publishers don't have. But when it comes to regulation, whenever Congress or state legislatures want to regulate the companies, they say, you can't regulate us, we are doing the same things that newspapers are doing, we're media organizations, we are making editorial decisions. Our algorithms are editorial decisions themselves; our community standards are a reflection of editorial discretion. And for the same reasons you can't...
Taylor Owen: Tell a newspaper what to do.
Jameel Jaffer: ...yeah, you can't tell us what to do. And I think that's a really, really dangerous argument, because if you take it seriously, if you take it to its logical conclusion, it means that even a transparency mandate would be unconstitutional. Obviously, if Congress said to the New York Times and the Washington Post, "you have to tell us how many op-eds you rejected," and "you have to tell us who was involved in rejecting which op-eds," nobody would make the argument that that was a constitutional regulation. There's no way that would survive First Amendment scrutiny. But if you can't pass that kind of legislation with respect to the technology companies, then why are we all wasting our time with this conversation? Because that's the lowest-hanging fruit; that's the easiest question. If you can't do that, you can't do anything. So I think the right way to look at it is that the companies do many, many different things. Some of those things probably are entitled to the full protection of the First Amendment, in the same way that the New York Times's editorial decisions are entitled to that protection. But other things resemble less the kinds of editorial decisions that are made by newspapers.
Taylor Owen: I mean, like you say, one of the challenges here -- and I share this thought -- is that these platforms are many things. So of course they're not a publisher, but sometimes they're a publisher. They're now all sorts of things: they have their own currencies, they are increasingly in the healthcare industry. So we're going to have to regulate them in all different ways. But even on liability for speech, their rights aren't absolute. We've already limited them: there's no child pornography on platforms, there's no terrorist content on platforms. So where do you see that line extending? I think increasingly we're seeing it around hate speech, for example, which in Canada is illegal. So the government's saying, "well, hate speech is illegal, so why on earth should you be exempt from liability for having hate speech on your platform?" But where do you see that line, and where are you comfortable with that line extending to?
Jameel Jaffer: Yeah. I mean, I think one of the things that makes that hard is this: we can all agree that content of a particular character should not be on the platforms. It's illegal -- let's use child pornography as the example, because that's one that virtually everybody agrees on. But if you say to the platforms, "if content of that nature is on your platform, you will be held liable," then you have to think about not just the content for which they will actually be held liable after a trial, et cetera, but also the content that they will take down because of the fear of liability. And that content we might care about in a way that we don't care about child pornography.
Taylor Owen: A piece of art, for example, that is caught...
Jameel Jaffer: Lolita, or, you know, who knows what the algorithms identify. And child pornography turns out to be a relatively easy category to police, because once you have identified a particular image as an illegal one, you can relatively easily search for the same image wherever it occurs and algorithmically just remove the hundreds of copies of that image from the web. But once you get away from that category and you go to, say, speech glorifying terrorism -- which is a category that some platforms have identified as off limits -- it becomes much, much harder to determine what speech falls into that category, because context is so important. Even if you're talking about a speech from Osama bin Laden, that speech might be presented in the context of a news story, or it might be presented for a purpose other than glorifying terrorism -- it might be presented for exactly the opposite purpose: "listen to what this crazy nut job is saying," that kind of purpose. And I think that's one of the reasons why it's complicated: we can't agree on what speech should be taken down beyond the stuff that is illegal, like child pornography or, in the United States, obscenity. Once you get past the categories of things that are illegal and unprotected by the First Amendment, there isn't a consensus on where the line should be. But even if there were consensus, you'd still have to worry about the chilling effect of these kinds of laws, and the chilling effect differs depending on the nature of the law. You might have a law that says you're going to be liable if this kind of content is found on your platform -- the kind of law that Germany has right now for some categories of content -- and that kind of law can have a very significant chilling effect. You might, at the other extreme, have a law that says if, after a judicial proceeding, the content is found to be impermissible and you don't take it down, then you'll be held liable, and a law like that is less likely to have a chilling effect. On the other hand, how many judicial proceedings can you possibly have? These platforms operate at scale. In the time we've been having this conversation, hundreds of thousands of pieces of content have been published on each of these platforms. You can't possibly have an individual judicial proceeding for each piece of content that somebody alleges is impermissible. So it's hard, you know, these are...
Taylor Owen: It's hard, but in this sense the details of the policy really matter. Germany, for example, probably made their fine too big, so they created a greater chilling effect around hate speech, because the platforms didn't even want to get close to that fine. So they over-censored. But you could have changed that fine level, and that would have made a difference. And you could have, as is being debated in Canada now, some sort of fairly rapid appeal mechanism for takedowns that could mitigate some of the over-censoring, right? So there are some policies that do this. But one of my real challenges with this debate, particularly on content and free speech, is that the platforms are global and based in the US, so they're operating under a US conception as their primary notion of what is allowed and what isn't. But that's not how we govern speech internationally. We govern speech via national governments. So there's a real disconnect between the model that these companies use and need to operate at scale to function as businesses, and the way we have legally determined speech historically. And I'm not sure why the way they operate at scale as companies should change my government's ability to impose its laws, but they're in conflict with one another.
Jameel Jaffer: Isn't that a problem for your government, Taylor -- I'm going to call it your government. National governments have the power to require the companies -- and this is, in some contexts, a good thing, in other contexts a bad thing -- to conform to their ideas of expressive freedom. Now, we should remember that often that power is exercised in really problematic ways. Maybe right now India is the best example, where the Indian government is pressuring the platforms, Twitter in particular, to take down speech that is inconvenient to the ruling party. But India is not at all unique in that regard; many other countries have done the same. So on one hand, I do think it would be good if Canada, for example, effectively forced the companies to respect Canadian ideas about the limits of free speech -- and not just free speech, but also due process and privacy. That would be a good thing. But there are other contexts in which it turns out that the fact that the platforms have to some extent internalized US values has ended up protecting dissidents and political minorities in other places, where the freedoms of speech and assembly and association are not as well enshrined or well institutionalized. So I'm not sure that there's a one-size-fits-all -- not that you're suggesting there is -- but there's no single solution across the globe.
Taylor Owen: Yeah, and interestingly, you're right that the Canadian government has the power to impose these things, although section 230 was embedded as a clause in USMCA. So there's an open question, I think, about just how much freedom we actually have to regulate content. That will be an issue. But there's the question of whether democratic countries should change their notion of that tension between free speech and the right to be protected from speech -- should they change it to be more like America's in order to ensure that the rights of people in illiberal regimes are protected by the platforms? And that's a really hard question. Maybe we should, right? Maybe we should say, look, there are over a billion people in India who could in theory be harmed by our overzealous regulation, therefore we should nudge closer to American notions of free speech. But that's a tough thing to argue.
Jameel Jaffer: Yeah, but I'm not sure I actually believe that. I think somebody could make a powerful argument along the lines you just made. But ultimately, I don't think that Canadians should stand down because of the possibility that asserting Canadian interests in this context is going to mean that other governments will assert their interests too. What makes sense is for Canada to figure out what kind of regulatory framework is appropriate, given slightly different Canadian conceptions of all of these values. And you made a good point when you said that section 230 could tie Canadian hands to some extent. But in my view, the most important regulation is not regulation that would cut back 230, but regulation that would protect individual privacy online. Because I think that at the root of a lot of the problems we are focused on, including the problem of misinformation, for example, is this very high-powered, targeted advertising and targeted messaging, which is enabled by the surveillance that the companies engage in of not only their users but everybody else as well. And if you better protected individual privacy -- in other words, limit what the companies can collect, limit what they can do with what they collect, require them to be more transparent about what they're collecting and what they're doing with it -- all of those things would have positive downstream effects on speech, on the quality of public discourse. And you don't need to cut back 230. You can believe what you want about whether 230 is a good idea or a bad idea, or somewhere in between, but you don't need to do anything to 230 in order to protect privacy.
Taylor Owen: And so if you deal with some of those structural issues first, you may not have to deal with some of these edge-case speech issues via any sort of legislative or regulatory approach. Most of that problem might just be solved by...
Jameel Jaffer: Yeah. Or at least we'd have fewer problems to solve.
Taylor Owen: Yeah. That kind of gets to the last thing I want to ask you about briefly, which is this debate we've seen in Europe, in some ways in the US, and increasingly in Canada over the past few months, as we've been talking about online harms bills and new regulators for the internet. In my view it has become incredibly toxic and unhelpful. We've ended up in this situation where it's the authoritarian state versus advocates for free speech -- this very binary debate. And I've always respected yours as one of the more thoughtful and measured voices in this debate. I'm wondering how you hope we could structure this conversation, in Canada but in other countries too, so that we focus in on what matters most and don't get caught in this dichotomy. And I'm asking almost personally, because I'm just not sure how to be most constructive in this debate and not get caught in what I think is ultimately a false dichotomy between government regulation and free speech.
Jameel Jaffer: Yeah. Well, I agree with you that it's a false dichotomy. I'm not unsympathetic to the complaints of conservatives down in the United States who are worried about the power of the platforms. This has rarely happened to me before, but recently I read these opinions -- dissents usually, or concurrences -- from Justice Thomas or Justice Gorsuch, and I don't always agree with their bottom lines, but their hand-wringing about monopoly power in the speech environment really resonates with me. I think they're right. I am worried, though, that they've identified a real problem in monopoly power, but the way it's now being presented in Congress is that we need to solve the problem of censorship of conservative voices. And that's not the problem we need to solve. The problem we need to solve is a structural one, not abuse of power in that narrower sense. It is a structural problem whose solutions will probably require antitrust action and new forms of regulation. And I wouldn't see those actions as a counter to free speech -- quite the opposite. Those are the things we need in order to ensure that we have a speech environment in which people can exchange views across political divides and try to negotiate political differences. If you care about the health of the public square, then right now I think you have to be open to certain kinds of regulation. That doesn't mean we should be insensitive to the possibility that certain kinds of regulation could be used to censor political minorities. But to see that as reason not to entertain any regulatory possibilities whatsoever is, I think, a mistake at the very least.
Taylor Owen: Well, no, no. And the voices that most often get lost from this are those that are most harmed by speech itself. There are a lot of rights advocates and anti-hate groups who have very clearly and poignantly shown the harms of speech and how that has a censoring effect as well. And they're often not part of these debates.
Jameel Jaffer: Yeah. I mean, I don't really see it as weighing free speech against other interests, or limiting free speech because of the harms that free speech causes. I see it as thinking about what we want free speech to mean. What does it really mean? What values are we trying to protect when we say we care about free speech? And I think that most of us want more or less the same things: we want to protect everybody's right to participate in the conversation; we want to ensure that free speech works for our democracy, that we can come to political agreements of one kind or another that have a degree of legitimacy to them. We want the speech environment to be one in which people can seek the truth, actually rely on what they're seeing in public and not have to worry that it's some Ukrainian bot rather than Prime Minister Trudeau speaking to them. To a large extent we all want the same things, or reference the same values at least. And I don't see this as free speech versus other things. It's a question of coming up with the best vision of free speech, period, and building a public sphere that reflects that vision. In the course of that, we have to take into account the fact that there's a lot of abuse online, and that anonymity can be used for good ends but also bad ends. We've got to take all that into account. But ultimately the question isn't, should we limit free speech because of other things; the question is, what is the best vision of free speech for our society?
Taylor Owen: That was my conversation with Jameel Jaffer. Big Tech is presented by the Centre for International Governance Innovation and produced by Antica Productions. Please consider subscribing on Apple Podcasts, Spotify or wherever you get your podcasts. We release new episodes on Thursdays, every other week.