The “Facebook Papers” — as the confidential files shared early this month by whistle-blower Frances Haugen have come to be known — have dominated the past few weeks of news about social media platforms. The revelations themselves were shockingly unshocking. “Has there been a topic covered as extensively as online safety (particularly on social media) for as long as it has been (+15 years) without anything seemingly changing?” journalist and strategist Ben Whitelaw pointedly asked on Twitter, before linking to BBC headlines from the mid-2000s wrestling with the same issues. Reacting to the revelations that Facebook severely under-invests in content moderation in India and operates with little contextual knowledge, Chinmayi Arun, a legal scholar and fellow at Yale’s Internet Society Project, noted that “Indian researchers have been saying this for a long time. In English.”
But something has changed more recently, providing pertinent information not only on what some users want to see from Facebook but also on the feasibility of taking some power over content moderation away from platforms. That change was the release, on October 21, of the first quarterly transparency report from the Facebook Oversight Board.
The board was created with funding from Facebook, and its first members were chosen by the company, but it has operated independently since its inception in 2020. Many board members are prominent legal scholars, such as Jamal Greene; former politicians, including Helle Thorning-Schmidt, a former prime minister of Denmark; or journalists, such as Alan Rusbridger, a former editor-in-chief of The Guardian. Users can appeal decisions about removed content to the board, but the board selects which cases it will deliberate on.
As it happens, the newly released transparency report also contains some shockingly unshocking news: Facebook is not being sufficiently transparent even with its own Oversight Board. The board noted that the company had not been fully forthcoming about its cross-check system for high-profile users. Facebook also did not answer all of the board’s questions, fully replying to only 130 of 156.
Much more useful were the statistics around appeals. From October 2020 to June 2021, more than half a million Instagram and Facebook users submitted appeals about deleted content to the board. But this body of 20 individuals has the capacity to examine only a small subset of those cases. The board’s processes ultimately resulted in Facebook taking action on just over 30 pieces of content, out of the more than 500,000 requests. The board exists, but its impact seems severely limited. There is clearly a mismatch between users’ appetite for appeals and the Oversight Board’s capacity.
If the Oversight Board cannot satisfy users’ appetite for appeals, the question becomes how to resolve that mismatch. I’ve suggested previously that some of these cases might be taken out of the realm of companies and moved into e-courts. These digital courts could provide judicial space for resolving disputes about content, such as takedowns for legal reasons. Rather than the Facebook Oversight Board deliberating on the cases of only a few dozen lucky users, e-courts could take on a much larger number of cases.
The e-court proposal emerged from discussions in the Transatlantic High Level Working Group on Content Moderation Online and Freedom of Expression, in which I took part. We met three times in 2019 and sought to hash out solutions that would be acceptable on both sides of the Atlantic. Unsurprisingly, we could all agree on the value of transparency. More interestingly, we generally agreed that e-courts could be a way to move some of these judgments out of the hands of companies and into a more public domain.
After our final meeting in November 2019 (in Bellagio, Italy, back in the days when we had meetings in person), I co-wrote a paper thinking through the practicalities of e-courts with Ronan Ó Fathaigh and Lisanne Bruggeman, both from the Institute for Information Law at the University of Amsterdam, and Chris Tenove, from the University of British Columbia. Our paper showcased the surprisingly large number of online tribunals and courts that already exist in Europe and North America. As far back as 2015, Richard Susskind, a professor and the information technology adviser to the Lord Chief Justice of England and Wales, suggested introducing online facilitators to resolve minor disputes in the United Kingdom. We wanted to show that online courts are possible, that they have existed in places such as the Netherlands and British Columbia for quite some time, and that they could be feasible venues in which to mediate disputes over the legality of online content.
But we could not fully answer one important question: How many cases would there be? No one could predict how many users might appeal or whether a system would be overwhelmed. The transparency report from the Facebook Oversight Board finally provides some sense of scale. Of the more than half a million complaints submitted between October 2020 and June 2021, 46 percent came from the United States and Canada. That would mean around 230,000 complaints in nine months. Twenty-two percent were lodged from Europe, amounting to about 110,000 cases.
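The arithmetic behind those estimates is straightforward. Treating “more than half a million” as roughly 500,000 appeals (my own rounding, since I am working from the report’s regional percentages rather than exact counts):

\[
0.46 \times 500{,}000 \approx 230{,}000 \qquad\text{and}\qquad 0.22 \times 500{,}000 \approx 110{,}000.
\]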
These are large numbers, although not every case would necessarily merit a hearing. All the same, they are surprisingly manageable compared with the quantity of content posted each day on Facebook in those regions. In fact, other platforms cope with a far greater volume. For example, eBay’s online dispute resolution mechanisms resolve more than 60 million disputes between buyers and sellers every year. If a dispute is sufficiently complex, eBay brings in a third-party professional reviewer to resolve it. A robust judicial system could quite conceivably establish mechanisms to cope with those numbers, particularly if platforms were to pay their fair share of taxes, which could help support e-courts.
An e-court is not necessarily an appropriate solution for every jurisdiction in which Facebook operates. Even in the case of the company’s own Oversight Board, it is clear that existing mechanisms for redress are not being used in Asia, Africa, Latin America and the Middle East in proportion to the number of users in those regions. The Oversight Board is undoubtedly correct in noting that its members “do not believe this [the distribution of appeals to the board] represents the actual distribution of Facebook content issues around the globe.” This suggests that e-courts and dispute resolution may not be the appropriate mechanisms for every region and country. Much will depend on individual judicial systems, the strength of the rule of law, and many other factors.
That said, even though the Facebook Oversight Board’s transparency report may not grab headlines, it is a timely reminder that many users both want social media companies to take their concerns seriously and are willing to take the time to appeal to outside bodies.
It’s also a reminder that there are ways to think about resolving disputes over content that do not just involve asking social media companies to delete more of it. E-courts are by no means a panacea for the online ecosystem. But the concept shows that democracies have options at their disposal that go beyond complaining about social media companies and their practices.