On March 25, Jack Dorsey, Sundar Pichai and Mark Zuckerberg — the chief executive officers of Twitter, Google and Facebook, respectively — will appear at a US House hearing focused on the rise of misinformation and disinformation online. Legislators and governments all over the world are now looking at regulating such platforms, as these private companies are shaping our public information ecosystem, making decisions every day on how to moderate content, rank content, target content and build social groups, with very little democratic oversight.
Reforms to the legislative framework that defines the democratic oversight of digital platforms are sorely needed.
In November 2020, the Forum on Information and Democracy published a report on how to counter “infodemics.” The 128-page document synthesizes the work of dozens of researchers; the project was led by a steering committee co-chaired by Maria Ressa and Marietje Schaake. I served as the lead rapporteur for this report. Among its hundreds of recommendations, here is a sampling to illustrate key measures that the US administration — or any democratic government — should focus on in its effort to curb the spread of misinformation:
1. States Should Impose Significant Sanctions for Non-Compliance with Transparency Requirements
The report’s opening chapter begins by sketching the scope of transparency and directs its first cluster of recommendations to states. Foremost is that “states should impose a general principle of transparency by law” on platforms. Transparency obligations “should be imposed for every public or private sector entity in proportion to the power or influence it is able to exercise over people or ideas.” In addition, “online service providers must be predictable for those over whom they have influence, resistant to any manipulation and open to inspection” (a basic principle laid out in the International Declaration on Information and Democracy). The report then lists the key areas of platform activity that the legal transparency requirements should encompass:
- Content moderation policies: Platforms need to detail how they implement and adhere to their own policies.
- Algorithms: Platforms must be clear about algorithms’ operations and objectives.
- Content: Platforms should make public the pieces of content that reach the most users per day, per language, per country.
- Advertisements: Platforms should create a public library of the ads they publish, with information on who they are directed to and who pays for them (a schematic example of such a record follows this list).
- Users’ data: Platforms must report on what they do with the user data they collect, and must ensure that users can access information pertaining to them (both collected and inferred).
- Human rights impacts: Finally, platforms must assess the human rights impacts of their policies and products.
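To make the advertisement item above more concrete, here is a minimal sketch of what a single record in a public ad library might contain. The field names and example values are purely illustrative assumptions; they are not drawn from the report or from any existing platform's library.

```python
from dataclasses import dataclass, field

@dataclass
class AdLibraryEntry:
    """One record in a hypothetical public advertisement library (all fields illustrative)."""
    ad_id: str
    payer: str                       # who pays for the ad, as disclosed
    creative_text: str               # the ad content shown to users
    targeting_criteria: dict = field(default_factory=dict)    # who the ad is directed to
    impressions_by_country: dict = field(default_factory=dict)
    run_dates: tuple = ("", "")

# Example entry with made-up values
entry = AdLibraryEntry(
    ad_id="example-0001",
    payer="Example Advocacy Group",
    creative_text="Support proposition X",
    targeting_criteria={"age_range": "25-54", "region": "Example State"},
    impressions_by_country={"US": 120_000},
    run_dates=("2021-03-01", "2021-03-07"),
)
print(entry.payer, entry.targeting_criteria)
```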
This chapter also deals with the actual governance of transparency, the mechanics of which include sanctions. The authors say that “sanctions for non-compliance could be financial, in the form of a fine for digital platforms of up to three–four percent of the network’s global turnover” — six percent, in the European Digital Services Act proposal — and should be “proportionate to the severity of the non-compliance.” Further, “fines need to be … extremely significant, especially in case of recurring non-compliance, in order to be efficient. In addition, mandatory publicity for non-compliance could be imposed, in the form of a banner ahead of all advertisements visible by all users on the digital platforms.”
As well, chief executives should be liable for their company’s compliance with the transparency requirements. For example, “the CEO could be required to personally sign off on the transparency requirements reports, attesting that the information disclosed is accurate and complete.” In the event that a non-compliant platform does not pay the fine levied, administrative sanctions could be applied as a last resort; the authors suggest, for instance, that a platform could lose access to the country affected, observing that “such extreme sanctions may be needed for rogue or politically motivated platforms operating on an offshore basis from jurisdictions where international judicial cooperation is extremely slow, if not impossible” — with such powers only to be used following formal notice to the platform and a court decision.
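For a rough sense of the scale implied by the fine levels discussed above, here is a minimal worked example; the turnover figure is a made-up illustration, not a number from the report.

```python
def turnover_fine(global_turnover_usd: float, rate: float) -> float:
    """Fine expressed as a share of global turnover (illustrative calculation only)."""
    return global_turnover_usd * rate

turnover = 80e9  # hypothetical platform with $80 billion in annual global turnover

for label, rate in [("report, lower bound (3%)", 0.03),
                    ("report, upper bound (4%)", 0.04),
                    ("EU Digital Services Act proposal (6%)", 0.06)]:
    print(f"{label}: ${turnover_fine(turnover, rate) / 1e9:.1f} billion")
```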
2. States Should Compel Platforms to Improve the Quality of Content Review
The report’s second chapter covers the “meta-regulation” of content moderation — laying out a set of five baseline principles that protect democratic values and universal human rights. The second of these, the “necessity and proportionality principle,” focuses on distinguishing users, content and responses in content moderation. This part of the report stresses that human rights due diligence is of particular importance in at-risk countries and that infodemics tend to occur during moments of crisis. Citing the United Nations Guiding Principles on Business and Human Rights, the authors recommend that companies “conduct ongoing human rights due diligence in areas where they operate, so as to be able to address human rights risks as they evolve. Platforms cannot cite the problem of scale to avoid this responsibility. A human rights due diligence process should identify, prevent, mitigate and account for how platforms address their impact on human rights.” A “risk ratio” is one way platforms could assess whether a particular jurisdiction exceeds a certain risk level and, if so, apply more stringent content responses for a specified duration. This heightened response might involve, for example, deploying additional human content moderators to review content. Companies could be compelled to spend a minimum percentage of their income to improve the quality of content review, especially in at-risk countries or situations.
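To make the idea of a “risk ratio” trigger more concrete, here is a minimal sketch of how such a threshold check might work. The metric, the threshold value and the two response levels are assumptions for illustration, not specifications from the report.

```python
from dataclasses import dataclass

@dataclass
class JurisdictionSnapshot:
    """Hypothetical per-country snapshot used to compute a risk ratio."""
    country: str
    flagged_items: int   # items flagged as potentially violating in the period
    total_items: int     # all items posted in the period

    @property
    def risk_ratio(self) -> float:
        return self.flagged_items / max(self.total_items, 1)

def content_response(snapshot: JurisdictionSnapshot, threshold: float = 0.02) -> str:
    """Escalate to heightened review when the risk ratio exceeds the threshold."""
    if snapshot.risk_ratio > threshold:
        # e.g., deploy additional human moderators for a specified duration
        return "heightened review"
    return "standard review"

print(content_response(JurisdictionSnapshot("Example Country", flagged_items=3_000, total_items=100_000)))
# ratio 0.03 > 0.02, so this prints "heightened review"
```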
3. States Should Develop a New Regulatory Focus on Digital Architecture and Software Engineering Safety
Within chapter 3, about the connection between platform design and the reliability of information, the working group makes the case for establishing public standards for quality and safety in platform design. Every social platform is a construction of deliberate design, architecture and engineering choices; other technical constructions must maintain a common set of technical standards or adhere to “building codes,” but as yet there are no mechanisms for promoting “authenticity, reliability and findability of content” on platforms. Accordingly, the report recommends that states should “collaborate with technical experts to design digital building codes for social platforms and other digital commons,” because “in the same way that fire safety tests are conducted prior to a building being opened to the public, such a ‘digital building code’ would also result in a shift towards prevention of harm through testing prior to release to the public.”
4. States Should Impose Standards for Algorithm Training
As Christopher Wylie, the Cambridge Analytica whistleblower and a member of the steering committee for this project, told the rapporteurs, “The problem with some of these algorithms is that they become too good and too focused on particular users’ attributes. It will give you only what the algorithm knows you will like and engage with.” Algorithms are trained on a set of data, which includes both signals — useful information — and noise (random content that is not useful). One potential solution to the problem of too-focused algorithms is to mandate that a certain level of random content be included in the training set. Doing so would help defocus the algorithm enough to prevent filter bubbles from forming. The report’s authors go on to suggest that “random content introduced in the algorithm could be accessed potentially from content published by entities that provide public interest journalism identified using a standard based on internationally accepted best-practices and ethical norms for the production of reliable information.”
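Here is a minimal sketch of what such a mandate could look like in a training pipeline, assuming an engagement-driven pool and a separate pool of public-interest content. The 10 percent mix rate, the pool names and the placeholder items are illustrative assumptions, not figures from the report.

```python
import random

def build_training_set(engagement_pool: list, public_interest_pool: list,
                       size: int = 1_000, noise_fraction: float = 0.10) -> list:
    """Mix a mandated share of public-interest ("random") items into the training data
    so the recommender is not trained purely on engagement signals (illustrative)."""
    n_noise = int(size * noise_fraction)
    n_signal = size - n_noise
    sample = (random.sample(engagement_pool, min(n_signal, len(engagement_pool)))
              + random.sample(public_interest_pool, min(n_noise, len(public_interest_pool))))
    random.shuffle(sample)  # avoid ordering effects during training
    return sample

# Usage with placeholder items
engagement_items = [f"engagement_{i}" for i in range(5_000)]
public_interest_items = [f"public_interest_{i}" for i in range(500)]
training_set = build_training_set(engagement_items, public_interest_items)
print(len(training_set), sum(item.startswith("public_interest") for item in training_set))
```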
5. Democratic States Should Collaborate on a Global Governance Structure for Digital Platforms
The report highlights the need for a new global governance structure for digital technology “to ensure the effective and coordinated democratic oversight of platforms.” A group of like-minded governments should lead the change step by step, collaborating closely with civil society organizations. The Global Democracy Summit that President Joe Biden’s administration plans to hold during its first year and the Summit of the Initiative for Information and Democracy, which the French government will hold during the United Nations General Assembly in September 2021, could be key milestones on the democratic path.