Digital platforms are at the nexus between technology and the global economy. They have revolutionized not only how we trade in goods and services but also how we exchange information, connect and labour. Such changes provide enormous economic and social gains, yet they come at a price: greater complexity in terms of how to govern technical, legal and social challenges.
At the heart of this complexity rests an important paradox. The technologies that power this revolution, such as the internet, blockchain and artificial intelligence (AI), were developed to be global and borderless; ethical, social and legal constraints (and regulations), however, tend to be mostly local.1 Facing this imbalance is difficult enough on its own. Yet, in a scenario of increasing competition between significant international players, such as China versus the West, as appears to be the case today,2 cooperation may be little more than a pipe dream.
From the standpoint of the Global South, the chances of participating in the governance of digital platforms are even smaller. As it stands, existing governance frameworks are rather ad hoc, incomplete and insufficient (Fay 2019), stemming only from the major camps (the United States or the European Union [the West] or China).3 Forging a cooperative framework that brings together actors from a diverse range of sectors, regions and groups seems both a necessary task and a very complicated one.4
Without such a governance structure, imbalances and confrontations among players create distortions, which will be analyzed in the “Major Trends” section. For the Global South, these trends carry important social and economic consequences that affect its participation in any global platform governance arrangement. Before we go into that, we should explore where we are in terms of achieving such a regulatory framework.
Where Are We At?
Internationally, over the last 20 years, the question of governance rested on the governance “of” the internet. Much of the international debate concerned the role different actors should play in deciding the protocols, policies and standards applicable to the internet and, above all, to its technical architecture. On one side, a more state-centric view was advocated. On the other, a more open governance model was proposed, one based on the participation of multiple stakeholders such as civil society, the private sector, academia and the technical community.5 Much of the world has been divided along those lines, with players such as China and the United States on opposite ends of this spectrum. In a scenario of heightened competition between the two countries, finding common ground can be expected to become even more difficult.
The shift in focus in the last decade, from a debate about the internet itself to a debate on content moderation, data flows and competition, changed the discussion on governance. It went from the governance “of” the internet to governance “on” the internet, adding new layers of political tension (de la Chapelle and Fehlinger 2020). This is particularly true of platform governance, which brings to the centre of the debate elements much closer to the heart of domestic policy. When platforms are called upon to intervene in content and behaviour within their digital spaces, significant questions of values, public interest and public policy have to be part of the discussion.
Hence, the paradox of global technologies and local and/or domestic constraints (legal or otherwise) dominates the scene. Certain actions, policies and regulations may in themselves have a global impact, as platforms have to adjust their architecture, policies, rules and mechanisms in response to them. One example is illustrative: a regulation that obliges an end-to-end encrypted service to provide a decryption key to a national law enforcement agency may either restrict the reach of certain platforms and services or create a global security vulnerability, as such a requirement can scarcely be limited to one specific country.
Proposals aimed at overcoming this complexity (global technology versus local regulation) appear to take one of three forms: specific substantive solutions (international treaties, international standards, or even international guidance and soft law instruments) usually derived from existing international institutions; new institutional frameworks (whether for specific issues such as data protection [Tranberg 2021] and content moderation, or broader and all-encompassing6) that may coordinate and decide on policy issues; or transnational frameworks to develop interoperable standards.7 All such proposals have yet to come to fruition.
Without a global platform governance arrangement that addresses technical, legal and social challenges from both a local and a cross-border perspective, all the parties involved may face severe consequences. Some overarching trends showcase these potential outcomes across the globe and the “turbulence” that is arriving.8 A dispute between China and the West, as we will see, only exacerbates them. As the dispute over platform governance also becomes a dispute over which values and interests may prevail, or even over which platforms and services may gain an edge domestically and internationally over the others, individuals, particularly in the Global South, are the ones most likely to be caught in the middle, either without a voice or without access to services.
Major Trends
It is a Sisyphean task to map out the consequences of the absence of a sufficient and comprehensive global platform governance framework. Yet the following trends seem to be emerging: disputes over “digital sovereignty”; regulatory competition; jurisdictional overreach; and tension over the role and responsibility of platforms.
Digital Sovereignty: Disputing the Digital Space
At the highest level, the competition between China and the West translates into a dispute over the digital space. Originally, this is a dispute over “the ability of nation states to control the digital infrastructure on their territory” (Timmers and Moerel 2021). Yet, more and more, this traditional understanding of digital sovereignty is evolving to encompass the functioning of platforms. In practice, it means controlling whether certain services can be offered or specific platforms may operate within the jurisdiction of a particular country.
A direct example is the ban on Chinese apps (TikTok and WeChat) by former US president Donald Trump, who justified it as a matter of “national security, foreign policy, and economy” (The White House 2020). Domestic regulation, in his view, was not enough to guarantee the rights of American citizens and the interests of the nation.9 China’s reaction, from both a legal and an economic standpoint, showcases the same trend. The country, for instance, published legislation allowing for retaliation should further actions along the same lines occur.10
Other countries, including in the Global South, picked up on the trend. India, for instance, banned 49 Chinese apps in the name of safeguarding “sovereignty and integrity” and “national security.”11 Nigeria only recently lifted a “ban” on Twitter, stating that the decision was conditional on the company’s commitment to respect the country’s sovereignty and cultural values.12
Hence, without a framework to solve disputes regarding the basic rules and principles platforms should follow, a trend is emerging of questioning foreign platforms’ commitment (or capacity) to respect fundamental domestic tenets, leading to bans and suspensions of entire services and even platforms.
Regulatory Competition
A second trend is regulatory competition, in which countries seek to export their regulatory models.13 The so-called Brussels effect (Bradford 2020), referring to the significance of European regulatory initiatives, is probably the most widely known.
The EU General Data Protection Regulation (GDPR) is an example. Several of the more than 100 jurisdictions that have a general data protection regulation cite this legislation as an influence. In Latin America, since the GDPR’s approval, six countries have enacted new data protection laws: Chile (2017), Brazil (2018), Panama (2019), Paraguay (2020), Ecuador (2021) and El Salvador (2021),14 and at least four have started reforms (Argentina, Costa Rica, Mexico and Uruguay). Surprisingly, while China is fostering its own models, it also has data protection legislation inspired by the European model, the Personal Information Protection Law, which took effect on November 1, 2021.
In contrast, the Chinese regulation of algorithms (setting parameters for designing and implementing them) is an example of a regulation defined solely by Chinese standards.15 Another is the regulation of online games, which, for instance, restricts the time minors can spend playing.16 Even more controversially, documents indicate that regulatory initiatives will restrict games portraying same-sex relationships and “effeminate men.”17
While traditional American regulatory frameworks have lost some clout globally, they retain a clear edge as the default standard for most major platforms’ terms of service.
Regulatory competition does not in itself have a negative impact. It may serve as a stepping stone toward legal harmonization or even a rationalization of the rule-making process, as actors may select the optimal rules for their businesses (Stark 2019). In practice, however, many instances in the global tech policy arena have proven complicated. Control over international data flows illustrates these difficulties.
Several governmental initiatives end up regulating flows through forced data localization or by establishing criteria for data to flow outside their geographical jurisdictions. Data localization rules may have a narrow impact, when only certain data may not be shared, or a broader one, as the categories of data covered expand. In terms of control over international outbound flows of data, the best-known example internationally is the GDPR, which consolidates the principle that data flows freely to “like-minded” countries but may be conditional for others. Worldwide, several jurisdictions have similar norms. In Latin America, countries such as Argentina, Brazil, Colombia, Panama and Uruguay have similar restrictions on the international outbound flow of data. Chile and Costa Rica have drafts that seem to go in the same direction; so does the draft bill being discussed in Bolivia.
This type of regulation is a double-edged sword. It may enhance national control, which in many cases may mean higher levels of data protection, but it also adds to the complexity of international access to data. It may be perceived, particularly by countries that are primarily “consumers” of platform services, as a “tool for power equalization,” but it may also be a barrier to commerce, innovation and access to markets (Svantesson 2019, 165).
Regulatory competition may lead to different standards, contradictory obligations and even silos or zones of influence. All of this is detrimental to the global aspirations of these technologies. Even digital rights may be affected, as individuals in different jurisdictions may have diverging experiences or even be deprived of access to certain services.
Jurisdictional Overreach
A parallel trend concerns the reach and scope of regulation and judicial orders. In the absence of an international governance framework, several nations, including in the Global South, are either adopting regulations with broad jurisdictional reach or interpreting existing ones through an expansionary lens. This trend is particularly visible in data protection (article 3 of the GDPR is an example).18
Additionally, judicial decisions are also seeking a broader unilateral reach. In the area of access to electronic evidence by law enforcement agencies, platforms are called upon to provide data even if it is stored or processed overseas.19 One example is the UK tribunal interpretation of the Regulation of Investigatory Powers Act, which is seen as allowing authorities to disregard certain UK citizens’ rights and access the content of emails exchanged over international email services (Gmail) on the ground that these should be considered international communications.20
This is particularly true for countries that have “blocking legislation” that does not allow certain types of data to leave the country without specific authorization, usually through a court order. Cases such as the Microsoft warrant in the United States21 (concerning access to data stored by Microsoft in Ireland) and the WhatsApp case in Brazil22 (requesting data that circulated on the platform and backing the order with a suspension of the service) are illustrative. In other instances, platforms have been ordered to take actions with global implications, such as de-indexing particular content worldwide (under the heading of the “right to be forgotten” in the European Union, for example)23 or even excluding global access to content or accounts (as in an inquiry into disinformation conducted by the Brazilian Supreme Court).24
Without an international framework, this may lead to a governance conundrum for platforms: in many instances, complying with a request may put a platform in breach of a legal obligation in another country (Svantesson 2017).
These overlapping, and sometimes contradictory, obligations may lead to selective compliance by platforms. Smaller countries, and those with less bargaining power to enforce their laws, policies and orders without jeopardizing their citizens’ access to services, may suffer from geopolitical imbalances.
Increasing Responsibility Placed on Platforms
One final trend that can be explored is that companies are expected to play new roles, becoming both norm setters (as their terms of service increase in importance) and norm enforcers (as “gatekeepers” of conduct and content online). Platforms become the “trusted mediators” between governments’ public interests and policies and what happens within the digital space. This means that, without clear rules, platforms will “default to their own terms of service and business practices,”25 while also being relied upon by governments to implement and enforce their policies and rules.26
In no area is this more apparent than content moderation. Unlike in China, where the state has historically been more involved in moderating the content available online, in the West, platforms have traditionally been granted wide latitude to implement their own rules and practices (Chander 2014). The rationale of giving them incentives to innovate and find creative ways to deal with harmful and even illegal content in their spaces, however, is now being called into question.
Matters of hate speech, incitement to violence and disinformation demand action, often swift. Platforms themselves are the most capable of dealing with the volume of hate speech and of acting with the speed needed to effectively curb behaviour that may harm individuals and, at times, entire communities. The role played by social media as a conduit for spreading hate speech that fuelled the conflict in Myanmar, for instance, showcases the relevance of action and the impact inaction may have.27
In many cases, such as the sexual exploitation of children or pedophilia, there is consensus on the need for companies to act. In many other instances, however, there is major controversy surrounding not only how platforms should act but also the scope of their actions. There is major variation worldwide, for instance, in the extent to which expression should be unrestricted and in which situations merit action.
This creates a scenario where international platform governance becomes all the more important. On the one hand, the more platforms play a significant role (either voluntarily or through domestic regulation), the more is demanded of them in terms of legitimacy and transparency in their decision-making processes. On the other hand, more potential normative conflicts appear, between terms of service and domestic regulation, for example, and between different domestic regulations.
This latter, international aspect of the trend is highlighted by a series of private solutions proposed by some of the platforms. The Oversight Board established by Facebook is one example. It is an institution that resembles an appeals court, yet it is a private organization.28 The board aims to help manage existing international challenges, as it has a global scope of action (it may decide for the whole platform). Yet, in a scenario of tension between China and the West, the board has only a very narrow reach. In particular, it does not provide an arrangement for multiplatform problems, and it may find itself in situations where national legislation and court orders challenge its decisions,29 further limiting its sphere of action.
These new roles for platforms create significant domestic and cross-border challenges demanding an international governance arrangement. For many in the Global South, this scenario represents a risk to their local values, views and diversity, in addition to raising concerns over accountability and compliance with public policy. There is a perception that developing countries, and smaller countries in particular, are invited only very late, or not at all, to take part in policy and regulatory debates; paths seem to be chosen without their views or encompassing very little of their interests.30
Social and Economic Consequences for the Global South
Today’s geopolitics, involving a dispute between China and the West, is probably more impactful and divisive because no global governance framework serves a coordinating purpose. In the absence of such an arrangement, the aforementioned major trends emerge: disputes over digital sovereignty, regulatory competition, jurisdictional overreach, and tension over the role and responsibility of platforms. These trends reinforce the significance of the paradox of global technologies and local and/or domestic constraints.
For the Global South, pressed between competing forces and without a venue to express its grievances and concerns, the consequences of such a complex context materialize in the predominance of commercial spaces over public interests; the imposition of social, cultural and legal standards; the lack of diversity and the exclusion of individuals and groups; and a “data trap.”
In the early days of the internet, parallel to commercial enterprise spaces (focused on the “.com”), there were other public interest spaces of significant relevance, such as the “.org” for organizations, the country-specific domains consisting of “.” followed by the country abbreviation, the education-focused “.edu” and others. The arrangement established, from the start, rules and policies that guaranteed a certain balance. As platforms rose and became, for most, the centre of the internet, this value of maintaining public spaces was left without specific shoulders on which to rest. Smaller and less developed countries became dependent on the interests of others, particularly platforms, to achieve their goals of creating and maintaining digital civic spaces.
As regulatory competition and jurisdictional overreach gain traction, fewer opportunities are available for smaller players to influence and take part in developing policies and rules for the spaces where their citizens will spend much of their time or, more importantly, where their elections will be debated and even, in some cases, won or lost. Given the current geopolitical disputes and the lack of avenues to contribute, the choices left to such smaller players may come down to extremes: accepting regulatory models from one side or the other, banning certain services and apps, or shutting down apps to compel compliance.
On the other side of the spectrum, platforms pressed between complying with one state’s regulation or judicial orders and risking violation of another’s legal regime may adopt as a policy the “strictest common denominator,” a “race to the top” in which the top means what is potentially most limiting to freedom of expression or to access to other digital rights. In a pinch, if companies cannot harmonize their different obligations, they may simply not adhere to local laws, particularly those of nations with less political or economic clout, potentially undermining the legitimacy of the domestic legal order.
Additionally, the operation of technologies such as AI may have a fundamental impact on the fabric of societies. Since the architecture of platforms and the structure of algorithms or AI tools may directly or indirectly benefit certain groups and characteristics,31 being left without a global framework through which to influence basic principles and convey cultural specificities means adhering to standards (legal, cultural and social) that may not reflect the makeup of one’s own society.
Finally, societies and communities in the Global South may be subject to a “data trap,” in which they supply data for algorithms and AI tools developed elsewhere and then become “consumers” of such technologies.32 This creates a possible cycle of exploitation and dependency that will be hard for countries and communities to escape.33
Conclusion: Toward a Way Forward
Far from painting a completely bleak picture of global platform governance, the issues discussed in this essay should be understood as an opportunity to act and to explore new avenues where different stakeholders and actors from both the Global North and the Global South can cooperate. Yet a shift is needed away from an approach that focuses on either a state-centric or a multi-stakeholder view, a dichotomy that only deepens with the tensions between China and the West. The aim should be a transnational framework that allows for some integration of local and domestic values, interests, and cultural and social particularities while maintaining a degree of interoperability. The arrangement could be supported by a digital clearinghouse that would facilitate the exchange of technical, academic, social, cultural and regulatory information on, and experience with, issues pertaining to platform governance.