Platform Governance in a Time of Divide: Navigating the Paradox of Global Tech and Local Constraints

Digital platforms sit at the nexus of technology and the global economy. They have revolutionized not only how we trade in goods and services but also how we exchange information, connect and labour. Such changes provide enormous economic and social gains, yet they come at a price: greater complexity in governing the technical, legal and social challenges they raise.

At the heart of this complexity rests an important paradox. The technologies that power this revolution, such as the internet, blockchain and artificial intelligence (AI), were designed to be global and borderless; ethical, social and legal constraints (and regulations), however, tend to be mostly local.1 Facing this imbalance is difficult on its own. Yet, in a scenario of increasing competition between significant international players, such as China versus the West, as appears to be the case today,2 cooperation may be no more than a pipe dream.

From the standpoint of the Global South, the chances of participating in the governance of digital platforms are even smaller. As it stands, existing governance frameworks are rather ad hoc, incomplete and insufficient (Fay 2019), stemming only from the major camps (the United States or the European Union [the West] or China).3 Forging a cooperative framework that brings together actors from a diverse range of sectors, regions and groups seems both a necessary task and a very complicated one.4

Without such a governance structure, imbalances and confrontations among players create distortions, which will be analyzed in the “Major Trends” section. For the Global South, these trends carry important social and economic consequences that affect its participation in any global platform governance arrangement. Before going into that, we should explore where we stand in terms of achieving such a regulatory framework.

Where Are We At?

Internationally, over the last 20 years, the question of governance rested on the governance “of” the internet. Much of the international debate concerned the role different actors should play in deciding the protocols, policies and standards applicable to the internet, above all in its technical architecture. On one side, a more state-centric view was advocated. On the other, a more open governance model was proposed, built on the participation of multiple stakeholders such as civil society, the private sector, academia and the technical community.5 Much of the world has been divided along those lines, and players such as China and the United States sat on opposite ends of this spectrum. In a scenario of heightened competition between the two countries, finding common ground can only be expected to become more difficult.

The shift in focus in the last decade, from a debate about the internet itself to a debate on content moderation, data flows and competition, changed the discussion on governance. It went from governance “of” the internet to governance “on” the internet, adding new layers of political tension (de la Chapelle and Fehlinger 2020). This is particularly true of platform governance, which brings to the centre of the debate elements much closer to the heart of domestic policy. When platforms are called upon to intervene in content and behaviour within their digital spaces, significant questions of values, public interest and public policy have to be part of the discussion.

Hence, the paradox of global technologies and local and/or domestic constraints (legal or otherwise) dominates the scene. Certain actions, policies and regulations may in themselves have a global impact, as platforms have to adjust their architecture, policies, rules and mechanisms in response to them. One example is illustrative: a regulation that obliges an end-to-end encrypted service to provide a decryption key to a national law enforcement agency may either restrict the reach of certain platforms and services or create a global security vulnerability, as such a requirement can scarcely be limited to one specific country.

Proposals aimed at overcoming this complexity (global technology versus local regulation) appear to take one of three forms: specific substantive solutions (international treaties, international standards, or even international guidance and soft-law instruments), usually derived from existing international institutions; new institutional frameworks (whether for specific issues such as data protection [Tranberg 2021] and content moderation, or broader and all-encompassing6) that may coordinate and decide on policy issues; or transnational frameworks to develop interoperable standards.7 All such proposals have yet to come to fruition.

Without a global platform governance arrangement that addresses technical, legal and social challenges from both a local and a cross-border perspective, all the parties involved may be subject to severe consequences. Some overarching trends showcase these potential outcomes across the globe and the “turbulence” that is arriving.8 A dispute between China and the West, as we will see, only exacerbates them. As the dispute over platform governance also becomes a dispute over which values and interests may prevail, or even which platforms and services may gain an edge domestically and internationally over others, individuals, particularly in the Global South, are the ones most likely to be caught in the middle, left without a voice or without access to services.

Major Trends

Mapping out the consequences of an insufficient and incomplete global platform governance framework is a Sisyphean task. Yet the following consequences seem to be emerging: disputes over “digital sovereignty”; regulatory competition; jurisdictional overreach; and tension over the role and responsibility of platforms.

Digital Sovereignty: Disputing the Digital Space

At the highest level, the competition between China and the West translates into a dispute over the digital space. At its origin, it is a dispute over “the ability of nation states to control the digital infrastructure on their territory” (Timmers and Moerel 2021). Yet, more and more, this traditional understanding of digital sovereignty is evolving to encompass the functioning of platforms. In practice, it means controlling whether certain services can be offered or specific platforms may operate within the jurisdiction of a particular country.

Direct examples can be seen in former US president Donald Trump’s ban of Chinese apps (TikTok and WeChat), which he justified as a matter of “national security, foreign policy, and economy” (The White House 2020). Domestic regulation, in his view, was not enough to guarantee the rights of American citizens and the interests of the nation.9 China’s reaction, from both a legal and an economic standpoint, showcases the same trend. The country, for instance, published legislation allowing for retaliation should further actions along the same lines occur.10

Other countries, including in the Global South, have picked up on the trend. India, for instance, banned 59 Chinese apps in the name of safeguarding “sovereignty and integrity” and “national security.”11 Nigeria just recently lifted a “ban” on Twitter, stating that the move was conditional on the platform’s commitment to respect the country’s sovereignty and cultural values.12

Hence, without a framework to settle disputes over the basic rules and principles platforms should follow, a trend is emerging of challenging foreign platforms’ commitment (or capacity) to respect fundamental domestic tenets, leading to bans and suspensions of whole services and even platforms.

Regulatory Competition

A second trend is regulatory competition, in which countries seek to export their regulatory models.13 The so-called Brussels effect (Bradford 2020), referring to the significance of European regulatory initiatives, is probably the most widely known example.

The EU General Data Protection Regulation (GDPR) is an example. Several of the more than 100 jurisdictions that have a general data protection regulation cite this legislation as an influence. In Latin America, since the GDPR’s approval, six countries have enacted new data protection laws: Chile (2017), Brazil (2018), Panama (2019), Paraguay (2020), Ecuador (2021) and El Salvador (2021),14 and at least four have started reforms (Argentina, Costa Rica, Mexico and Uruguay). Surprisingly, while China is fostering its own models, it has also enacted data protection legislation inspired by the European model, the Chinese Personal Information Protection Law, which took effect on November 1, 2021.

In contrast, the Chinese regulation of algorithms (setting parameters for designing and implementing them) is an example of regulation defined only by Chinese standards.15 Another is the regulation of online games, where, for instance, there is a restriction on the time minors may spend playing.16 Even more controversially, there are documents indicating that regulatory initiatives will restrict games portraying same-sex relationships and “effeminate men.”17

While traditional American regulatory frameworks have lost some clout globally, they still maintain a clear edge: they remain the default standard for most major platforms’ terms of service.

Regulatory competition does not in itself have a negative impact. It may serve as a stepping stone toward legal harmonization or even a rationalization of the rule-making process, as actors may select the optimal rules for their businesses (Stark 2019). In practice, however, many instances in the global tech policy scenario have proven complicated. Control over international data flows illustrates these difficulties.

Several governmental initiatives end up regulating flows through forced data localization or by establishing criteria for data to flow outside their geographical jurisdictions. Data localization rules may have a narrow impact, when only certain data may not be shared, or a broader one, as the categories of data covered are expanded. In terms of control over international outbound flows of data, the most famous internationally is the GDPR, which consolidates the concept that data flows freely to “like-minded” countries but may be conditional for others. Worldwide, several jurisdictions have similar norms. In Latin America, countries such as Argentina, Brazil, Colombia, Panama and Uruguay have similar restrictions on the international outbound flow of data. Chile and Costa Rica have drafts that seem to go in the same direction; so does the draft bill being discussed in Bolivia.

This type of regulation is a double-edged sword. It may enhance national control, which in many cases may mean higher levels of data protection, but it also adds to the complexity of international access to data. It may be perceived, particularly by countries that are primarily “consumers” of platform services, as a “tool for power equalization,” but it may also be a barrier to commerce, innovation and access to markets (Svantesson 2019, 165).

Regulatory competition may lead to different standards, contradictory obligations and even silos or zones of influence. All of this is detrimental to the global aspirations of these technologies. Even digital rights may be affected, as individuals in different jurisdictions may have diverging experiences or even be deprived of access to certain services.

Jurisdictional Overreach

A parallel trend concerns the reach and scope of regulation and judicial orders. In the absence of an international governance framework, several nations, including in the Global South, are either adopting regulation with broad jurisdictional reach or interpreting existing rules through expansionary lenses. This trend is particularly visible in data protection (article 3 of the GDPR is an example).18

Additionally, judicial decisions are also seeking a broader unilateral reach. In the area of access to electronic evidence by law enforcement agencies, platforms are called upon to provide data even if it is stored or processed overseas.19 One example is a UK tribunal’s interpretation of the Regulation of Investigatory Powers Act, seen as allowing certain UK citizens’ rights to be set aside and the content of emails exchanged over international email services (Gmail) to be accessed, on the grounds that those should be considered international communications.20

This is particularly acute for countries that have “blocking legislation” that does not allow certain types of data to leave the country without a specific authorization, usually a court order. Cases such as the Microsoft warrant in the United States21 (discussing access to data stored by Microsoft in Ireland) and the WhatsApp case in Brazil22 (requesting data that circulated on the platform and backing the order with a suspension of the service) are illustrative. In other instances, platforms have been ordered to take actions with global implications, such as de-indexing particular content worldwide (under the heading of the “right to be forgotten” in the European Union, for example)23 or even blocking global access to content or accounts (as in an inquiry on disinformation promoted by the Brazilian Supreme Court).24

Without an international framework, this may lead to a governance conundrum for platforms: in many instances, complying with a request may put a platform in breach of a legal obligation in another country (Svantesson 2017).

These overlapping, and sometimes contradictory, obligations may lead to selective compliance by platforms. Smaller countries, and those with less bargaining power to enforce their laws, policies and orders without risking their citizens’ access to services, may suffer from these geopolitical imbalances.

Increasing Responsibility Placed on Platforms

One final trend that can be explored is that companies are expected to play new roles, becoming both norm setters (as their terms of service increase in importance) and norm enforcers (as “gatekeepers” of conduct and content online). Platforms become the “trusted mediators” between governments’ public interests and policies and what happens within the digital space. This means that without clear rules, platforms will “default to their own terms of service and business practices,”25 while also being relied upon by governments to implement and enforce their policies and rules.26

Nowhere is this more apparent than in content moderation. Unlike in China, where the state has historically been more involved in moderating the content available online, in the West platforms have traditionally been granted wide latitude to implement their own rules and practices (Chander 2014). The rationale of giving them incentives to innovate and find creative ways to deal with harmful and even illegal content in their spaces, however, is now being called into question.

Matters of hate speech, incitement to violence and disinformation demand action, often swift action. Platforms themselves are the most capable of handling the volume of hate speech and of acting with the speed needed to curb such behaviour, which may harm individuals and sometimes entire communities. The role played by social media as a conduit for the hate speech that fuelled the conflict in Myanmar, for instance, showcases the relevance of action and the impact inaction may have.27

In many cases, such as the sexual exploitation of children or pedophilia, there is consensus on the need for companies to act. However, in many other instances, there is major controversy surrounding not only how platforms should act but also the scope of their actions. Views vary widely worldwide, for instance, on the extent to which expression should be unrestricted and on which situations merit action.

This creates a scenario where international platform governance becomes all the more important. On the one hand, the more platforms play a significant role (either voluntarily or through domestic regulation), the more is demanded of them in terms of legitimacy and transparency in their decision-making processes. On the other hand, more potential normative conflicts appear, whether between terms of service and domestic regulation or between different domestic regulations.

The international aspect of this trend is highlighted by a series of private solutions proposed by some of the platforms. The Oversight Board established by Facebook is one example. It is an institution that resembles an appeals court, yet it is a private organization.28 The board aims to help manage existing international challenges, as it has a global scope of action (it may decide for the whole platform). Yet, in a scenario of tension between China and the West, the board has only a very narrow reach. In particular, it does not provide an arrangement for what is a multiplatform problem, and it may find its decisions challenged by national legislation and court orders,29 further narrowing its sphere of action.

These new roles for platforms create significant domestic and cross-border challenges demanding an international governance arrangement. For many in the Global South, this scenario represents a risk to their local values, views and diversity, on top of concerns over accountability and compliance with public policy. There is a perception that developing countries, and smaller countries in particular, are invited to partake in policy and regulatory debates only very late or not at all; paths seem to be chosen without their views or with very little regard for their interests.30

Social and Economic Consequences for the Global South

Today’s geopolitics, involving a dispute between China and the West, are probably all the more impactful and divisive because no global governance framework serves a coordinating purpose. In the absence of such an arrangement, the aforementioned major trends emerge: disputes over digital sovereignty, regulatory competition, jurisdictional overreach, and tension over the role and responsibility of platforms. These trends reinforce the significance of the paradox of global technologies and local and/or domestic constraints.

For the Global South, pressed between competing forces and without a venue to express its grievances and concerns, the consequences of such a complex context materialize in the predominance of commercial spaces over public interests; the imposition of social, cultural and legal standards; the lack of diversity and the exclusion of individuals and groups; and a “data trap.”

In the early days of the internet, parallel to commercial enterprise spaces (focused on the “.com”), there were other public interest spaces of significant relevance, such as “.org” for organizations, the country-specific domains consisting of “.” plus the country abbreviation, the education-focused “.edu” and others. This arrangement established, from the start, rules and policies that guaranteed a certain balance. As platforms have risen to become, for most users, the centre of the internet, the value of maintaining public spaces has been left without specific shoulders on which to rest. Smaller and less developed countries have become dependent on the interests of others, particularly platforms, to achieve their goals of creating and maintaining digital civic spaces.

As regulatory competition and jurisdictional overreach gain traction, fewer opportunities are available for smaller players to influence and take part in developing policies and rules for the space where their citizens will spend much of their time or, more importantly, where their elections will be debated and even, in some cases, won or lost. Added to the current geopolitical disputes and the lack of avenues to contribute, the choices for such smaller players may come down to extremes: accepting regulatory models from one side or the other, banning certain services and apps, or shutting down apps to compel compliance.

On the other side of the spectrum, platforms pressed between complying with one state’s regulation or judicial orders and risking violating the legal regime of another may adopt as policy the “strictest common denominator,” a “race to the top” in which the top potentially means the rules most limiting to freedom of expression or to other digital rights. In a pinch, if companies cannot harmonize their differing obligations, they may simply not adhere to local laws, particularly those of nations with less political or economic clout, potentially undermining the legitimacy of the domestic legal order.

Additionally, the operation of technologies such as AI may have a fundamental impact on the fabric of societies. Because the architecture of platforms and the design of algorithms and AI tools may directly or indirectly favour certain groups and characteristics,31 being left without a global framework through which to influence basic principles and convey cultural specificities means adhering to legal, cultural and social standards that may not reflect the makeup of one’s own society.

Finally, societies and communities in the Global South may be subject to a “data trap” in which they supply data for algorithms and AI tools developed elsewhere and then become “consumers” of such technologies.32 This creates a possible cycle of exploitation and dependency that will be hard for countries and communities to escape.33

Conclusion: Toward a Way Forward

Far from painting a completely bleak picture of global platform governance, the issues discussed in this essay should be understood as an opportunity to act, to explore new avenues where different stakeholders and actors from both the Global North and the Global South can cooperate. Yet a shift is needed away from an approach that focuses on either a state-centric or a more multi-stakeholder view, a dichotomy only deepened by the tensions between China and the West. The aim should be a transnational framework that allows for some integration of local and domestic values, interests, and cultural and social particularities but also maintains a degree of interoperability. The arrangement could be supported by a digital clearinghouse that would facilitate the exchange of technical, academic, social, cultural and regulatory information on, and experience with, issues pertaining to platform governance.

  1. In September 2021, Elizabeth Denham, the UK information commissioner at the Information Commissioner’s Office, put forward something similar in terms of data: “Data flows are international, but the checks and balances are domestic and that brings on a lot of problems” (Tranberg 2021).
  2. The scenario has been described as a “Tech Cold War”; even if the term may be disputed, it has the value of showcasing the gravity with which people are treating this dispute. For the term, see Wu, Hoenig and Dormido (2021).
  3. From a Latin American view, a 2020 study showed that 81 percent of those interviewed saw foreign regulatory initiatives as an influence on domestic proposals. See Souza (2020).
  4. Ibid. In the aforementioned study, 73 percent of those interviewed either disagreed or strongly disagreed that there was sufficient international coordination to address cross-border legal challenges on the internet.
  5. See Global Commission on Internet Governance (2017).
  6. See, for instance, Fay (2019).
  7. See, for instance, de la Chapelle and Fehlinger (2020).
  8. In 2018, CIGI founder Jim Balsillie said: “If we don’t [address unprecedented digital challenges] the trend will be much more turbulence before we come to the final realization that we should have done something some years before and have to pay the very big price of turbulence” (Orol 2018).
  9. A similar discussion happened when the gay dating app Grindr was acquired by a Chinese company, leading to a national security review by the Committee on Foreign Investment in the United States. The app was eventually sold “back” to another American company. See Whittaker (2020).
  10. China has issued legislation that allows it to retaliate in terms of “unjustified extension of jurisdiction.” See, for instance, Tang (2021).
  11. See Phartiyal (2021).
  12. See Nyambura (2021).
  13. This does not relate solely to tech regulation, but in this field it has become more relevant and apparent.
  14. The president vetoed the bill a month later.
  15. Certain parts of this initiative are even impacting the debate in the US Senate. See Edelman (2021) for comments from senators in the hearing.
  16. See Adam (2021).
  17. See Kain (2021).
  18. In terms of Latin America, for instance, the UN Economic Commission for Latin America and the Caribbean (ECLAC) and the Internet & Jurisdiction Policy Network’s Regional Status Report 2020 explores this trend in further detail (see Souza 2020).
  19. See Kent (2014) and Osula (2017).
  20. For an analysis of the regulation, see Anderson (2015).
  21. For an overview of the case, see Internet & Jurisdiction Observatory (2015, 50).
  22. For an overview of the case, see Goel and Sreeharsha (2015).
  23. For a broad discussion on the repercussions of the “right to be forgotten,” particularly for Latin America, see Banerji et al. (2017).
  24. See Reuters (2020).
  25. Retrieved from Owen (2019).
  26. In many instances, platforms are called upon to assess the legality of content (and sometimes conduct) that happens online. As one example, in the German Network Enforcement Act, platforms have a short period of time to decide on and, if necessary, remove “manifestly illegal content.” For comment on this piece of legislation, see Douek (2018).
  27. See UN Human Rights Council (2018).
  28. See https://oversightboard.com/. For its development, see Harris (2019).
  29. The board’s decision on the suspension of Trump’s account from the platform has not yet been challenged domestically. A similar decision by Twitter, however, is being challenged. See Lyons (2021).
  30. This is very clear in the report for the Latin American and Caribbean region published by ECLAC and the Internet & Jurisdiction Policy Network in 2020. See Souza (2020).
  31. This has been reported to be the case in terms of gender and race, an issue that is not limited to the Global South, yet one that tends to affect countries in the Global South disproportionately. To illustrate the matter, see the Massachusetts Institute of Technology study, Gender Shades: http://gendershades.org/.
  32. For an interesting view on the matter, see Couldry and Mejias (2018).
  33. This may happen under a commercial setting or even through trade legal arrangements. For the latter, see Scasserra and Elebi (2021).

Works Cited

Adam, Nick J. 2021. “New Limits Give Chinese Video Gamers Whiplash.” The New York Times, September 26. www.nytimes.com/2021/09/26/business/gamers-china.html.

Anderson, David. 2015. A Question of Trust: Report of the Investigatory Powers Review. www.brickcourt.co.uk/news-attachments/IPR_Report_Web_Accessible.pdf.

Banerji, Subhajit, Savni Dutt, Ella Hallwass, Yindee Limpives, Miguel Morachimo, Mirena Taskova, Shelli Gimelstein and Shane Seppinni. 2017. “The ‘Right to Be Forgotten’ and Blocking Orders under the American Convention: Emerging Issues in Intermediary Liability and Human Rights.” Intermediary Liability & Human Rights Policy Practicum. September. Stanford, CA: Stanford Law School Law and Policy Lab.

Bradford, Anu. 2020. The Brussels Effect: How the European Union Rules the World. New York, NY: Oxford University Press.

Chander, Anupam. 2014. “How Law Made Silicon Valley.” Emory Law Journal 63 (3): 639–94.

Couldry, Nick and Ulises A. Mejias. 2018. “Data Colonialism: Rethinking Big Data’s Relation to the Contemporary Subject.” Television & New Media 20 (4): 336–49.

De la Chapelle, Bertrand and Paul Fehlinger. 2020. “Jurisdiction on the Internet: From Legal Arms Race to Transnational Cooperation.” In Oxford Handbook of Online Intermediary Liability, edited by Giancarlo Frosio, 726–48. Oxford, UK: Oxford University Press. doi:10.1093/oxfordhb/9780198837138.013.38.

Douek, Evelyn. 2018. “U.N. Special Rapporteur’s Latest Report on Online Content Regulation Calls for ‘Human Rights by Default.’” Lawfare (blog), June 6. www.lawfareblog.com/un-special-rapporteurs-latest-report-online-content-regulation-calls-human-rights-default.

Edelman, Gilad. 2021. “The Senate is mad as hell at Facebook — again.” Wired, October 3. https://wired.me/business/big-tech/the-senate-is-mad-as-hell-at-facebook-again/.

Fay, Robert. 2019. “Digital Platforms Require a Global Governance Framework.” In Models for Platform Governance, 27–31. Waterloo, ON: CIGI. www.cigionline.org/articles/digital-platforms-require-global-governance-framework/.

Global Commission on Internet Governance. 2017. Who Runs the Internet? The Global Multi-stakeholder Model of Internet Governance. GCIG Research Volume Two. Waterloo, ON: CIGI. www.cigionline.org/static/documents/documents/GCIG%20Volume%202%20WEB.pdf.

Goel, Vindu and Vinod Sreeharsha. 2015. “Brazil Restores WhatsApp Service After Brief Blockade Over Wiretap Request.” The New York Times, December 17. www.nytimes.com/2015/12/18/world/americas/brazil-whatsapp-facebook.html.

Harris, Brent. 2019. “Establishing Structure and Governance for an Independent Oversight Board.” Meta, September 17. https://newsroom.fb.com/news/2019/09/oversight-board-structure/.

Internet & Jurisdiction Observatory. 2015. “Microsoft appeals US court order to hand over data stored in Ireland to US law enforcement.” 2015 In Retrospect: Internet & Jurisdiction Project Global Trends, Volume 4.

Kain, Erik. 2021. “China Cracks Down On Same-Sex Relationships In Video Games.” Forbes, October 4. www.forbes.com/sites/erikkain/2021/10/04/china-banning-same-sex-relationships-in-video-games-effeminate-men-moral-choices/?sh=7f5d4d77251a.

Kent, Gail. 2014. “Sharing Investigation Specific Data with Law Enforcement – An International Approach.” Stanford Public Law Working Paper. doi:10.2139/ssrn.2472413.

Lyons, Kim. 2021. “Trump sues to reinstate his Twitter account.” The Verge, October 2. www.theverge.com/2021/10/2/22705584/trump-sues-reinstate-twitter-account-jan-6-riot-protest.

Nyambura, Helen. 2021. “Nigeria Lifts Twitter Ban With Limits After Four-Month Sanction.” Bloomberg, October 1. www.bloomberg.com/news/articles/2021-10-01/nigerian-president-announces-conditional-lifting-of-twitter-ban.

Orol, Ronald. 2018. “The IMF Should Spark a Bretton Woods Moment for the Digital Age, Says Balsillie.” Opinion, Centre for International Governance Innovation, November 22. www.cigionline.org/articles/imf-should-spark-bretton-woods-moment-digital-age-says-balsillie.

Osula, Anna-Maria. 2017. “Remote search and seizure of extraterritorial data.” Ph.D. dissertation, University of Tartu.

Owen, Taylor. 2019. “Introduction: Why Platform Governance?” In Models for Platform Governance, 3–6. Waterloo, ON: CIGI. www.cigionline.org/articles/introduction-why-platform-governance/.

Phartiyal, Sankalp. 2021. “India retains ban on 59 Chinese apps, including TikTok.” Reuters, January 25. www.reuters.com/article/us-india-china-apps-idUSKBN29U2GJ.

Reuters. 2020. “Facebook puts global block on Brazil’s Bolsonaro supporters.” Reuters, July 31. www.reuters.com/article/us-facebook-brazil/facebook-puts-global-block-on-brazils-bolsonarosupporters-idUSKCN24X3BN.

Scasserra, Sofia and Carolina Martínez Elebi. 2021. Digital Colonialism: Analysis of Europe’s trade agenda. Amsterdam, the Netherlands: Transnational Institute.

Souza, Carlos Affonso. 2020. Internet & Jurisdiction and ECLAC Regional Status Report 2020. Santiago, Chile: United Nations. www.cepal.org/sites/default/files/publication/files/46421/S1901092_en.pdf.

Stark, Johanna. 2019. Law for Sale: A Philosophical Critique of Regulatory Competition. Oxford, UK: Oxford University Press.

Svantesson, Dan Jerker B. 2017. Solving the Internet Jurisdiction Puzzle. Oxford, UK: Oxford University Press.

———. 2019. Internet & Jurisdiction Global Status Report 2019. Paris, France: Internet & Jurisdiction Policy Network. www.internetjurisdiction.net/uploads/pdfs/GSR2019/Internet-Jurisdiction-Global-Status-Report-2019_web.pdf.

Tang, Frank. 2021. “China’s new rules on ‘unjustified’ foreign laws bolster ability to strike back at US long-arm jurisdiction.” South China Morning Post, January 13. www.scmp.com/economy/global-economy/article/3117578/chinas-new-rules-unjustified-foreign-laws-bolster-ability.

The White House. 2020. “Executive Order on Addressing the Threat Posed by TikTok.” August 6. https://trumpwhitehouse.archives.gov/presidential-actions/executive-order-addressing-threat-posed-tiktok/.

Timmers, Paul and Lokke Moerel. 2021. Reflections on Digital Sovereignty. EU Cyber Direct. January. https://eucd.s3.eu-central-1.amazonaws.com/eucd/assets/khGGovSY/rif_timmersmoerel-final-for-publication.pdf.

Tranberg, Pernille. 2021. “ICO: We Urgently Need A Bretton Woods For Data.” DataEthics, September 9. https://dataethics.eu/ico-we-urgently-need-a-data-bretton-wood/.

UN Human Rights Council. 2018. Report of the independent international fact-finding mission on Myanmar. A/HRC/39/64. September 12. www.ohchr.org/Documents/HRBodies/HRCouncil/FFM-Myanmar/A_HRC_39_64.pdf.

Whittaker, Zack. 2020. “Grindr sold by Chinese owner after US raised national security concerns.” TechCrunch, March 6. https://techcrunch.com/2020/03/06/grindr-sold-china-national-security/.

Wu, Debby, Henry Hoenig and Hannah Dormido. 2021. “Who’s Winning the Tech Cold War? A China vs. U.S. Scoreboard.” Alliance for Science & Technology Research in America, June 25. https://usinnovation.org/news/whos-winning-tech-cold-war-china-vs-us-scoreboard.

Originally published by the Project for Peaceful Competition.

The opinions expressed in this article/multimedia are those of the author(s) and do not necessarily reflect the views of CIGI or its Board of Directors.