What Is Foreign Interference, Beyond the Headlines?

There is a new urgency to this question, resulting from startling findings by the National Security and Intelligence Committee of Parliamentarians.

June 12, 2024
Prime Minister Justin Trudeau takes part in public hearings for the independent commission probing alleged foreign interference in Canadian elections in Ottawa, April 10, 2024. (Blair Gable/REUTERS)

What is foreign interference to Canadians? What is it really, beyond the headlines?

The Public Inquiry into Foreign Interference (PIFI), launched last September, aims to find out. It has issued a call for public submissions, with a deadline set for July 31, 2024. This will be its long summer’s work, preparatory to hearings scheduled for the fall.

There is a new urgency to this task, resulting from startling findings by the National Security and Intelligence Committee of Parliamentarians. Its review of foreign interference, published on June 3, unearthed intelligence indicating that Canadian members of Parliament (MPs) and senators had been “witting” accomplices in foreign interference operations conducted by both China and India. The most egregious example involved an MP who, the Canadian Security Intelligence Service (CSIS) discovered, had attempted to arrange a meeting in a foreign state with a senior intelligence officer. The same unnamed MP was said to have provided confidential information to that officer. This is one face of foreign interference, and a deeply troubling one.

The public inquiry led by Justice Marie-Josée Hogue has until now aimed to explore a different dimension of the foreign interference threat: not the exploitation of willing political contacts (though this may come under the inquiry’s purview, as Parliament has requested), but the assault on Canadian diaspora communities and political actors. The inquiry will be documenting personal experiences of the impact of foreign interference.

The target audience for submissions includes political actors at all levels of government, people involved in political campaigns, and members of diaspora communities who have felt, or fear, the heavy hand of transnational repression. PIFI also wants to gauge how Canadians felt about the government’s response when they tried to report foreign interference or to seek help.

The assaultive dimension of foreign interference is an important one. Public engagement and feedback are essential to the inquiry’s task. But what might be missing in its outreach questions or in the answers coming back?

The inquiry’s focus on traditional forms of foreign political interference threatens to overlook a newer and potentially deadlier threat: digital disinformation campaigns, the contemporary, internet-facilitated version of age-old propaganda wars between states. The “experience” of disinformation, given its opacity, the definitional murk that surrounds it and the technological tools in play, will likely elude capture in a mail-in campaign.

The threat of disinformation is morphing and growing, even while its intent and reliance on crude social engineering remain unchanged. The broad intent, simply put, is subversion. The social engineering dynamic is equally simple — reinforce what people are already inclined to believe. This is what lurks beneath the surface of falsehoods conveyed for political influence.

Leading up to its first interim report, published on May 3, PIFI heard witness testimony and examined documents on disinformation campaigns’ potential impacts on the Canadian federal elections of 2019 and 2021. The big issue that emerged was possible People’s Republic of China (PRC) political interference campaigns during the 2021 snap election, targeting primarily the Conservative Party and its candidates, particularly in the Lower Mainland of British Columbia.

That issue became the subject of tense, semi-private discussions between the Conservative Party and government intelligence agencies after the 2021 election, with the Conservatives holding firm to a belief that their election results had suffered from a Chinese campaign to undermine the party’s candidates and platform.

To study this episode, PIFI turned to an innovation in the work of public inquiries: the production of “intelligence summaries” from the mass of government intelligence records. One such summary covers PRC disinformation, specifically influence campaigns targeting the Conservatives in 2021. Federal security bodies, including CSIS, the Communications Security Establishment and Global Affairs Canada’s Rapid Response Mechanism (a Canadian-built and -managed G7 unit that monitors disinformation), were on the case and could produce some forensics.

What they could not do was distinguish between misinformation (erroneous information) and disinformation (deliberate untruths propagated covertly and deceptively) or follow any clear attribution trail. The intelligence summary delivers an assessment that social and traditional media incidents involved misinformation and “may have involved disinformation within Canada’s Chinese language ecosystem, and notably over WeChat.”

The intelligence and security community reached the same conclusion in two cases: Chinese-language media coverage of Erin O’Toole and the Conservative Party platform, and the more specific media incidents involving Kenny Chiu, an incumbent BC Conservative Party candidate who had sponsored a private member’s bill to create a foreign agent registry and who lost his seat in the 2021 election. In neither case could any hidden hand of the Chinese state be detected.

Missing in this analysis was any discussion of the grey area of malinformation, commonly understood as genuine or partly genuine information deliberately amplified or stripped of context to cause harm. The even bigger missing piece concerned the technological underpinnings of false narratives.

The near-future worry is the potential for a tidal wave of malinformation and disinformation generated by artificial intelligence (AI) chatbots and AI-generated deepfakes. Both tools could have deep implications for elections and democratic processes. AI chatbots could be used to generate false and polarized political narratives on a remarkable scale — narratives that have the feel of genuine debate. Deepfakes are becoming ever more sophisticated, rapidly moving out of the realm of farce and low comedy.

A remarkable example was a recent deepfake video broadcast by a Russian TV channel, allegedly showing the head of Ukrainian intelligence claiming responsibility for the Crocus City Hall terrorist attack in Russia. Have a look at the TrueMedia.org website for this stunner, or for other detected deepfakes involving US politicians (hint: Donald Trump features).

As Oren Etzioni, a leading AI researcher and the creator of TrueMedia.org, has noted, technical tools and assessments to counter disinformation can be built and deployed, including by the platform giants. But in the end, people, as they confront artificially generated debates and ever more sophisticated deepfakes, “still need to decide if it is real.” Is that a reassuring message about human agency, or a deeply pessimistic one?

The answer can only be found in public education. This is where a public inquiry, such as PIFI, can play an important role, if it is willing and able.

The inquiry will have to go beyond generic statements such as this one in the May 3 report: “Foreign interference impacted the overall election ecosystem in 2019 and 2021.” Impacted how, and to what degree? Neither could be determined, thanks to the democratic practice of the secret ballot. Impacted through what means? That is the real story that must be uncovered.

Only with an appreciation of the technological monsters that lurk in our polluted, but still democratic, information ecosystem can human agency stand a chance to mark its own ballot.

The opinions expressed in this article/multimedia are those of the author(s) and do not necessarily reflect the views of CIGI or its Board of Directors.

About the Author

Wesley Wark is a CIGI senior fellow.