Safeguarding Canadian democracy in the age of growing digital disinformation
Institute for Research on Public Policy (IRPP)
A host of measures are needed to educate the Canadian public and improve the country’s ability to respond to information threats.
The stakes over the next year are higher than ever when it comes to security and defence challenges in the Canadian information ecosystem.
With monumental elections on the horizon, numerous global risk reports have highlighted mis/disinformation among the top threats facing our interconnected societies.
These threats are multiplied by a shifting global arena and rapid, constant technological advancement: growing multipolarity, mounting security and defence threats, and the outbreak of armed conflicts across the globe that test Canada's middle-power paradigm within a strained rules-based world order.
Deliberate and coordinated attacks on politicians and the fallout from bilateral diplomatic conflicts further demonstrate that Canada is not immune to digital disinformation operations.
Moreover, a recent report by the Communications Security Establishment (CSE) offers a stark warning: Canada can expect unprecedented activity by foreign actors in our cyber and information space during the next federal election cycle, especially through the use of AI-generated content, including deepfake videos and other sophisticated tools of deception.
To meet this challenge, Canada must prioritize understanding this complex and evolving landscape and adopt a whole-of-government approach.
In broad terms, mitigation strategies must address immediate defence, national security and intelligence threats while simultaneously investing in societal resiliency strategies that reinforce democratic institutions and processes over the longue durée.
The disinformation threats we know
Digital disinformation is expanding along two dimensions: the range of actors involved and the nature of their activities. This constant state of flux and expansion means there are existing threats, threats on the horizon, and unknown threats.
Developing an understanding of each is key to anticipating and preparing for security challenges.
Familiar state actors like Russia, China and Iran have the motivation and the strategic advantage of experience to disrupt Western elections. They have grown increasingly savvy and elusive, working through proxies and intermediaries to make it difficult to trace tactics back to the original source.
Elements of these campaigns are often outsourced to content farms in other states, which produce digital disinformation content while offering the benefit of plausible deniability.
The nature of the disinformation ecosystem can transform actors with fewer resources and less power into formidable threats. Production and peddling of digital disinformation are incentivized by a low barrier to entry with a high rate of return for those seeking to incite disruption.
In other words, for these state actors, disinformation campaigns represent a win-win strategy with little to no cost, because they don’t need to sway the outcome of an election to successfully pollute liberal democratic information environments. They only need to sow doubt and diminish trust in the legitimacy and efficacy of elections.
Liberal democracies like Canada may therefore present an easy target for foreign disinformation campaigns.
That being said, while foreign interference during elections has received much of the attention in academic and policy research as well as in media coverage, other disinformation activities are also being pursued by threat actors.
Foreign influence campaigns and computational propaganda often shape public opinion and perceptions. Strategic distraction tries to prime individuals to pay attention to certain issues and ignore others in a bid to provoke decision paralysis.
What requires more attention, however, is the slow drip of polarizing and illiberal narratives that exacerbate ideological and partisan fault lines, chip away at our social fabric and foster a trust deficit between citizens and democracy.
A diversifying threat landscape
There are also indirect ways in which disinformation spreads within the Canadian information environment. We have long shared a unique, often embedded connection with the information space of the United States. As the 2022 convoy protests demonstrated, right-wing ideologies and narratives south of the border can influence perceptions and mobilization in Canada.
At the same time, Canada’s diasporic communities are also embedded in multiple information environments that include narratives and content circulating in home countries. Places in the Global South including Brazil, India, Nigeria, and the Philippines have seen digital disinformation used to influence domestic audiences, especially during elections.
The digital platforms on which disinformation spreads – including Facebook, X and TikTok but also direct messaging apps like Telegram, WeChat, and WhatsApp – reach global audiences at a scale and speed previously unimaginable. Ethnocultural communities in Canada are often doubly exposed to disinformation emanating from their home countries and from within Canada.
Encrypted messaging platforms provide a high level of privacy and security to individual users. But they present a different set of challenges for identifying disinformation threats. There is also the question of how information is perceived among groups of users who share some level of interpersonal communication, connection and trust.
AI on the new front line of disinformation
The emerging use of generative AI to produce disinformation content quickly, cheaply and in abundance has potentially calamitous implications, in formats ranging from simple text messages to deepfake videos.
In the Global South, AI will allow states to engage in microtargeting, translate disinformation content into different languages in multilingual contexts, and wage more coordinated propaganda campaigns.
For example, in Bangladesh, the dominant Awami League no longer needs to rely on autocratic rule alone to sway public opinion. It can now deploy easily accessible and inexpensive deepfake videos to discredit and delegitimize the opposition.
In India, political parties across the board have also deployed generative AI in state-level elections. India’s 2024 general elections may turn out to be the world’s largest democratic experiment with the use of generative AI as a campaigning strategy. And it is bound to have spillover effects in increasingly globally connected digital information environments.
Domestic challenges on the horizon
Canada needs to act quickly to develop the collective capacity to meet and prevent the information threats at its doorstep.
Democratic allies should also be mobilized around ways to leverage AI to detect disinformation and track its spread, and around how to use open-source intelligence and information to identify potential threats in the digital information landscape.
Building social capacity
These steps must be undertaken alongside a longer-term commitment to building social capacity and resiliency among individual Canadians.
This needs to start with education, beginning in primary school, that promotes critical thinking and teaches students to identify disinformation, akin to the work being done in countries like Finland.
The long game to counter and mitigate disinformation should focus on resiliency building by leveraging some of Canada’s innate strengths as a diverse and pluralistic liberal democratic society.
This whole-of-society approach stands to benefit by drawing on the rich network of civil society and grassroots organizations that function as trusted intermediaries between the government and Canadians, especially those belonging to marginalized and underrepresented communities.
Many of these community and civil society organizations are already doing the critical work of pre-bunking, debunking, counter-messaging and correction in community-based digital spaces.
In a diverse and multilingual setting like Canada, this also requires investment in third-language digital information resources and tools that promote accurate information and digital and media literacy.
Over the long term, targeted efforts rooted in preserving and building trust, while protecting individual rights and civil liberties, can foster the lively and productive debate, exchange of information, and space for dissent necessary to a healthy liberal democracy.