The Expanding Scope of State-Sponsored Espionage and Influence
It has been over a year since Covid-19 was declared a pandemic by the World Health Organisation and we all learned what it means to be in lockdown. The pandemic has changed our personal lives beyond recognition and has exposed vulnerabilities in local and national economies. Moreover, our sudden reliance on digital means of working and communicating has created unique threat vectors and vulnerabilities. Where our online worlds intersect at work and home, we are suddenly exposed to threats and risks we can’t even see, much like the Covid-19 virus itself.
On Tuesday, February 9th, 2021, CSIS Director David Vigneault warned: “The fluid and rapidly evolving environment created by COVID-19 has created a situation ripe for exploitation by threat actors seeking to cause harm or advance their own interests. With many Canadians working from home, threat actors are presented with even more opportunities to conduct malicious online activities.”
Vigneault went on to assert that Canadians are being “aggressively” targeted by hostile foreign governments seeking political, economic and military advantage. In his speech he singled out China and Russia, which he said were “pursuing a strategy for geopolitical advantage on all fronts – economic, technological, political, and military – and using all elements of state power to carry out activities that are a direct threat to our national security and sovereignty.” He added that these activities have been “accelerated” by the pandemic. He wasn’t pulling his punches.
The challenge for many Canadians is navigating the online world safely and securely. Numerous social media sites are being created by threat actors with the intent of manipulation and disinformation. These threat actors often use online trolls and bots to achieve the desired effect. Bots are programmed to behave in ways that mimic legitimate users; their purpose is to amplify content toward specific goals on a large scale. Of course, bots can be used for positive and legitimate reasons, but many are designed for the malicious purposes of platform intimidation and manipulation. The power of these bots is fuelled by Artificial Intelligence (AI), which targets and promotes negative online influence activity. The goal is to keep the flow of disinformation and misinformation steady and inescapable, setting up echo chambers that reinforce existing prejudices and encourage extreme views to take hold. The result is social polarization, diminished trust in institutions and economic instability.
Governmental COVID responses, vaccine distribution, social justice and political issues are favourite topics for this type of manipulation. More importantly, the real anger and division occur when comments are distorted by bots to create a specific reaction. For instance, when controversial topics are met with a high volume of incendiary comments from bots, a well-manipulated post can do massive damage and widen a divide. In May 2020, researchers at Carnegie Mellon University estimated that nearly half of the Twitter accounts spreading messages about the coronavirus pandemic were likely bots.
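To make the idea of spotting bot-like accounts more concrete, the toy sketch below scores an account on a few crude signals: posting rate, account age and how much of its content duplicates other accounts. It is a minimal illustration only; the thresholds, weights and field names are made up for this example and do not reflect the Carnegie Mellon methodology or any platform’s actual detection systems.

```python
# Hypothetical, simplified sketch of a "bot-likeness" heuristic.
# All features and thresholds are illustrative assumptions, not a real detection system.

from dataclasses import dataclass

@dataclass
class Account:
    handle: str
    tweets_per_day: float    # average posting rate
    account_age_days: int    # days since the account was created
    duplicate_ratio: float   # share of posts that are near-identical to other accounts' posts (0-1)

def bot_likeness(acct: Account) -> float:
    """Return a rough 0-1 score; higher means more bot-like (illustrative only)."""
    score = 0.0
    if acct.tweets_per_day > 72:        # sustained, inhuman posting volume
        score += 0.4
    if acct.account_age_days < 30:      # very new accounts are more suspect
        score += 0.3
    score += 0.3 * acct.duplicate_ratio # copy-paste amplification of identical content
    return min(score, 1.0)

if __name__ == "__main__":
    sample = [
        Account("organic_user", 4, 2100, 0.02),
        Account("amplifier_001", 190, 12, 0.85),
    ]
    for a in sample:
        print(f"{a.handle}: bot-likeness {bot_likeness(a):.2f}")
```

Real research and platform systems combine many more behavioural and network signals than this, but even a crude score like the one above shows why high-volume, newly created, copy-paste accounts stand out in aggregate analyses.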
Further supporting this, on February 28th the Globe and Mail’s Steven Chase reported that DisinfoWatch.org, a new disinformation monitoring and debunking project at the Ottawa-based Macdonald-Laurier Institute think tank, had sifted through more than 68,000 tweets released the previous month, when Twitter disclosed the results of its efforts to identify networks of state-linked information operations. These included accounts that Twitter stated “can be reliably tied to Russian state actors” as well as accounts from co-ordinated campaigns “that show signs of being affiliated with the Internet Research Agency (IRA) and Russian government-linked actors.” The Kremlin-backed IRA is often informally referred to as the “Russian troll factory” used to develop and amplify disinformation.
Canada has a long history of being targeted by foreign interference. No matter the government of the day or the party in power, this will always be a problem. The difference now is that the target of such interference is the people of Canada and their opinions. The methodology has shifted to a more digital approach, but the intent is the same: spread disharmony and division. As Vigneault concluded his address, he stated: “Efforts by foreign states to target politicians, political parties, and electoral processes in order to covertly influence Canadian public policy, public opinion and ultimately undermine our democracy and democratic processes represent some of the most paramount concerns. Our electoral system has been shown to be resilient, but we must also work hard to keep it that way. Vigilance is the best defence.”
All organizations are vulnerable to foreign influence, both in terms of personnel and data. So what can you do to mitigate the threat? First, you must recognize the impact it can have, directly or indirectly, on your workforce. As much as we agree that vigilance is always your best defence, awareness built on an understanding of the threat is essential to an appropriate response. Understanding how a workforce is being influenced allows foreign influence to be identified, disrupted and slowed. At Globe Risk we can help you better understand both the existing and evolving threats, and identify the vulnerabilities they create for your organisation. Most importantly, we can help you plan how to mitigate them, quickly and effectively.
Alan W. Bell is the President of Globe Risk International Inc., providing world-renowned security expertise in leadership, crisis planning and crisis intervention.