Iran’s Assault on Our Democracy and a Closer Look at Their Disinformation Tactics
Article by Dr. Paul de Souza

The Iranian disinformation campaign is a complex and calculated operation, meticulously designed to disrupt U.S. democratic processes, including our elections. Through the lens of the DISARM framework, I show how each phase is deliberately structured to maximize societal disruption by exploiting social divisions and eroding trust in the electoral system. This analysis reveals Iran's sophistication and its comprehensive understanding of strategic disinformation tactics. Based on the DISARM framework and its detailed tactics, techniques, and tasks, here is a concise yet actionable breakdown I put together of Iran's disinformation campaign targeting U.S. elections:


PLAN PHASE

Tactic: Plan Strategy (TA01)

Goal Setting (TK0001): Iran's primary objective is to sow discord and undermine the integrity of the U.S. electoral process. This involves setting a clear goal to erode trust in democratic systems by intensifying social divisions.

Campaign Design (Objective Design) (TK0003): The disinformation campaign likely incorporates multiple layers of operations, each tailored to influence different voter demographics through targeted messaging, polarizing issues, and divisive narratives.

Tactic: Plan Objectives (TA02)

Population Research / Audience Analysis (TK0002): Extensive research identifies vulnerabilities within the U.S. electorate, focusing on groups most susceptible to disinformation, particularly highly polarized political factions.

Identify Target Subgroups (TK0004): Iran targets specific voter subgroups, including conservatives and liberals, by tailoring disinformation content to resonate with these audiences, using contentious topics such as LGBTQ rights, immigration, and the Israel-Hamas conflict.


PREPARE PHASE

In the Develop Content (TA06) stage, Iran employed Develop Competing Narratives (T0004) by crafting narratives that oppose mainstream media, often using AI-generated content to enhance credibility and appeal. These narratives were strategically designed to foster confusion and distrust among the electorate. Additionally, they utilized Create Inauthentic Social Media Pages and Groups (T0007) by establishing inauthentic websites, such as Even Politics, Nio Thinker, and Savannah Time, to pose as legitimate news sources and spread false or misleading information.
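To make the detection side of this concrete, here is a minimal, illustrative sketch (not drawn from the article or from DISARM) of one cheap signal an analyst might use when triaging possible inauthentic news sites: how closely a candidate site name resembles an established outlet. The reference outlet list and the sample site names are assumptions for illustration only.

```python
# Illustrative sketch only: score how closely a candidate "news" site name
# resembles an established outlet. Outlet list and sample names are assumptions.
from difflib import SequenceMatcher

KNOWN_OUTLETS = ["politico", "newyorker", "savannahnow"]  # hypothetical reference list

def best_match(candidate: str) -> tuple[str, float]:
    """Return the closest known outlet name and a 0-1 similarity ratio."""
    scores = [
        (ref, SequenceMatcher(None, candidate.lower(), ref).ratio())
        for ref in KNOWN_OUTLETS
    ]
    return max(scores, key=lambda pair: pair[1])

if __name__ == "__main__":
    for site in ["evenpolitics", "niothinker", "savannahtime"]:
        outlet, score = best_match(site)
        print(f"{site:15s} closest to {outlet:12s} similarity={score:.2f}")
```

A signal like this is only a triage aid; analysts would still need registration data, content analysis, and network overlap to attribute a site to a campaign.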

During the Select Channels and Affordances (TA07) phase, Iran leveraged Facilitate State Propaganda (T0002) by utilizing state resources to create a network of controlled websites and social media platforms. This infrastructure was used to disseminate disinformation narratives, effectively pushing their agenda through tightly managed channels.


EXECUTE PHASE

In the Microtarget (TA05) stage, Iran used Leverage Existing Narratives (T0003) by amplifying existing stories related to the Gaza war and U.S. political figures to deepen societal divisions and further polarize the electorate. The content leaned on sarcasm and provocative language to stir emotions and provoke reactions.

Additionally, they employed Cultivate Ignorant Agents (T0010) by spreading disinformation through various channels, intent on cultivating individuals who unknowingly propagate false information. This approach significantly extended the reach and impact of their campaign, making the disinformation more pervasive and difficult to counter.


ASSESS PHASE

In the Assess Impact (TA08) stage, Iran likely employed Measure Effectiveness (TK0050) by monitoring the impact of its disinformation campaign. This would involve evaluating the extent of discord and division created within the U.S. electorate, tracking engagement levels on inauthentic websites, and observing the spread of disinformation across social media.

Following this, Iran may have used Adapt Strategies (TK0051) to refine their approach. Based on the assessment, they could adjust the content or shift focus to other divisive issues as the election date approached, aiming to maximize the campaign's overall impact.
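As a rough illustration of the kind of measurement TK0050 implies on the defender's side, the sketch below aggregates engagement on posts that link to domains already attributed to a campaign. The domain set and post records are hypothetical placeholders; a real pipeline would pull them from platform APIs or a dedicated collection system.

```python
# Illustrative sketch only: aggregate engagement on posts linking to domains
# already attributed to a campaign (the kind of measurement TK0050 describes).
# The domain set and post records below are hypothetical placeholders.
from collections import defaultdict
from urllib.parse import urlparse

CAMPAIGN_DOMAINS = {"evenpolitics.com", "niothinker.com", "savannahtime.com"}  # assumed

posts = [  # placeholder records a collection pipeline might produce
    {"url": "https://evenpolitics.com/article-1", "shares": 120, "comments": 43},
    {"url": "https://example.org/unrelated", "shares": 5, "comments": 1},
    {"url": "https://niothinker.com/op-ed", "shares": 310, "comments": 97},
]

def engagement_by_domain(records):
    """Sum shares, comments, and post counts per campaign-linked domain."""
    totals = defaultdict(lambda: {"shares": 0, "comments": 0, "posts": 0})
    for post in records:
        domain = urlparse(post["url"]).netloc.lower()
        if domain in CAMPAIGN_DOMAINS:
            totals[domain]["shares"] += post["shares"]
            totals[domain]["comments"] += post["comments"]
            totals[domain]["posts"] += 1
    return dict(totals)

if __name__ == "__main__":
    for domain, stats in engagement_by_domain(posts).items():
        print(domain, stats)
```

Tracking these totals over time is also how defenders can tell whether takedowns and counter-messaging are actually reducing a campaign's reach.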

The following countermeasures can be deployed against Iran's disinformation campaign. They are strategically designed to disrupt, mitigate, and neutralize the impact of these disinformation efforts:

COUNTERMEASURES

Strategic Planning (TA01)

  1. Inoculate: Positive Campaign to Promote Resilience (C00022) Metatechnique: Resilience (M001) – Deploy a positive information campaign to build resilience among the targeted population. This includes promoting trust in democratic institutions and educating the public on the importance of critical thinking to counter fear-based disinformation tactics employed by adversaries like Iran. Techniques – Implement the 5Ds (Dismiss, Distort, Distract, Dismantle, and Discredit) to actively counter disinformation narratives.
  2. Create Shared Fact-Checking Database (C00008) Metatechnique: Scoring (M006) – Establish a centralized fact-checking database accessible to media organizations, government agencies, and the public, allowing for the rapid identification and debunking of false narratives. The database should also provide tips and response strategies to counter disinformation in real time (a minimal sketch of such a database follows this list). Techniques – Utilize the 5Ds framework to score and categorize disinformation for rapid response.
  3. Enhanced Privacy Regulation for Social Media (C00010) Metatechnique: Friction (M004) – Strengthen privacy regulations on social media platforms to reduce the spread of disinformation. By limiting the ability of malicious actors to collect and misuse personal data, the propagation of targeted disinformation can be curbed. Techniques – Conduct a Centre of Gravity Analysis to identify the most vulnerable points in social media ecosystems and implement regulations that create friction for disinformation operators.
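
As referenced in item 2 above, the following is a minimal, hypothetical sketch of what a shared fact-checking record and lookup could look like. The record fields, the 5Ds tagging, and the exact-match lookup are illustrative assumptions, not a schema prescribed by DISARM or by countermeasure C00008.

```python
# Hypothetical sketch of a shared fact-checking store (C00008). The record
# fields, 5Ds tagging, and exact-match lookup are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class FactCheckRecord:
    claim: str                   # the claim being tracked, verbatim
    verdict: str                 # e.g. "false", "misleading", "unverified"
    five_d_tactic: str           # which of the 5Ds the narrative leans on
    sources: list[str] = field(default_factory=list)  # debunking references
    response_tip: str = ""       # suggested counter-messaging guidance

class FactCheckDatabase:
    """Keyed on normalized claim text; a real system would add fuzzy search."""

    def __init__(self) -> None:
        self._records: dict[str, FactCheckRecord] = {}

    def add(self, record: FactCheckRecord) -> None:
        self._records[record.claim.lower()] = record

    def lookup(self, claim: str) -> FactCheckRecord | None:
        return self._records.get(claim.lower())

if __name__ == "__main__":
    db = FactCheckDatabase()
    db.add(FactCheckRecord(
        claim="Mail-in ballots are routinely discarded",
        verdict="false",
        five_d_tactic="distort",
        sources=["https://example.gov/election-facts"],
        response_tip="Point readers to official audit procedures.",
    ))
    hit = db.lookup("mail-in ballots are routinely discarded")
    print(hit.verdict if hit else "no record")
```

The value of a shared store like this is less the code than the governance: media organizations, agencies, and platforms all reading and writing the same records is what enables real-time response.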

Objective Planning (TA02)

  1. Educate High-Profile Influencers on Best Practices (C00009) Metatechnique: Resilience (M001) – Engage and train key influencers on social media to recognize and counteract disinformation. Since influencers significantly shape public opinion, their ability to detect and discredit disinformation can substantially reduce its spread. Techniques – Brief influencers on how adversaries cultivate ignorant agents and use baiting techniques, so they can effectively identify and counter these tactics.

Operational Tactics

  1. Charge for Social Media (C00006) Metatechnique: Friction (M004) – Introduce a paid option for privacy on social media platforms. This measure would limit the reach of disinformation by making it more difficult for malicious actors to use free services to propagate false information widely. Techniques – Create obstacles for fake social media profiles and pages, reducing the effectiveness of mass disinformation campaigns.

Deploying these countermeasures can significantly diminish the effectiveness of Iran's disinformation campaign. These strategies disrupt current operations and strengthen resilience within the population, making it more difficult for future disinformation efforts to succeed. By combining education, regulation, and proactive information campaigns, we can better safeguard the integrity of the U.S. electoral system.

Iranian state-sponsored groups have utilized various malware tools in support of disinformation campaigns, often in conjunction with broader cyber espionage and influence operations. Here are some notable examples:


Mindmap by Dr. Paul de Souza

1. APT35 (Charming Kitten / Phosphorus)
Malware Used: Spear phishing and credential-harvesting tools
Details: APT35, also known as Charming Kitten, is an Iranian cyber-espionage group linked to several operations targeting U.S. political figures, journalists, and researchers. In 2019, Microsoft reported that the group it tracks as "Phosphorus" attempted to breach email accounts associated with a U.S. 2020 presidential campaign. The operation involved spear-phishing emails designed to steal credentials and access communications, potentially for use in crafting disinformation narratives. (A small defensive illustration of one common phishing indicator appears after the final entry in this list.)
Support to Disinformation: Stolen credentials and sensitive information from high-profile targets could be selectively leaked or manipulated to undermine public trust in democratic institutions.

2. APT33 (Elfin) - Shamoon
Malware Used: Shamoon
Details: APT33, also known as Elfin, is linked to the deployment of the Shamoon malware, a destructive wiper tool. Although primarily used in attacks against the energy sector in Saudi Arabia and elsewhere, the tactics and techniques associated with Shamoon have been adapted for disinformation campaigns. The malware's ability to cause widespread disruption and its use in politically motivated attacks demonstrate how Iranian actors can combine destructive cyber capabilities with influence operations.
Support to Disinformation: The destruction caused by Shamoon and similar malware can be leveraged to create narratives around the vulnerability of critical infrastructure, amplifying fears and uncertainty within target populations.

3. APT34 (OilRig) - POWBAT and BONDUPDATER
Malware Used: POWBAT, BONDUPDATER
Details: APT34, also known as OilRig, has been involved in cyber-espionage campaigns using custom malware such as POWBAT and BONDUPDATER. These tools have been used to gain persistent access to networks in the Middle East and the U.S. The group targets various sectors, including finance, energy, and government. Intelligence gathered from these operations can be repurposed for disinformation efforts.
Support to Disinformation: Information exfiltrated through these malware tools can be weaponized in disinformation campaigns, such as spreading false narratives about economic instability or manipulating public opinion on geopolitical events.

4. APT39 - Rana Intelligence Computing Company
Malware Used: Various custom tools for surveillance and data theft
Details: APT39, linked to the Iranian Ministry of Intelligence and Security (MOIS), has been involved in extensive cyber-surveillance operations. The group targets individuals and entities in the travel, telecommunications, and business sectors. The stolen data is often used to support Iran's geopolitical goals, including disinformation.
Support to Disinformation: Surveillance data obtained through APT39’s operations can be used to craft disinformation targeting individuals or groups, aiming to discredit opponents of the Iranian regime or influence public opinion abroad.

5. MuddyWater - POWERSTATS
Malware Used: POWERSTATS
Details: MuddyWater is an Iranian threat actor known for using the POWERSTATS backdoor in its campaigns. This group has been linked to various attacks on government and telecommunications sectors in the Middle East, Europe, and the U.S. The data collected through these operations can be used for strategic disinformation purposes.
Support to Disinformation: MuddyWater’s operations often involve collecting intelligence that can be repackaged as disinformation. For example, falsified documents or doctored communications could be disseminated to create confusion or distrust among targeted populations.
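
As noted in the APT35 entry above, one common indicator in credential-harvesting emails is a link whose visible text names a trusted domain while the underlying href points somewhere else. The sketch below is a defensive illustration only; the sample HTML and domains are made up, and a real mail-filtering pipeline would combine many more signals.

```python
# Defensive illustration only: flag links whose visible text names one domain
# while the href points to another, a common credential-harvesting indicator.
# The sample HTML and domains below are made up.
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkMismatchFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self._href = None      # href of the anchor currently being parsed
        self._text = []        # visible text collected inside that anchor
        self.mismatches = []   # (shown_text, actual_href) pairs

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href", "")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            shown = "".join(self._text).strip()
            href_domain = urlparse(self._href).netloc.lower()
            # Only compare when the visible text itself looks like a URL or domain.
            if "." in shown and " " not in shown:
                shown_domain = urlparse(shown if "//" in shown else "https://" + shown).netloc.lower()
                if shown_domain and href_domain and shown_domain != href_domain:
                    self.mismatches.append((shown, self._href))
            self._href = None

if __name__ == "__main__":
    sample = '<p>Reset here: <a href="https://login.examp1e-secure.net/reset">accounts.google.com</a></p>'
    finder = LinkMismatchFinder()
    finder.feed(sample)
    print(finder.mismatches)
```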


