How Preconceptions Shape Perceptions: Addressing Attribution Biases in Cyber Threat Hunting
Do you ever feel like cyber threat hunting can be a bit of a guessing game? That's not just a feeling: it's human cognition shaping our security operations. Attribution biases are a well-documented feature of human psychology, and they can easily lead us astray when it comes to recognizing threats.
This article will explore what attribution bias is, how it can color our perceptions of danger, and how we can work to detect and mitigate it. We’ll touch on the traditional methods of threat hunting, where attribution bias is especially common, as well as some helpful tips for removing bias from the equation. So let’s dive into the why and how of attributing threats—and getting rid of preconceptions that may be causing us to miss key indicators.
Anchoring Bias: Relying Too Heavily on Initial Impressions
When it comes to making sense of the complex world of cyber threat intel, it is easy to be influenced by preconceived notions. Our first impressions become 'anchors': they gauge and inform the way we perceive and interpret all subsequent information. This phenomenon is known as anchoring bias, and unfortunately it can lead to misinformed assessments, which could have serious consequences for cyber threat hunting.
Anchoring bias occurs when we focus on one piece of information (usually the first piece) when making a decision or solving a problem. We tend to over-rely on that initial information, which prevents us from weighing alternative explanations. In cyber threat intel, this is dangerous if we let our initial 'anchor' harden into our overall judgment.
By recognizing and understanding this common cognitive bias and taking steps to reduce its influence in decision-making processes involving cyber threat intel, organizations can more accurately assess security threats—and ultimately protect their systems and data from potential breaches.
Availability Heuristic: Judging Frequency Based on Ease of Recall
Have you ever judged how likely something is by how easily examples of it come to mind? Welcome to the availability heuristic, a mental shortcut studied in cognitive psychology in which we make decisions and judgments based on ease of retrieval and recall.
In the context of cyber threat hunting, the availability heuristic leads people to judge the frequency and likelihood of certain threats by how easily they can call up similar events or information. For example, if you hear news reports that a particular type of malware is on the rise, you may assume it is a greater threat than other types simply because it is 'top of mind' from familiarity or recent media coverage. This is sometimes called the availability bias: making decisions or judgments based on ease of recall rather than actual frequency.
The truth is, without proper research and data analysis, it's easy for preconceived notions like this one to shape our perceptions. That’s why a key part of effective cyber threat hunting should involve taking steps to address these biases in order to get an accurate measurement of threat occurrence and severity.
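The counting itself is simple; the discipline is in doing it at all. Here is a minimal sketch of what "measure, don't recall" looks like, using a hypothetical hard-coded alert list standing in for whatever your SIEM or log pipeline would actually provide:

```python
from collections import Counter

# Hypothetical alert log: each entry is the malware family a detection
# matched. In practice this would be pulled from your SIEM, not hard-coded.
alerts = [
    "ransomware", "infostealer", "infostealer", "cryptominer",
    "infostealer", "ransomware", "cryptominer", "infostealer",
]

# Count actual occurrences instead of judging by what is "top of mind".
observed = Counter(alerts)

# Rank threats by measured frequency, not by media coverage.
for family, count in observed.most_common():
    print(f"{family}: {count}")
```

Even a tally this crude beats recall: the family dominating the headlines and the family dominating your own telemetry are often not the same one.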
Confirmation Bias: Favoring Information That Confirms Your Beliefs
One of the most common, and most dangerous, attribution biases in cyber threat hunting is confirmation bias. This cognitive bias occurs when you search for, interpret, favor, and recall information in ways that confirm your pre-existing beliefs, expectations, or hypotheses. In other words, you look only for things that fit what you already believe and discard evidence that may disconfirm it.
For example, let's say you're researching a particular type of threat or malicious code and you already have an idea about what it is. Confirmation bias can lead you to search for information that confirms this belief and ignore evidence that might challenge it. Not only is this dangerous because it could lead to inaccurate conclusions about the threat, but it also means that other types of threats might be overlooked as a result.
So how do you fight back against confirmation bias in cyber threat hunting? The best way is to make sure all available data is examined objectively. Question any existing assumptions about the data before drawing conclusions and consider new evidence regardless of whether or not it supports prior beliefs. It's also important to remember to take into account context when evaluating data; sometimes the same evidence can mean different things depending on the specific circumstances in which it was found.
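One way to force yourself to weigh evidence on both sides is to make the updating explicit. The sketch below uses a simple Bayesian update over a hypothetical evidence stream; the numbers are invented for illustration, and the point is only that disconfirming observations must move your belief too:

```python
def bayes_update(prior: float, likelihood_if_true: float, likelihood_if_false: float) -> float:
    """Posterior probability of a hypothesis after one piece of evidence."""
    numerator = likelihood_if_true * prior
    return numerator / (numerator + likelihood_if_false * (1 - prior))

# Start 80% sure the alert is the malware family you expected.
belief = 0.8

# Hypothetical evidence stream: (P(evidence | hypothesis true),
# P(evidence | hypothesis false)). The later items cut AGAINST the
# hypothesis; counting them is the whole point of the exercise.
evidence = [(0.9, 0.3), (0.1, 0.7), (0.2, 0.6)]

for p_true, p_false in evidence:
    belief = bayes_update(belief, p_true, p_false)

print(f"belief after all evidence: {belief:.2f}")
```

A confirmation-biased analyst effectively processes only the first tuple and stays near 0.92; processing all three drops the belief well below the starting prior.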
Illusory Correlation: Perceiving Relationships Where None Exist
When it comes to cyber threat intel, one concept you should be aware of is illusory correlation: a cognitive bias in which people perceive a relationship between two variables even when no such relationship exists. It arises when two separate variables happen to be paired together, leading to an overestimation of how often they actually co-occur.
A closely related error is the gambler's fallacy: after watching two coin flips land on heads, people perceive that heads is somehow more (or less) likely on the third flip, despite heads and tails having equal probability on every independent flip. Both errors come from the same habit of imposing patterns on unrelated events.
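You can check the coin-flip intuition directly by simulation. The sketch below generates many three-flip sequences, keeps only those that start with two heads, and measures the third flip, which stays at roughly 50% heads regardless of the streak:

```python
import random

random.seed(0)  # fixed seed so the simulation is reproducible

trials = 200_000
after_hh = []
for _ in range(trials):
    flips = [random.random() < 0.5 for _ in range(3)]  # True = heads
    if flips[0] and flips[1]:          # condition on two heads in a row
        after_hh.append(flips[2])      # record the third flip

rate = sum(after_hh) / len(after_hh)
print(f"P(heads | previous two heads) is roughly {rate:.3f}")
```

Conditioning on the streak changes nothing; the perceived pattern exists only in the observer.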
This same concept extends to cybersecurity threat intel. For instance, say a team sees a connection between the malicious techniques used and certain payloads or IP addresses; without verifying other possible explanations or gathering further data, they could assume that everything tied to that payload or IP address is malicious, when in reality the pairing may have been coincidence or a byproduct of other elements in the system.
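Before declaring that two indicators "go together", it is worth comparing their observed co-occurrence against what pure chance would predict from the base rates. A minimal sketch with hypothetical event counts:

```python
# Toy base-rate check for illusory correlation. All numbers are hypothetical.
total = 1000          # events observed in the window
from_ip = 100         # events involving the suspect IP address
malicious = 50        # events flagged malicious
both = 5              # events that are both

observed_rate = both / total
expected_if_independent = (from_ip / total) * (malicious / total)

print(f"observed co-occurrence:   {observed_rate:.3f}")
print(f"expected by chance alone: {expected_if_independent:.3f}")
# Here the two rates match: the pairing is exactly what independence
# predicts, so the apparent "correlation" is illusory.
```

Only when the observed rate clearly exceeds the independence baseline (ideally with a proper statistical test on larger samples) is the pairing worth a hypothesis.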
In essence, illusory correlations can lead to vast misconceptions about cyber threats; it’s important for threat intel teams to remain vigilant and stay focused on actual evidence-based data rather than assumptions when hunting threats.
Affect Heuristic: Letting Emotions Influence Your Judgments
When it comes to cyber threat hunting, we have to be extra careful not to let our emotions get the better of us. One way this can happen is through something called the affect heuristic, a type of mental shortcut that lets emotions influence our decisions.
The affect heuristic is essentially a fast, automatic judgment based on how you feel in the moment; in other words, your current emotional state can crowd out rational analysis. As a result, preconceptions and prejudices may produce inaccurate judgments in threat intel analysis.
Here's an example: say you see reports from two sources, one from a company with a good reputation and one from an unknown source. You are likely to treat the first as more reliable without actually checking its facts or evidence, a judgment driven by feeling rather than verification.
So, what can we do about this? Awareness is key! Knowing that our decisions and judgments may be swayed by emotion helps us lean on facts instead of feelings. We should also stay skeptical when a case resolves too easily or presents seemingly perfect evidence; that can be a sign that confirmation bias, not analysis, is doing the work.
Fallacies: Errors in Reasoning That Distort Attribution
When it comes to threat hunting, certain fallacies can arise and cause us to make faulty attributions. One such fallacy is the fundamental attribution error, a cognitive attribution bias: the tendency to attribute others' behavior to internal factors, such as character or disposition, rather than external factors, such as situational constraints or cultural norms.
Put simply, this type of reasoning can lead us to misread situations and assume a behavior reflects the actor's nature when it was actually shaped by circumstances. We must be open to examining all sides of an issue before making an attribution, and this is especially important in cyber threat hunting.
In practice, the fundamental attribution error shows up when we rely on gut feelings about who an adversary 'must be' rather than on evidence. Shared tooling, commodity malware, and deliberate false flags can all make unrelated actors look alike, so we do ourselves a disservice if we do not weigh these situational explanations before forming an opinion or conclusion.
The reasons why cyber analysts and threat hunters may experience attribution biases in their work can be traced back to preconceived notions that hinder analysis and performance. The phenomenon of confirmation bias, for instance, can lead analysts to develop tunnel vision, to overlook important evidence, and to miss opportunities for new insights. Furthermore, some adopters of threat intelligence may not re-evaluate data in the context of their unique environment, leading to false assumptions and incorrect conclusions.
Ultimately, threat hunters must become aware of these biases and prejudices, and be mindful of how they shape their perceptions and objectivity when analyzing and responding to cyber threats. Their ability to catch and address emerging threats depends on their capacity to remain open to all possibilities, no matter how unlikely, and to stay cognizant of their own preconceptions and the way those preconceptions shape the realities they see.
Hey, I'm Penelope Raquel!
I'm an IT security person, and I love helping people with cybersecurity and compliance.
If you enjoy my content:
Ring the Bell on my profile to get notified about my new posts!
Senior Cybersecurity Engineer | Global Speaker