Intelligence Methodologies
Methodology, in intelligence, consists of the methods used to make judgments about threats, especially in the discipline of intelligence analysis.
The enormous amount of information collected by intelligence agencies often makes it impossible to analyze it all. According to McConnell, the US intelligence community collects over one billion pieces of information every day. (McConnell 2007) The nature and characteristics of the information gathered, as well as its credibility, also have an impact on intelligence analysis.
The capability parameter is essential to the current understanding of the threat. (Vandepeer 2011) Analysts use two approaches to capability assessment: direct measures and proxy measures. A measure allows a direct assessment of capability; proxy measures are indirect measures used to make inferences about capability.
For the assessment of a country’s military weapons and armed forces, there are five direct measures of military capability: leadership and C2 (command and control); order of battle; force readiness and mission; force sustainability; and technical sophistication. (Joint Publication 2-01 2012) There are also proxy measures (assessments of military-related subjects), including C4 systems (telecommunications and networks); the state’s defence industries; energy/power; geography; demography; and medical capability. State capabilities may only be truly known once they are actually used against an opponent. (Vandepeer 2011)
By its very nature, an intention is not “measurable” in the way capability is. It is estimated or inferred from observable factors, called indicators, which are used to deduce current or future intentions. Indicators provide a means of inferring rather than quantifying.
Three indicators figure prominently in assessing state intentions: the military capability of the state; the ideology of the state; and the words, actions and behaviors of state leaders. Military capability assessments alone are therefore not enough to infer a state’s intentions. A state’s ideology is reflected in its political leadership, which provides the third indicator of intentions.
Intelligence analysts are “essentially information translators, whose role is to review information and provide reliable intelligence in a practical and operational format.” (Cope 2004, 188) The U.K. National Intelligence Model describes four major products resulting from the analysis process: strategic assessments, tactical assessments, target profiles and problem profiles. (Association of Chief Police Officers, Bedford 2005) The evaluation of information involves assessing its credibility, together with the reliability of its sources. (Palmer 1991, 22) There are a few formal information rating systems used by analysts around the world. The most common of these is the Admiralty System (also referred to as the NATO System), which is used to express the net value of a piece of information based on the reliability of the source and the validity of the data. (Besombes, Nimier, and Cholvy 2009) The traditional model is a 6 x 6 matrix. Agencies operating within the National Intelligence Model in the UK use an alternative classification system commonly called the 5x5x5 system. (Joseph and Corkill 2011)
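To make the combined rating concrete, the sketch below encodes an Admiralty-style 6 x 6 matrix as two lookup tables: a source-reliability letter (A–F) paired with an information-credibility digit (1–6). The scale labels are the commonly cited ones; the helper function and its name are illustrative and not part of any cited standard.

```python
# Illustrative sketch of an Admiralty-style (NATO) rating: a source-reliability
# letter (A-F) combined with an information-credibility digit (1-6).
# The labels follow the commonly cited scale; the function name is hypothetical.

SOURCE_RELIABILITY = {
    "A": "Completely reliable",
    "B": "Usually reliable",
    "C": "Fairly reliable",
    "D": "Not usually reliable",
    "E": "Unreliable",
    "F": "Reliability cannot be judged",
}

INFORMATION_CREDIBILITY = {
    1: "Confirmed by other sources",
    2: "Probably true",
    3: "Possibly true",
    4: "Doubtful",
    5: "Improbable",
    6: "Truth cannot be judged",
}

def admiralty_rating(source: str, credibility: int) -> str:
    """Return a combined rating such as 'B2' with its plain-language meaning."""
    if source not in SOURCE_RELIABILITY or credibility not in INFORMATION_CREDIBILITY:
        raise ValueError("source must be A-F and credibility 1-6")
    return (f"{source}{credibility}: {SOURCE_RELIABILITY[source]} source, "
            f"{INFORMATION_CREDIBILITY[credibility].lower()}")

print(admiralty_rating("B", 2))  # -> "B2: Usually reliable source, probably true"
```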
Robert Flood’s prism theory, termed methodological pluralism by others, uses the metaphor of a prism that decomposes light into its component colors through double refraction to describe creative thinking and transformation. This type of thinking produces multiple different visions of the same thing and a common vision of many different things. Its purpose is to challenge hypotheses, provoke new ideas and generate unexpected perspectives. (Flood 1999) (Duvenage 2010, 81)
The concept of prismatic thinking has gained ground in the analysis of information. Jones states that besides convergent thinking, we also need divergent thinking to ensure effective analysis and problem solving. (M. D. Jones 2009) Divergence helps analysts approach an issue more creatively, while convergence helps bring the analysis to completion. (Duvenage 2010, 82)
Wolfberg proposes a full-spectrum mindset, in which the analyst applies both intuitive and structured methods, depending on the specific context, assuming at the outset that there are multiple interrelated problems that need to be solved simultaneously. (Wolfberg 2006) (Duvenage 2010, 83)
Waltz conceived the integrated reasoning process, (Waltz 2003) which integrates formal and informal methods of reasoning for analysis-synthesis in the operational environment of intelligence activity. The process starts from a set of evidence and a question about it, seeking an explanation of that evidence. Moving from the set of evidence to detection, explanation or discovery, the process detects the presence of evidence, explains the processes underlying the evidence, and discovers new patterns in the evidence. The model illustrates four basic ways of using the set of evidence: three fundamental ways of reasoning and a fourth way of feedback: deduction (testing against previously known models/hypotheses); retroduction (the analyst conjectures a new conceptual hypothesis and returns to the set of evidence); abduction (creating explanatory hypotheses inspired by the set of evidence); and induction (searching for general statements (assumptions) about the evidence). (Duvenage 2010, 84–85)
Waltz typifies the analysis-synthesis process as one of decomposing evidence and building a model, helping the analyst to identify missing information and the strengths and weaknesses of the model. The model serves two functions: hypothesis (when the evidence is limited) and explanation (when more evidence matches the hypothesis). The process involves three phases, defined using the term “space”, and the use of structured analytical techniques: the data space (data is indexed and sorted), the argument space (the data are reviewed, correlated and grouped into a set of hypotheses) and the explanation space (models are composed to serve as explanations). (Duvenage 2010, 86)
The flow of the cognitive process is identified as: searching and filtering, reading and extracting, schematizing, building the case, telling the story, re-evaluating, looking for support, looking for evidence, looking for relationships, and looking for information. (Duvenage 2010, 88)
A model of analytical rigor that can help analysts was developed by Zelik, Patterson and Woods in 2007. This model improves on Heuer and Pherson’s structured self-critique technique. It has eight indicators of rigor: hypothesis exploration, information search, information validation, stance analysis, sensitivity analysis, specialist collaboration, information synthesis and explanation critique. The model explains cognitive processes, provides the first metric for testing informational products, and provides a framework for collaborative learning. (Duvenage 2010, 91–92)
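Since the model is presented as a metric, the following minimal sketch shows one way the eight indicators might be scored and aggregated; the three-level scale and the aggregation rule are assumptions made for illustration, not the scoring scheme of Zelik, Patterson and Woods.

```python
# Hypothetical scoring sketch for the eight rigor indicators: each is rated on a
# simple 0 (low) / 1 (moderate) / 2 (high) scale and an overall profile is reported.
# The scale and aggregation are illustrative assumptions, not the published metric.

RIGOR_ATTRIBUTES = [
    "hypothesis exploration", "information search", "information validation",
    "stance analysis", "sensitivity analysis", "specialist collaboration",
    "information synthesis", "explanation critique",
]

def rigor_profile(scores: dict) -> str:
    """Summarize indicator scores and flag the weakest areas of the analysis."""
    missing = [a for a in RIGOR_ATTRIBUTES if a not in scores]
    if missing:
        raise ValueError(f"missing scores for: {', '.join(missing)}")
    weakest = [a for a in RIGOR_ATTRIBUTES if scores[a] == 0]
    total = sum(scores[a] for a in RIGOR_ATTRIBUTES)
    return f"total {total}/{2 * len(RIGOR_ATTRIBUTES)}; low-rigor areas: {weakest or 'none'}"

example = {a: 2 for a in RIGOR_ATTRIBUTES}
example["sensitivity analysis"] = 0
print(rigor_profile(example))
```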
Duvenage further details the sensemaking concept, derived from cognitive and especially organizational theory, (Weick 1995) which is used to investigate and describe how the individual, the group and, specifically, the organization confront uncertainty and adapt to complexity. (Duvenage 2010, 92–93) At the individual level, sensemaking means the ability to perceive, analyze, represent, visualize and understand the environment and the situation in an appropriate contextual manner. (Cooper and Intelligence 2012) This is known in intelligence analysis as situational awareness or environmental scanning. The relevance of sensemaking to information analysis becomes clear when the seven properties of Weick’s sensemaking are applied to the psychology of Heuer’s information analysis: social context, grounded in identity construction, retrospective, driven by plausibility rather than accuracy, ongoing, extracting from salient cues, and enacting. (Duvenage 2010, 94–95)
Fishbein and Treverton cite Klein, Stewart, and Claxton, who argue that empirical research has shown that intuitive judgment is the basis of most organizational decisions and is superior to analysis for problems marked by ambiguity or uncertainty. (Shulsky and Schmitt 2002)
Robert M. Clark proposed a methodology for analyzing information based on a target-centric intelligence cycle (Clark 2003) as an alternative to the traditional intelligence cycle. It redefines the informational process as an integrated network in which information can circulate directly between the different stages, so that in practice it is no longer a cycle in the traditional sense of the term.
Sherman Kent encouraged argument and dissent among intelligence analysts to reach a “wide range of outside opinions”, (Davis 1995) encouraging “collective responsibility for judgment” by networking the intelligence process with feedback loops between analysts and the various stages of the intelligence cycle.
Conceptual models allow analysts to use powerful descriptive tools to estimate current situations and predict future circumstances. (Clark 2003, 37) Once the model has been sketched, the analyst populates it by researching, gathering information and synthesizing. The analyst has to find information from a wide range of classified and unclassified sources, depending on the target.
The collected data must be collated and organized, and the evidence evaluated for relevance and credibility. After analyzing the data, the analyst incorporates the information into the target model, determining through further research where inconsistencies exist among the conclusions in order to support or refute a particular conclusion. The target model shows where there are gaps, and such discrepancies force the analyst to collect additional information to better describe the target.
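As a toy illustration of how a populated target model can expose collection gaps in this sense, the sketch below represents a target as a set of attributes, each backed by evidence with a credibility score, and flags attributes that are missing or weakly supported. The attribute names and threshold are invented for the example and are not Clark’s.

```python
# Toy illustration (not Clark's actual method): a target model as a dictionary of
# attributes, each with a value and a credibility score in [0, 1]. Attributes that
# are absent or weakly supported are flagged as collection gaps.

from typing import Dict, List, Tuple

# Hypothetical attributes an analyst expects the target model to cover.
EXPECTED_ATTRIBUTES = ["leadership", "order_of_battle", "funding", "communications"]

def find_gaps(model: Dict[str, Tuple[str, float]],
              min_credibility: float = 0.6) -> List[str]:
    """Return attributes missing from the model or supported only by weak evidence."""
    gaps = []
    for attr in EXPECTED_ATTRIBUTES:
        if attr not in model:
            gaps.append(f"{attr}: no information collected")
        elif model[attr][1] < min_credibility:
            gaps.append(f"{attr}: low-credibility evidence ({model[attr][1]:.1f})")
    return gaps

target = {
    "leadership": ("Person X", 0.9),
    "order_of_battle": ("3 battalions", 0.4),
}
for gap in find_gaps(target):
    print(gap)
```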
Robert M. Clark’s organizational model helps analysts successfully describe the target organization and see the strengths and weaknesses of the target for predictive and reliable analysis. (Clark 2003, 227)
General Stanley A. McChrystal proposed in 2014 a targeting cycle called “F3EA”, used in the war in Iraq, which stands for: find, fix, finish, exploit and analyze.
Richards Heuer states that no method guarantees the success of the conclusions. Analysts need to improve their methods continually, depending on their specific context and previous personal experience. (Heuer 1999) Also, in the case of a network-cycle approach, it should be borne in mind that these models take much longer than a traditional cycle. (Johnston 2005)
Structured analytical techniques are used to challenge judgments, identify mindsets, overcome biases, stimulate creativity, and manage uncertainty. Examples include checking the main assumptions, analysis of competing hypotheses, devil’s advocacy, red team analysis, and alternative futures / scenarios analysis, among others. (US Government 2009) The following methods are ways to validate the analyst’s judgment:
Opportunity analysis: Identifies, for decision-makers, opportunities or vulnerabilities that their organization can exploit.
Linchpin analysis: proceeds from information that is certain, or very likely to be sound. (Davis 1999)
Analysis of competing hypotheses: The analysis of competing hypotheses was a step forward in the methodology of information analysis. According to Heuer, more challenges are more important than more information, especially to avoid rejecting the possibility of deception out of hand because the situation appears simple. The steps in the analysis of competing hypotheses (Heuer 1999) are: identify the possible hypotheses; list the significant evidence and arguments for and against each hypothesis; prepare a matrix with hypotheses across the top and evidence down the side; refine the matrix; draw tentative conclusions about the relative likelihood of each hypothesis, trying to disprove hypotheses rather than prove them; analyze how sensitive the conclusions are to a few critical items of evidence; report conclusions, discussing the relative likelihood of all hypotheses; and identify milestones for future observation that may indicate events are taking a different course.
The analysis of competing hypotheses is auditable and helps overcome cognitive biases. It allows a return to the evidence and hypotheses, and therefore the tracing of the succession of rules and data that led to the conclusion.
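A minimal sketch of the matrix step follows: evidence items are scored against each hypothesis as consistent (+1), inconsistent (-1) or neutral (0), and, in line with Heuer’s emphasis on disconfirmation, hypotheses are ranked by how much evidence is inconsistent with them. The hypotheses, evidence and scores are invented for illustration.

```python
# Minimal ACH-style matrix (illustrative data): rows are evidence items, columns are
# hypotheses, and each cell records whether the evidence is consistent (+1),
# inconsistent (-1) or neutral (0) with the hypothesis. Following Heuer, hypotheses
# are ranked by the amount of inconsistent evidence, not by confirming evidence.

hypotheses = ["H1: exercise", "H2: preparation for attack", "H3: deception"]

# evidence item -> scores against each hypothesis, in the same order as `hypotheses`
matrix = {
    "Troop movements observed":   [+1, +1,  0],
    "No logistics build-up":      [+1, -1,  0],
    "Unusual signals silence":    [-1, +1, +1],
}

def inconsistency_scores(matrix, hypotheses):
    """Count, for each hypothesis, how many evidence items argue against it."""
    scores = [0] * len(hypotheses)
    for cells in matrix.values():
        for i, cell in enumerate(cells):
            if cell < 0:
                scores[i] += 1
    return dict(zip(hypotheses, scores))

ranked = sorted(inconsistency_scores(matrix, hypotheses).items(), key=lambda kv: kv[1])
for h, score in ranked:
    print(f"{h}: {score} item(s) of inconsistent evidence")
```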
Van Gelder proposed hypothesis mapping as an alternative to competing hypothesis analysis. (van Gelder 2012)
The structured analysis of competing hypotheses offers analysts an improvement on the original method’s limits, (Wheaton and Chido 2007) maximizing the number of possible hypotheses and allowing the analyst to divide a complex hypothesis into simpler component hypotheses.
A method used by Valtorta and colleagues applies probabilistic methods, adding Bayesian analysis to the competing hypotheses. (Goradia, Huang, and Huhns 2005) A generalization of this concept led to the development of CACHE (Collaborative ACH Environment), (Shrager et al. 2010) which introduced the concept of the Bayesian community. The work of Akram and Wang applies paradigms from graph theory. (Shaikh Muhammad and Jiaxin 2006)
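The basic mechanism of adding Bayesian analysis to competing hypotheses can be sketched as follows: each hypothesis starts with a prior probability, each piece of evidence contributes a likelihood P(evidence | hypothesis), and posteriors are obtained by multiplying and renormalizing. This is only the elementary Bayesian calculation, not the CACHE system or the cited authors’ actual models; all numbers are invented.

```python
# Elementary Bayesian update over competing hypotheses (illustrative numbers only).
# posterior(h) is proportional to prior(h) times the product over evidence e of
# P(e | h), renormalized so that the posteriors sum to 1.

priors = {"H1: exercise": 0.5, "H2: attack": 0.3, "H3: deception": 0.2}

# For each evidence item, the assumed likelihood of observing it under each hypothesis.
likelihoods = [
    {"H1: exercise": 0.6, "H2: attack": 0.7, "H3: deception": 0.3},  # troop movements
    {"H1: exercise": 0.7, "H2: attack": 0.2, "H3: deception": 0.5},  # no logistics build-up
]

def bayes_update(priors, likelihoods):
    """Combine priors with per-evidence likelihoods and renormalize."""
    posteriors = dict(priors)
    for likelihood in likelihoods:
        for h in posteriors:
            posteriors[h] *= likelihood[h]
    total = sum(posteriors.values())
    return {h: p / total for h, p in posteriors.items()}

for h, p in bayes_update(priors, likelihoods).items():
    print(f"{h}: {p:.2f}")
```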
The work of Pope and Jøsang uses subjective logic, a formal mathematical methodology that explicitly deals with uncertainty, (Pope and Jøsang 2005) and which forms the basis of the Sheba technology used in intelligence assessment software.
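Subjective logic represents a judgment not as a single probability but as an opinion with explicit belief, disbelief and uncertainty components plus a base rate. The sketch below shows only this basic opinion representation and its projected probability, as commonly defined by Jøsang; it is not the Sheba software.

```python
# Basic subjective-logic opinion (b, d, u, a): belief, disbelief, uncertainty and
# base rate, with b + d + u = 1. The projected probability is P = b + a * u.
# This illustrates only the core representation, not any particular assessment tool.

from dataclasses import dataclass

@dataclass
class Opinion:
    belief: float       # evidence-supported belief in the proposition
    disbelief: float    # evidence-supported belief against the proposition
    uncertainty: float  # lack of evidence either way
    base_rate: float    # prior probability in the absence of evidence

    def __post_init__(self):
        if abs(self.belief + self.disbelief + self.uncertainty - 1.0) > 1e-9:
            raise ValueError("belief + disbelief + uncertainty must equal 1")

    def projected_probability(self) -> float:
        return self.belief + self.base_rate * self.uncertainty

# A proposition judged fairly likely but with substantial uncertainty (invented numbers).
op = Opinion(belief=0.6, disbelief=0.1, uncertainty=0.3, base_rate=0.5)
print(f"Projected probability: {op.projected_probability():.2f}")  # 0.75
```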
Analogy: common in technical analysis, but the fact that engineering features look the same does not necessarily mean that two systems operate in the same way just because they are similar.
In the process of intelligence analysis, analysts should follow a series of sequential steps.
Effective intelligence analysis must ultimately be tailored to the end user, without lowering the quality and accuracy of the product. (M. L. Jones and Silberzahn 2013)
Bibliography
Nicolae Sfetcu
Email: [email protected]
This article is licensed under a Creative Commons Attribution-NoDerivatives 4.0 International license. To view a copy of this license, visit https://creativecommons.org/licenses/by-nd/4.0/.
Sfetcu, Nicolae, “Intelligence Methodologies”, SetThings (March 31, 2019), MultiMedia Publishing (ed.), URL = https://www.telework.ro/en/intelligence-methodologies/