Intelligence's Accidental Profession
Zachery Tyson Brown
National Security, Defense, and Intelligence Leader | Personal Views Only
“I’d rather be a bookie than a goddamned poet.”
- Sherman Kent, 1964
When everything is intelligence, nothing is intelligence. That was the theme of a speech given by the intelligence scholar Wilhelm Agrell in 2002. Agrell judged then that the craft of intelligence analysis—what for now I’ll call the translation of information into advantage—was beginning to become a modern profession. But he also gave a warning: the growing conflation of intelligence with information threatened to reduce it from distinct discipline to management buzzword.
Today, Agrell’s warning is quickly becoming a reality.
Intelligence analysis still hasn’t become a “modern profession.” It is at best an accidental one, as Mark Lowenthal, former Assistant Director for Analysis at the Central Intelligence Agency, puts it. Experts can’t even agree on what intelligence analysis is, let alone what analysts are supposed to do. Hardly anyone sets out to become an intelligence analyst; those who end up in the field often arrive there by happenstance. Sadly, professionalization seems a distant goal.
Worse, the term intelligence analysis has itself been diluted by its attachment to disparate subjects over the years. Its once-esoteric tools have diffused into the broader economy, expanding the scope and variety of their application.
But here’s the thing. Even if we could define it, and even if analysts thus defined were granted singular authority within that definition, our putative profession would still be at risk because today all of the professions are in the midst of revolutionary change that calls their very purpose into question.
If this seems like pedantry—or worse, navel-gazing—allow me to explain why it matters.
The Intelligence Community’s mission is to help leaders make sound decisions. The decision at hand might be whether to authorize a lethal drone strike in Syria, which economic pressure points to apply against a competitor, or how to approach arms control negotiations. Regardless, all intelligence analysis has the same aim: to help leaders make better choices.
But decision-makers must trust that the analysis they’re receiving is itself sound, and they must value their analysts’ judgments—both of which have been problems from the start. Both are also exacerbated by the lack of formal accreditation and professional standards.
Stephen Marrin, a former CIA analyst and current professor of intelligence studies at James Madison University, has noted that the “failure of intelligence analysis to become a formal profession has led to great variation in the competence and skill of individual analysts, uncertainty regarding the very duties of intelligence analysts, and an overall diminution in the role that intelligence analysis could play in decision-making.”
The accidental profession now stands at a crossroads. The familiar path leads to further systematization and ultimately, oblivion; the alternative is uncharted but may lead to renewal. This is not hyperbole; the failure to define or even delimit intelligence analysis has contributed to its declining legitimacy and relevance within the national security enterprise.
Which path we take is up to this generation of intelligence practitioners.
Whatever Happens, Let’s Be Professional
“Professionalization is perception.”
- Eliot Freidson, 1986
“Profession” can be a contentious term. There are many professionals in occupations not considered professions. The distinction isn’t about the job’s difficulty or even the amount of specialized training required, but about its role within society.
Professions are social institutions that exist to create, transmit, and apply formal knowledge. They developed as life grew more complex than any single person could reasonably expect to cope with on their own. Acting as gatekeepers of specialized knowledge, professions also protected society from charlatans who professed the same knowledge without actually possessing it. In return, they were granted corporate autonomy, exclusivity, and an elevated social status in a sort of grand bargain.
Medicine is considered the ur-profession. Today, physicians have well-established literature, recognized paths of education and examination, and regulatory bodies that admit and discipline members. But this took centuries to develop and did not approach its modern form until relatively recently.
Now take espionage. While it may be the ‘world’s second-oldest occupation,’ it long remained outside the bounds of respectability. Intelligence is an altogether more recent invention. The difference is that between the sometimes successful but usually amateurish plots “run out of a general’s hat or a diplomat’s pocket” and the systematized work of purpose-built institutions, born in early modern Europe but not maturing until the industrial revolution.
Intelligence analysis is younger still. Over the course of the nineteenth century, militaries traded romantic notions like aristocratic command for technocratic expertise, but the assessing of intelligence—what Clausewitz almost dismissively called ‘every sort of information about the enemy and his country’—remained the sole purview of the commanding general well into the twentieth. It wasn’t until the Second World War that an eclectic group of scholars—President Roosevelt called them “carefully selected, trained minds”—was drafted into the Office of Strategic Services (OSS) to invent what we recognize as intelligence analysis, largely on the fly.
It might be unfair to compare a discipline less than a century old with medicine, which really is the world’s “oldest profession.” But it’s worth remembering that for thousands of years, physicians were useless at best and harmful at worst. Medicine didn’t professionalize until very recently.
In the 19th century, medicine was a “chaotic free-for-all” of “lotions, potions, and liniments,” rife with literal snake-oil salesmen. In 1875, for example, anyone with a high school diploma could attend medical school, and the mere two years of coursework could be completed in any order the student preferred. This situation led the American Medical Association to conclude that there was “probably no other country in the world in which there is so great…and so fatal a difference between the best, the average, and the worst [physicians].” As late as 1921, the prestigious medical journal The Lancet questioned the value of statistically validated medical studies, which today are, of course, standard.
Professionalization, though long sought after, won’t solve the problem as long as intelligence analysts lack the one characteristic medicine and other established professions use to mark themselves as distinct—exclusivity.
In an earlier article, I wrote about the growing problem of disintermediation—that is, society’s growing ability in many fields to cut out the middleman. The users of intelligence—busy policymakers, often with little interest in it to begin with and less time even if they were interested—can now acquire information directly, faster and more easily than ever before, from a variety of sources. And even at the height of the intelligence community’s Cold War monopoly, policymakers reserved the right to ignore their intelligence officers and even to do their own analysis.
The grand bargain may exist for physicians—even in the age of WebMD, most people prefer to sit in front of a real doctor if they are fortunate enough to afford one—but for intelligence analysts, the deal was never sealed. A National Security Council staffer once chided me: “the intelligence community doesn’t own research.” Fair enough. Honestly, we’ve never even owned the word intelligence.
What is it? More importantly, what isn’t it?
“The meaning of every concept is in the limitation. A word for everything is a word for nothing specific. Intelligence analysis runs the risk of ending up here.”
- Wilhelm Agrell, 2002
If you asked a hundred analysts to describe their job, you might get a hundred different responses. Some would compare themselves to detectives piecing together clues. Others would highlight their numeracy, specialized knowledge of a particular country’s economy or demographics, or ability to effectively tee up decisions for executives. A few might even wax poetic about a profession of cognition—that is, simply being better at thinking than anyone else.
Regardless of what they tell you, though, the dirty secret is that most analysts do very little analysis day-to-day. What many of them do instead is what has been called ‘monitor and report.’ That is, read the reporting on their functional or geographic issue, and report up the chain when they see something a decision-maker probably needs to know about. A senior CIA officer recently distilled this to “read stuff, write stuff.”
Sherman Kent, one of Roosevelt’s carefully selected minds and widely recognized as the ‘father’ of American intelligence analysis, referred to it as a “special category of knowledge.” That may have been true, once. During the Cold War, the intelligence apparatus was the only reliable source of information about the Soviet Union, and analysts served as its special mediators.
Analysts have always been knowledge workers—that is, cognitive laborers who deal in the creation, transmission, and application of information. But knowledge workers now constitute half of the modern workforce, a number that will keep growing, and all of them do some form of analysis.
The intelligence analyst has always been a hybrid. Kent modeled his new craft after the one he knew best—historian—and the one he most admired—journalist. Others added to his recipe over time: “the analyst must combine the skills of historian, journalist, research methodologist, collection manager, and professional skeptic,” one wrote. Today, we might add data scientist, information technologist, communications specialist, and graphic designer to the list of desired skills. The analyst has always been a stopgap, a pinch hitter, everything to everyone.
So, what, if anything, distinguishes intelligence analysis from knowledge work in general?
Apologists provide three answers. First, that intelligence analysis deals with uncertainty. But this answer is unsatisfying in an era in which we are less certain of everything. Second, that intelligence analysis deals in secrecy. This one, too, is unsatisfying when there are no secrets left, or at least none that remain so for long. Lastly, the claim that intelligence analysts must cope with deception is particularly quaint when you consider that patients have always lied to their doctors, and even more so in an age wherein disinformation has been weaponized on an industrial scale.
Intelligence in general—and intelligence analysis in particular—were invented to help governments cope with a flood of new information and to handle the increasing complexity of modern challenges. But now, anyone can access as much information as the Allies did during the Second World War.
Maybe in the information age, everything really is intelligence after all.
The Choice
“Intelligence has a history; it also has a future.”
- George S. Pettee, 1946
This is the end of the article but not a conclusion. Rather, I hope it is the beginning of a vigorous debate about the future of the accidental profession. I certainly don’t claim to have all the answers, but I’m certain there are many sharp, dedicated professionals who care as much as I do about the craft of intelligence.
For now, let’s return to the metaphor of the crossroads and the choice before us.
If the craft of intelligence analysis progresses along its current course, it will soon become indistinguishable from other knowledge work in our data-driven age, whose future lies in facilitation, sharing, and sensemaking. As its methods proliferate and the subjects being analyzed further overlap, there will be even less of a distinction between so-called strategic intelligence analysis and its now-numerous cousins both within and outside of government.
And while computers surpassed humans at data processing long ago, in an increasingly automated future they will be capable of creating and updating compelling narratives, too—providing more information to more users than ever before, all on demand.
This will undoubtedly please decision-makers, at least at first. But policymakers are already drowning in data. And while they may think they just want information more quickly, the fact is that more data often leads to worse decisions by reinforcing what cognitive scientists call the illusion of explanatory depth. No: sharp, insightful analysis has always been vital to sound decision-making, and it will only grow more important as the datasphere balloons.
The alternative is a reimagining of what intelligence analysis is for. Design thinking must become second nature if we are to engage, delight, and inform a new generation of national leaders. Participation from users must become the norm.
We could even go as far as viewing intelligence analysts themselves as the product. Expert intelligence analysts are hyper-intelligent and highly trained critical thinkers with a desire to help. What’s more, the trusted workforce traffics in integrity. A truly professional analyst is unbiased, apolitical, and policy-agnostic. This is in itself a compelling value proposition for a ‘post-truth’ world.
The most senior intelligence officers have always provided a form of this at the highest levels of government; the Director of Central Intelligence and later Director of National Intelligence served as principal advisors to the President of the United States, for example. This made sense in the hierarchical, slower-paced industrial age.
But in an interconnected world with an accelerating rate of change, policy is more often created by gestalt rather than fiat, and collaborative understanding should exist throughout the enterprise rather than be concentrated at the top.
Intelligence analysis is instrumental; unlike academic research, it has no autarkic significance at all. It does not, as Kent wrote, “pursue knowledge for its own sake.” Its purpose isn’t simply to answer questions or even to give warning; those are merely methods facilitating understanding so that better decisions can be made. At its acme, it is a collaborative process between knowledge and action for the purpose of achieving policy preferences.
Zachery Tyson Brown is an intelligence officer and U.S. Army veteran. He is a member of the Military Writers Guild and has written essays for The Strategy Bridge, War on the Rocks, Defense One, and West Point’s Modern War Institute. He can be found on Twitter @ZaknafienDC.