Marsh & McLennan Advantage Insights
Conversations and insights from the edge of global business



How the Intelligence Community Makes Decisions

In this second article, Sir David Omand, one of Britain’s most senior former intelligence officials and the author of How Spies Think: Ten Lessons in Intelligence, talks about the decision-making process behind the infamous “Curveball” incident in the run-up to the Iraq War, which he uses as an example of how analysts can be influenced by their own predispositions. You can read the first article here.

OMAND: Lesson No.1 in intelligence is that our knowledge of the world is always fragmentary and incomplete and is sometimes wrong. That’s not just true of intelligence, that’s true for all decisions we have to face.

Curveball was a source of the BND, the German Intelligence Service. He was an Iraqi refugee in Germany, a chemical engineer, and crucially had been involved on the fringes of Saddam Hussein’s real biological warfare program in 1990. 

In 2002, he generated a large number of intelligence reports about the situation. It was exactly the information U.S. defense analysts were looking for, and his description of current biological warfare production and mobile production units appeared to be entirely plausible, because he knew what he was writing about.

The problem was, he was making these reports up. After the war was over, when the media caught up with this individual, he said, “I had the chance to fabricate something to topple the regime. I and my sons are proud of that.” He wanted the Americans to invade and overthrow Saddam. 

The Danger of Confirmation Bias

However, the reason the analysts seized on this information wasn’t just its plausibility. It was because, unconsciously, this source was telling them something that they wanted to believe, which was that Saddam Hussein not only had a biological weapons program, but that he had built mobile biological weapons trailers, which were starting to manufacture the stuff. 

That fitted their preconceptions. This is known in the trade as confirmation bias. It’s unconscious; it’s not about political distortion of intelligence, and it’s not about wanting to please the customers for intelligence. 

It’s about subconsciously paying more attention to information that appears to confirm your beliefs, rather than information that appears to run counter to them. 

The latter kind of information usually gets the treatment: “Well, I’m not sure about that. Let’s wait a bit and see if there’s any confirmation.” Or, “Can you go back and reanalyze that?” Because it doesn’t seem to fit your preconceptions. 

Everyone suffers from confirmation bias. One of the ways you can avoid it is by getting more than one analyst to look at the evidence, and then you argue it out, treating analysis as a team sport so that one analyst can say to another, “I think you’re being over-influenced because of this or that.”


There are many examples in military history where the general in command accepts the intelligence that appears to suit their plan and finds difficulty with the intelligence that appears to run counter to it. They then go ahead, and disaster ensues.

The Need for a Safe Space in the C-Suite

Companies, like governments, need to create a safe space where the C-suite can meet and talk frankly to each other about the problems they face and the options that might be open. 

If, as sometimes happens, you have a circle of essentially “yes people” around the CEO, and you’ve got a chief executive with a big ego and big ambitions, determined to push ahead, you won’t have any voice saying, “Yes, but that’s going to take at least nine months to put in place, and we haven’t actually got the resources on the ground to do it.” 

So, connecting the rational analysis and the leadership analysis together is very important, and you’re only going to get it right if more than one person is involved. 

A CEO must make sure that they have people around who can challenge perceptions. It’s important to be able to do it privately, so it doesn’t look to the company as if there are divisions at the top. Hence the need for a safe space in the C-suite.

Handling Misinformation

BRINK: One increasing challenge for companies is disinformation or misinformation online. Do you have any advice for how decision-makers can filter that out or work their way through it?

OMAND: It appears to be a phenomenon that goes with social media. It’s hardwired into advertising technology that stories with impact are the ones that attract the most clicks. Therefore, I think we have to learn to live more safely with the kind of biases that you find in social media. 

Its effect can be pretty devastating. You can find your company’s reputation trashed by stories that circulate, either simply when a bit of malicious gossip goes viral, or because somebody has deliberately set some stories running and then amplified them, which can nowadays be done automatically using bots. This is the sort of technique the Russians used against the United States in the 2016 presidential election.

As a user of information, there’s no substitute, of course, for careful cross-checking from different sources to narrow down where the truth might lie. But in terms of protecting reputation, you’ve got to employ a company that can scan what is going on on social media and alert you as quickly as possible to stories that might impact your business. 

And you’ve got to be prepared to go right out front and rebut those stories fast, hopefully before they spread too far. Once the story is out, once it’s been repeated and repeated, then it becomes as good as true to the next person to see it. If it’s a conspiracy story, of course it’s even worse as it’s a closed loop, which is very difficult to get people out of. 

In the pre-digital days, of course you had some stories like that, but they spread rather slowly and only within limited circles. Nowadays, it goes viral and gets picked up and amplified, so you’ve got to get on top of disinformation very fast.

Sir David Omand

Former UK Security and Intelligence Coordinator and former Head of GCHQ

Sir David Omand GCB is a visiting professor in War Studies at King’s College London. His previous posts included Security and Intelligence Coordinator in the Cabinet Office, Permanent Secretary of the Home Office, Director of GCHQ, and Deputy Under-Secretary of State for Policy in the MOD. He is the author of How Spies Think: Ten Lessons in Intelligence.
