What Business Can Learn from the Intelligence Community
Sir David Omand was one of Britain’s most senior intelligence officials. He ran GCHQ, the British equivalent of the NSA, and after 9/11 served as the U.K. Security and Intelligence Coordinator, putting together the British counter-terrorism strategy. In his new book, How Spies Think: Ten Lessons in Intelligence, he shares insights on the decision-making processes of intelligence officials to help others, such as corporate executives, who are faced with making difficult choices.
OMAND: Let me start with one injunction, which is: Know yourself. And the reason I put that first is really because to make a decision, we have to process two different kinds of thinking in our heads, and we need both.
There’s the analytic thought that goes into the impartial assessment of the situation and the options that might be open. And then there’s the more passionate, values-driven side, which is why you wanted to take a decision or were forced to take a decision in the first place.
That brings in the ambitions of the leader, what they want to get out of the decision, what outcome they want to achieve and why, or what they want to avoid by taking a decision. The trick is, you’ve got to balance both.
The Dangers of Magical Thinking
Otherwise, you fall into the trap that government ministers so often fall into: magical thinking, where you set out the vision but you don’t actually have the rational analysis to connect it back to the skills you’re going to need on the ground, the resources, and the time that people will require to deliver the outcome.
The sort of people who get to the top of organizations, whether in business or in government, are going to be people who have a strong sense of motivation. They may be very inspirational individuals with big ambitions. But you have to balance that against the need for very careful, rational analysis of the facts on the ground.
BRINK: Is there a process in the intelligence community that you tend to follow that would be of use to other professions, or to any of us faced with a big decision?
OMAND: I’ve suggested in my book that there is a way of being quite methodical about how you work your way through to taking a sound evidence-based decision.
You start with situational awareness, checking what you think the world is like. These days, particularly with the internet and social media, you may actually have a lot of information that’s deceptive, incomplete, or partial. Some sources of information may not be reliable.
So, you start by checking all of that to answer questions that tend to start with “what, when and where?” And then you ask yourself, “Can I explain why I’m seeing what I’m seeing? What are the alternative explanations for the motivations of the people involved?”
Deciding Between Alternatives
Obviously, those sorts of explanatory questions tend to start with “how?” or “why?” and that’s quite hard. You would do it by comparing different possible explanations (you could call them hypotheses), looking at the evidence for and against each of them. And what you’re looking for — it’s a rather ugly word — is discriminability: What is the evidence that really helps you decide between alternatives?
Very often, a piece of evidence you have will be consistent with more than one explanation. But if you look at all the evidence thoroughly, you can probably find some which help exclude some of the hypotheses. So, you can narrow it down.
And when you’ve got a good explanation, you can move on to what you really want, which is an estimate of how events are likely to turn out on different assumptions. That’s really the process that an intelligence analyst will go through.
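The elimination logic described above — evidence consistent with several hypotheses carries little weight, while evidence that rules a hypothesis out is what discriminates — can be sketched in a few lines. The hypotheses and evidence below are invented purely for illustration:

```python
# A minimal sketch of hypothesis elimination by discriminability.
# All hypotheses and evidence here are hypothetical examples.

hypotheses = ["routine exercise", "deception operation", "attack preparation"]

# For each piece of evidence, record whether it is consistent with each
# hypothesis. A single inconsistency excludes the hypothesis.
consistent = {
    # Consistent with everything: no discriminability, so it decides nothing.
    "troop movements observed": {
        "routine exercise": True,
        "deception operation": True,
        "attack preparation": True,
    },
    # Inconsistent with one hypothesis: this is the discriminating evidence.
    "no logistics build-up": {
        "routine exercise": True,
        "deception operation": True,
        "attack preparation": False,
    },
}

# Keep only the hypotheses that no piece of evidence excludes.
surviving = [
    h for h in hypotheses
    if all(consistent[e][h] for e in consistent)
]

print(surviving)  # ['routine exercise', 'deception operation']
```

Note that the first piece of evidence, being consistent with every hypothesis, does no work at all; only the second narrows the field, which is exactly the point about discriminability.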
BRINK: You talk about risk management in the book, and that’s obviously a phrase that’s used a lot in business. In your experience, what are the most important elements of managing risk?
OMAND: Well, one lesson here is that, as the economist F.H. Knight wrote, “Without risk, there is no profit.” So there’s upside risk and there’s downside risk. Most of the textbooks and articles are written about downside risk, how to manage bad things happening, but equally, you need to think quite carefully about the upside risks: whether the opportunities will be there if you position yourself in advance. Both are aspects of risk management.
The Three Kinds of Risk
My personal way of approaching this, based on my own experience working inside government and more recently on commercial boards, is that you need to think about the three different kinds of risk facing the organization.
There are things you can do nothing about: a sudden change in the exchange rate, the closing of the borders of a country to which you export, or a political upheaval in an important overseas market. Those are best managed by identifying the risks and then trying to insure against them or finding a way of mitigating them, and that’s where considerations of contingency planning, insurance and so on come in.
The second type comprises risks that are inherent in the nature of the business. If you’re in retail, you’re going to suffer shrinkage in the shops. If you’re selling to a large customer base, then you’re going to hold customers’ personal details in databases inside the organization, so you will be a target of cyber attack. For such risks, you need to examine whether you have adequate systems of control and audit them. And the same is true of financial controls appropriate to the kind of business.
It’s the third category of risks that are in many ways the most interesting for an organization, and certainly for the CEO, which are the self-imposed risks. You didn’t have to lead the company into a brand new market. You didn’t have to replace all the accounting systems with some different type of software or, even worse, switch the corporate IT system over to a new architecture. If that goes wrong, you could bring down the company. So the questions to ask are about who you put in charge, whether they have the board support and resources they need, and whether you are getting honest appraisals of how the initiative is getting on.
That’s just one way of structuring risks from the typical long risk register, to bring them alive and turn them into the sort of mitigation decisions that executives can actually start grappling with.
Hard Decision Points
BRINK: When you look back on your long career, are there certain kinds of decision points that you still think about that were particularly difficult for you to make?
OMAND: It’s a very good question. Lots of those, and some are perhaps best forgotten and some are best not brought into the public domain. But I’ll just give you one example, which is now public knowledge, which was when I was the U.K. security and intelligence coordinator, essentially the government’s chief risk manager.
The British Security Service said that one of their sources had suggested there was a terrorist plot underway to bring down an airliner landing at Heathrow Airport with a surface-to-air missile. And they even identified the particular firing point that the gang was thinking of, just outside the airport.
We assembled in COBRA, the cabinet office briefing room, all the different people who might have a view on this or help with the response. And we agonized over all the different possible ways of dealing with it. If you tell the airline of the concern, they’ll stop flying, the knowledge will get out, and the airport will be forced to shut. If it shuts, under what circumstances would you judge it safe to reopen? Could you ignore it and hope for the best? That would be to pose an unconscionable risk to the traveling public.
If you stake out the possible firing point with police and special forces — very easy to do, but what if the gang changes their minds at the last minute and shoots it down from somewhere else, or goes to another airport to bring down a different airliner? Of course, you send out to the intelligence agencies and ask, “Is there further information that will help?” But they come back, and they say, “No, there isn’t any more information available.”
In the end, after looking at all the possible options, we decided that the only sensible thing to do was to make so much noise in the security space that if there was such a plot — and we couldn’t be certain there was — then the plotters would realize they’d been exposed, and we might then catch intelligence traces of them, and at least the plot would not go ahead.
When we took the decision to the prime minister, Tony Blair, he asked exactly the right question, which was, “Is everyone agreed that this is the best option, the police, the intelligence agencies and so on?” And we said, “Yes, we are all in agreement. This is the least-worst option.” So that’s why we did it.
We mounted an enormous operation with thousands of police and troops and aircraft circling overhead, armored vehicles patrolling the perimeter of the airport, and the response from the media the next day was ferocious. We didn’t get any thanks for it. The media thought it was all a put-up job to wind up the public.
And inevitably, the intelligence turned out to be aspirational talk, not a real plot that was underway. But it was an interesting exercise in how to make a decision based on incomplete and fragmentary information that you know may be wrong. How do you go about methodically working through the options and then picking the least-worst one for the protection of the public?
Part two of this interview will appear next week.