Which Biases Are Impacting Your Decision-Making?
An interview with Howard Kunreuther, James G. Dinan Professor at the Wharton School of the University of Pennsylvania and Co-Director of the Wharton Risk Management and Decision Processes Center
Howard Kunreuther and Michael Useem, professors at the Wharton Risk Management and Decision Processes Center, recently released a book called Mastering Catastrophic Risk: How Companies Are Coping with Disruption, for which they interviewed senior executives from 100 multinational companies. BRINK spoke to Professor Kunreuther about the biases that distort how executives think about risks and decision-making.
BRINK News: How did you define catastrophic risk as opposed to other risks?
Professor Kunreuther: When we undertook the interviews, we tried to be quite general by asking each of the executives what was the most adverse event or risk that their company had faced. Many of them pointed to the financial crisis, the 9/11 attacks and the Japanese earthquake and tsunami. But we also heard about things that were idiosyncratic to a firm. For example, a kidnapping in South America or a CEO who passed away. These were disruptive adverse events that these companies had to deal with.
BRINK: What were the most common mistakes that these companies made when thinking about these types of risks?
Kunreuther: The most common mistake was believing that the risk was below their threshold level of concern. Firms were overly optimistic or they perceived that the adverse event would not happen to them, and therefore they didn’t pay attention to its consequences until after it occurred.
The other mistake that we highlight in the book is myopia, or thinking very short term. If one thinks only about next year’s balance sheet or bonuses when considering the costs and benefits of reducing the risk of an adverse event, the firm may undervalue precautionary measures.
BRINK: You talk in the book about management having biases when assessing risk. What sort of biases are you referring to?
Kunreuther: Myopia, as I’ve mentioned, is one. Amnesia, or the tendency to forget things after a period of time, is another. An example of amnesia would be that individuals often cancel their insurance policies if they haven’t had a loss for a few years, having only purchased coverage after a disaster occurs. Firms may not behave quite that way, but there’s a tendency to forget past disruptions a few years after they occurred. There is also inertia, maintaining the status quo because of the uncertainty of what change might bring.
Simplification, the idea of focusing on only a single dimension of a problem, is another bias. If a firm focuses only on the very low probability of an event occurring next year, it may not pay attention to the potential consequences. On the other hand, by focusing on disasters recently experienced, a firm may overreact to the outcomes rather than also considering the likelihood of their recurrence.
And finally, there is a herding bias. If you’re an executive, you look at what other firms are doing and then say, “Well, I think I’ll do the same thing that they’re doing.” And that may not be the right strategy to follow for your organization.
BRINK: Do companies need to deploy a different way of planning for catastrophic risk than they might do for more common risks?
Kunreuther: Yes. Companies must think and plan deliberately. When thinking intuitively, emotions such as worry and anxiety, coupled with past experience, often play key roles in making choices. For example, when firms have had little past experience with specific events such as hurricanes or financial risks, they may perceive that these disasters will not happen to them. Low-probability events require firms to make their choices in a more deliberative way because intuition may lead key decision makers astray.
We’re not suggesting that key decision makers in firms will easily be able to overcome these biases. They are endemic to the way all of us react. So we need to develop ways of framing problems so that executives and other key decision makers in the organization pay attention to extreme events. If you can develop short-term economic incentives that address the myopia bias, you may be able to get firms and individuals to do things that they otherwise may not consider.
To illustrate this point, bonuses can be made contingent on longer-term performance. That’s a good way to motivate long-term thinking. What we’re saying is that firms may have to be more attentive to low-probability, high-consequence events than they have been in the past.
BRINK: Do you go into the issue of thinking counterfactually? Do they ever spend time thinking about, “What if ‘x’ had happened and it had actually been a lot worse?”
Kunreuther: Yes, near misses are critical in this regard by providing insight into the consequences if the adverse event had actually occurred. One of the recommendations in our checklist is, “Anticipate the worst.” The worst-case scenario might not actually happen, but it is better to think about it rather than ignore it.
For example, after the Japan earthquake and nuclear power disaster, United States auto companies recognized that they had relied on a single Japanese supplier for inputs to their production process, and that supplier was unable to meet their needs for several weeks. These firms had never thought about having more than one supplier to avoid business interruption until after the Japan disaster.
This interview has been lightly edited and condensed for clarity.