What to Do When Computer Models Get Things Wrong
[Perspectives on Innovation: Part of our regular monthly series featuring content from Perspectives on GE Reports.]
Every senior executive in every large company is on unstable ground right now. They face increasingly uncertain futures, and often they don’t know how to assess risk and opportunity. Brexit should serve as a reminder.
We used to think we understood the order of things, the way that markets work, why our companies succeed, what our customers want and most importantly, what’s likely to happen next in our markets.
We were wrong. Brexit showed yet again that we have very little ability to predict outcomes.
Markets are being supplanted and customers are changing their behaviors and needs. As the sand shifts, leaders recognize a need to adapt. But too often they fail to institute meaningful change and fail to think forward adequately. They are stuck between running the day-to-day operations of their companies and reshaping their organizations in the face of uncertainty.
In a recent survey, CEOs said they need to prepare for uncertainty, but their actions didn’t back this up. They said the biggest internal projects they implemented last year or plan to implement this year are cost-reduction and efficiency initiatives.
Executives are being squeezed to perform now, yet they know they need to look forward, become more agile and prepare their organizations for uncertain futures. Few executives understand uncertainty, so they fall back on what they know: They focus on efficiencies.
Today Is No Indicator of Tomorrow
If you’ve read any Nassim Taleb or remember Donald Rumsfeld’s famous quote, you’ve probably heard the term “unknown unknowns.” These are our blind spots: the parts of reality, and of what can really happen, that we fail to see. Taleb often invokes philosopher David Hume’s swan analogy to illustrate our lapse in logic about what can happen: “No amount of observations of white swans can allow the inference that all swans are white, but the observation of a single black swan is sufficient to refute that conclusion.”
Our perception of reality overrides this basic logic. This is our blindness.
The late mathematician Benoit Mandelbrot, considered one of the fathers of chaos theory, explains why. Discussing turbulent markets in his book The (Mis)Behavior of Markets: A Fractal View of Risk, Ruin, and Reward, Mandelbrot said people think that if they study and analyze enough data, they will be better able to predict outcomes. The reason: We believe in the word “because.” If we know why something happened, through cause and effect, we can assess risk and forecast events.
The problem, said Mandelbrot, is that causes are usually obscure. “Critical information is often unknown or unknowable,” Mandelbrot wrote. Despite the information gaps, Mandelbrot said, we have a “human need to find patterns in the patternless.”
We reach too readily for standard statistical models when analyzing data to find those patterns. Unfortunately, we can’t predict events like Brexit using normal statistical curves. Most major market shocks are outliers so extreme that they might as well be statistically impossible. In 1987, the Dow moved by 22 standard deviations in a single day. Under a normal distribution, the odds of such a move are less than 1 in 10^50; you could wait many times the age of the universe (about 13.8 × 10^9 years) and still not expect to see one.
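Mandelbrot’s point is easy to verify with a few lines of arithmetic. The sketch below (plain Python, standard library only; the 22-sigma figure and the ~13.8-billion-year age of the universe are the numbers quoted above, and one trading day per calendar year × 252 is an assumed draw count) shows how vanishingly improbable a 22-standard-deviation daily move is if market returns really followed a bell curve:

```python
import math

def normal_tail(sigmas: float) -> float:
    """One-sided tail probability P(Z > sigmas) for a standard normal.

    Uses the complementary error function, which stays accurate far out
    in the tail where naive numerical integration would underflow.
    """
    return 0.5 * math.erfc(sigmas / math.sqrt(2))

# Probability of a single-day move 22 standard deviations from the
# mean, on one side, if daily returns were truly normally distributed.
p = normal_tail(22)
print(f"One-sided tail probability of a 22-sigma day: {p:.3e}")

# Even 252 trading days per year for the ~13.8-billion-year age of the
# universe gives far too few draws to expect such an event even once.
draws = 252 * 13.8e9
print(f"Expected number of 22-sigma days since the Big Bang: {draws * p:.3e}")
```

The computed probability is in fact far smaller than the 1-in-10^50 bound quoted from Mandelbrot, which is his point: under the normal curve, 1987-style crashes should effectively never happen, yet they do, so the model, not reality, is what fails.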
How can executives overcome this “Executive’s Dilemma” to look into the future, confront uncertainty and make decisions that will help their companies survive industry-changing shifts?
John Cage, an experimental composer born in 1912, was a leading figure of the post-war avant-garde. Like many artists, he constantly challenged his own understanding and expertise, questioning the very nature of what music was or could be. He said of himself, “I’m trying to check my habits of seeing, to counter them for the sake of greater freshness. I am trying to be unfamiliar with what I’m doing.”
Every executive might take Cage’s words to heart. If you want to see better, to sense, to be able to “see around corners,” as Jack Welch puts it, you need to put your habits in check. Unlearn what you know. Recognize your cognitive biases (such as the Framing Effect, Confirmation Bias, Groupthink or Overconfidence Bias) in order to see reality more clearly. Then try to look forward and extrapolate possible futures.
This is not about data analysis. This is about opening up to possibility. This is about synthesizing the whole. It’s about becoming visionary. It’s about discovering truth.
By casting aside assumptions and preparing ourselves for possibility, we can use abductive reasoning to look forward.
Albert Einstein used abductive reasoning to develop his two theories of relativity. It’s the process of taking the known evidence and using creativity and insight to find new patterns of understanding.
When we synthesize a pattern, we feel the wholeness of it. We experience harmony. Our awareness is not defined by the rational or by analysis, but by a complete, emotional and intuitive understanding. This is the moment of insight, of having a vision and of recognizing truth.
German scholar Jo Reichertz echoes John Cage: “Abductions … always aim at one thing: the achievement of an attitude of preparedness to abandon old convictions and to seek new ones… It is a state of preparedness for being taken [in an] unprepared [state].”
Abductive reasoning and creative insight can help us explore possibilities based on observed cues from shifting markets, new technologies, changing customer behaviors, global political regulatory changes and many other areas.