
Creeping Determinism

Posted on 30 May 2017 by Candace Moody (@candacemoody)


This is one of a series of posts based on the book Everything is Obvious, Once You Know the Answer by Duncan Watts. Watts is a sociologist who is a principal researcher at Microsoft Research and a Professor at Large at Cornell University.

When we hear a story, whether it ends well or badly, we tend to look for explanations. We feel comfortable when we think we have figured out why something happened: why a book became a best seller, why a company failed, or why a gunman decided to shoot. Watts writes, “This tendency, which psychologists call creeping determinism, is related to the better-known phenomenon of hindsight bias, the after-the-fact tendency to think that we ‘knew it all along.’”

Watts says that the two phenomena are related, but not exactly alike. “Hindsight bias, it turns out, can be counteracted by reminding people of what they said before they knew the answer or by forcing them to keep records of their predictions. But even when we recall perfectly accurately how uncertain we were about the way events would transpire—even when we concede to have been caught completely by surprise—we still have a tendency to treat the realized outcome as inevitable.”

We mistake coincidences for causes, and we build stories around them to convince ourselves that the outcome could and should have been predicted. The danger lies in the way we then build future models on these flawed theories. Watts says we can never know which, if any, of the potentially thousands of factors, conditions, and actions contributed to the outcome. We focus on the ones we can see and measure, and we weave a post-event narrative that makes sense to us. In this way, he says, we’re not far removed from our ancestors who looked up at a stormy sky and theorized about a God of Thunder.

One problem is that we pay much more attention to outliers than to the many events that share the same causes. We notice when a new product breaks through and becomes a national sensation, but not the hundreds of companies with the same business model that have quietly failed over the past few years.
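Watts doesn’t put numbers on this, but a quick simulation (all figures hypothetical, chosen only for illustration) makes the survivorship effect concrete: if a thousand companies try the same strategy and only a couple break through, the breakthroughs tell us almost nothing about the strategy itself.

    import random

    random.seed(42)

    N_COMPANIES = 1000      # hypothetical number of firms using the same business model
    P_BREAKTHROUGH = 0.002  # assumed tiny chance that any one becomes a sensation

    breakthroughs = sum(random.random() < P_BREAKTHROUGH for _ in range(N_COMPANIES))

    print(f"{breakthroughs} breakthrough(s) out of {N_COMPANIES} identical strategies")
    print(f"Base rate of success for the strategy: {breakthroughs / N_COMPANIES:.1%}")
    # We hear about the breakthroughs and almost never about the failures,
    # so the strategy looks like a proven formula rather than a long-shot bet.

Seen against the full field of attempts, the “winning” strategy is nearly indistinguishable from luck.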

We also tend to end a story at an arbitrary point of success or failure, when the truth is still unfolding. We model ourselves on someone who achieves enormous success one year but may fade into obscurity a few years later. History is messy while it’s happening; we don’t get the benefit of hindsight until years afterward, and by then the chance to act on the lesson is long past.

Even analytical decision makers and scientists fall prey to creeping determinism. Watts cites the investigation of an airplane crash, Flight 2605, in which safety engineers identified five factors that could have contributed, including fog, missed communications, and pilot fatigue. But those same conditions had been present on dozens of flights at the same airport without producing a crash.

“We see the five risk factors identified by the Flight 2605 investigation and all the corresponding outcomes,” he writes. “One of those outcomes is indeed the crash, but there are many other noncrash outcomes as well. These factors, in other words, are ‘necessary but not sufficient’ conditions: Without them, it’s extremely unlikely that we’d have a crash; but just because they’re present doesn’t mean that a crash will happen, or is even all that likely. Once we do see a crash, however, our view of the world shifts. Now all the ‘noncrashes’ have disappeared, because we’re no longer trying to explain them—we’re only trying to account for the crash—and all the arrows from the factors to the noncrashes have disappeared as well. The result is that the very same set of factors that appeared to do a poor job of predicting the crash now seems to do an excellent job.”
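The shift Watts describes, from predicting crashes forward to explaining one backward, is easy to see in a toy simulation. The numbers below are invented for illustration, not taken from the Flight 2605 investigation: suppose all five factors appear together on five percent of flights, and even then a crash follows only one percent of the time.

    import random

    random.seed(0)

    N_FLIGHTS = 100_000
    P_FACTORS = 0.05              # assumed: 5% of flights have all five risk factors
    P_CRASH_GIVEN_FACTORS = 0.01  # assumed: even then, a crash is very unlikely

    # A flight crashes only when the factors are present (necessary),
    # and even then only rarely (not sufficient).
    flights = []
    for _ in range(N_FLIGHTS):
        factors = random.random() < P_FACTORS
        crash = factors and random.random() < P_CRASH_GIVEN_FACTORS
        flights.append((factors, crash))

    crash_given_factors = [crash for factors, crash in flights if factors]
    factors_given_crash = [factors for factors, crash in flights if crash]

    # Forward view: the factors are a poor predictor.
    print(f"P(crash | factors) ~= {sum(crash_given_factors) / len(crash_given_factors):.1%}")
    # Hindsight view: the factors look like a perfect explanation.
    print(f"P(factors | crash) = {sum(factors_given_crash) / len(factors_given_crash):.0%}")

Forward in time, the factors barely predict anything; looking back from a crash, they are present every single time, which is exactly why they feel like a complete explanation.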

The world is simply much more random than we realize, Watts says. When we tell ourselves we understand what happened, chances are we’re fooling ourselves.

“This confusion between stories and theories gets to the heart of the problem with using common sense as a way of understanding the world. In one breath, we speak as if all we’re trying to do is to make sense of something that has already happened. But in the next breath we’re applying the ‘lesson’ that we think we have learned to whatever plan or policy we’re intending to implement in the future. We make this switch between storytelling and theory building so easily and instinctively that most of the time we’re not even aware that we’re doing it.” And then we’re surprised when the outcome is wildly different from what we expected.

This explains why so many “experts” get it so wrong so much of the time.
