How?
How could this have happened? We did a thorough analysis, discussed the matter carefully, evaluated almost all the available options, and apparently made the right decision. In short, we did everything right, yet it didn’t take long for things to go south and, eventually, for disaster to strike.
The Tunneling Concept
Does the above scenario sound familiar? Hopefully not, but I bet it does. Every now and again, we face real-life situations that demand swift and risky decisions. Whether in a leadership, managerial, or contributor role, we normally make critical decisions based on a certain level of maturity and understanding of the matter at hand. This understanding, however, can be directly influenced by personal desires and beliefs, leading to a predetermined opinion about an issue and a biased preference for its resolution. Under such circumstances, critical risks can easily be overlooked, dismissed, or severely discounted. As it turns out, there is a name for this phenomenon: the “Tunneling Concept,” otherwise known as “Confirmation Bias.”
Tunneling is a serious and dangerous cognitive tendency: the inclination to steer decisions towards substantiating predefined propositions or strategies rather than identifying, exploring, evaluating, and promoting new solutions. It is the human tendency to anchor to pre-existing ideas or beliefs, narrow the focus, and become entrenched in views that support the original predetermined proposition while ignoring data that challenges it. Eventually, this tendency limits decision-makers’ ability to recognize sources of uncertainty outside the boundaries of their views and beliefs, and it leads to irrational decisions and risky behavior.
How tunneling influences professionals is truly intriguing. The scenario presented in the opening paragraph is a classic example of how professionals tend to believe they are formulating objective, rational, and risk-balanced decisions while, on the contrary, all they are doing is tunneling deeper into what they already believe. From the instigator’s perspective, the decisions taken were sound and the disaster should never have happened.
In Light of Hindsight
In his book “Just Culture,” Sidney Dekker states: “There is almost no human action or decision that cannot be made to look flawed and less sensible in the misleading light of hindsight. It is essential that the critic should keep himself constantly aware of that fact.” This is a true and powerful statement. When we tunnel deeper, we gravitate towards the narrow, convergent side of thinking (as opposed to the broad, divergent side of critical thinking). If tunneling produces neutral or positive results (fair to say, out of a combination of lucky circumstances), we normalize it as a standard practice in future dealings. Even worse, we strive to include tunneling as part of our continuous improvement process. If, however, tunneling produces negative consequences, hindsight strikes again after a period of denial (i.e. everything was done right, and nothing could have been done better): the unconscious, incompetent stage. Once we clear this stage of incompetence and denial, we become able to see the divergent end of the tunnel, where far more options were available than we had initially thought. Sadly, this realization comes to light unforgivably late.
The bitter truth is that we are all entrenched, one way or another, in our own beliefs, and we continuously resist, with varying levels of intensity, clear and mounting evidence that negates those beliefs. Even when shown evidence that contradicts our biased view, we continue, without hesitation, to emphasize information that supports the conclusion we want, in a manner that reinforces our current perspective. We have all done it, and we continue to do it at both the personal and professional levels. Researchers, for example, may interpret results in a way that reinforces their existing perspective, despite clear evidence and tangible data that contradict it. Similarly, doctors sometimes become so attached to a particular diagnosis that they try hard to interpret symptoms in a way that confirms their predetermined guess and intuition, while ignoring crucial markers of other diseases.
To conclude, tunneling is incredibly powerful and destructive. It erodes our perception of risk and renders us oblivious to potentially catastrophic errors in the critical analysis that leads to our decisions. It is a silent killer that we must all be aware of and fiercely combat.