The Culprit

Many of the investigation outcomes related to incidents, accidents, or even catastrophes that we have seen or heard of point to human error. While that can be very true, it is worth noting – before diving into this topic – the difference between a proximate cause and a root cause of an incident. Simply put, the former is the cause directly responsible for the incident, whereas the latter is the underlying cause leading to it. To put this into context, take the example of texting while driving. If an accident occurs, the proximate cause, as you guessed, is texting while driving – that is, the habit of texting or the mobile phone itself. The underlying causes, however, are far more difficult to detect, but they exist. They might include physiological, social, or even physical factors that can have profound effects on the driver’s behavior. Failing to recognize and interpret these underlying causes can lead to serious misjudgments and skewed conclusions. Yet, to stave off that complexity, investigators have a natural tendency to jump to easy, effortless inferences.

One striking phenomenon among the many that plague and skew human behavior is the “Outcome Bias”. This bias involves forming a fixed, overgeneralized conclusion or judgment about a particular matter based on its outcome (endpoint), while ignoring the many factors that might have contributed to its success or failure. And while some people believe their overwhelming knowledge, wisdom, and intelligence safeguard them against such a behavioral anomaly, the truth is that everyone falls prey to this cognitive quirk and devious culprit.


The Outcome Bias at Work

The best way to illustrate how this bias manipulates us is with an example. Say you decide to hire a tutor for your daughter to help her pass a critical entry exam for a once-in-a-lifetime educational opportunity. You can hire the tutor from one of two academies. Academy A has had a 70% success rate in its previous trials with students, while Academy B has had only 40%. If you have already chosen Academy A, you have just fallen victim to the outcome bias. While it is sensible to use heuristics to judge what is best for your daughter based on the success rates, what you have not done is take sufficient time to evaluate objectively the circumstances behind Academy A’s 70% success and Academy B’s 40% shortfall. It turns out that Academy A had far fewer students than Academy B. In addition, almost all the students who used Academy A had repeated the entry exam at least twice. By ignoring these two circumstantial factors, you can easily skew your judgment by relying on the outcome alone.
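The sample-size point above can be made concrete with a short sketch. The cohort sizes below are hypothetical (the article gives only the percentages): suppose Academy A’s 70% came from 10 students and Academy B’s 40% from 100. A Wilson score interval shows how little a small sample actually tells you:

```python
import math

def wilson_interval(successes, trials, z=1.96):
    """95% Wilson score confidence interval for a binomial proportion."""
    p = successes / trials
    denom = 1 + z**2 / trials
    centre = (p + z**2 / (2 * trials)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / trials + z**2 / (4 * trials**2))
    return centre - half, centre + half

# Hypothetical cohorts: Academy A passed 7 of 10; Academy B passed 40 of 100.
for name, s, n in [("Academy A", 7, 10), ("Academy B", 40, 100)]:
    lo, hi = wilson_interval(s, n)
    print(f"{name}: {s}/{n} = {s/n:.0%}, 95% interval ≈ [{lo:.0%}, {hi:.0%}]")
```

Under these assumed numbers, Academy A’s true success rate could plausibly be anywhere from roughly 40% to 89%, while Academy B’s is pinned near 31–50% – the headline 70% versus 40% comparison is far less decisive than it looks.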

To elaborate further, assume you are a project manager on a mega petrochemical project where heavy and super lifts are a daily routine. You learn that the team on the adjacent project has been conducting heavy lifts under wind speeds as high as 25 knots, using a super heavy crane like the one on your project. Operating a crane in such adverse weather involves major risks and obviously entails a major deviation from the standard operating procedures. Yet one day you decide to proceed with a critical lift under similar weather conditions, especially since nothing went wrong during any of the 10 previous lifts conducted on the nearby project.

In this scenario, the decision to proceed with the previous lifts was itself very risky – the adjacent project may have avoided an accident only through a combination of lucky occurrences. But thanks to the outcome bias, the assumption that the risks were either overrated or irrational prevails, leaving you feeling motivated and better off taking similar risks on your project. Even worse, the more often you do it without negative consequences, the less concerned about the risks and danger you become.
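A quick calculation shows why ten clean lifts are weak evidence that the risk was overrated. The per-lift incident probability below is a hypothetical figure chosen purely for illustration; the article states no such number:

```python
# Assume, hypothetically, each heavy lift under high wind carries a 5%
# chance of a serious incident. The chance of ten consecutive clean lifts
# is then still substantial, even though the hazard is very real:
p_incident = 0.05
p_clean_run = (1 - p_incident) ** 10
print(f"P(10 lifts, no incident) = {p_clean_run:.0%}")  # ≈ 60%
```

In other words, under this assumption a team running a genuinely dangerous operation would see an unblemished streak of ten lifts about 60% of the time – the clean record is luck, not proof of safety.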


In fact, the outcome bias is generally acknowledged to be a key ingredient in the development of the phenomenon of Normalization of Deviance, first introduced by sociologist Diane Vaughan in her review of the Challenger disaster. Normalization of Deviance describes the process by which deviation from standard procedures becomes normalized within an organization in such a way that the deviant behavior is no longer seen as deviant but rather as the new standard practice. This phenomenon will be discussed in more detail in future articles.

To the best of my knowledge, the outcome bias was behind major global accidents, not least the Space Shuttle Challenger disaster – 1986, the Chernobyl nuclear disaster – 1986, the Big Blue crane collapse – 1999, the Space Shuttle Columbia disaster – 2003, the Deepwater Horizon disaster – 2010, and many others.

Learn more with Eye On Risk…
