What Is Cognitive Dissonance?
As we journey through life, we encounter many stories of individuals who seem fully aware of their surroundings and the reality they inhabit while, in fact, they are living in a state of delusion or false perception. Their mental model of the world is so distorted that it does not reflect the true nature of their circumstances. What is even more perplexing is that these individuals often possess impressive qualifications, training, experience, and knowledge, yet they still fail to see what should be obvious.
What stands out is not so much that these individuals live in a state of denial but that they seem almost blind to the glaringly obvious. They appear oblivious to their own lack of awareness, locked in a state of mind whose mental framework does not allow for the recognition or acknowledgment of any wrongdoing or of the consequences of their actions. These stories may be unsettling, but they raise important questions about the factors that can influence human behavior. We wonder if something contagious and harmful to our ability to recognize the obvious is at play. Perhaps we are collectively navigating uncharted territory where our mental models prevent us from seeing things as they truly are.
As we contemplate the Vietnam War (1955-1975) or the Iraq War (2003-2011), chances are, like me, you wonder what these conflicts were all about. In retrospect, these wars were not justified and should not have happened. Avoiding them could have spared millions of lives and kept nations from descending into disarray. Yet, despite the overwhelming evidence and catastrophic outcomes indicating otherwise, the politicians, promoters, and supporters involved in these wars still believe they were necessary. Some argue that this is simply the dirt that comes with the political arena, but there could be much more to it than that. Political decision-makers are supposedly trusted to lead because they are experienced. In distressed situations, however, their actions have the paradoxical consequence of aggravating the problem, especially once they cross the point of no return: the political space closes off because of the war, and the dirt spills beyond any control. The only way out is to raise the stakes and hope for an exit that satisfies their egos, maintains their status, and, less importantly, saves humanity. But what is really behind their denial? Is it their beliefs, their fear of failure, or an act of evasion from self-challenge? To answer this question and understand the underlying dynamics, we must dive deep into human traits and our cognitive infrastructure.
It is not unreasonable for someone to think that their beliefs are robust, factual, and nonnegotiable and, as such, a viable justification for their judgments and actions. Consider, for example, the habit of smoking in our modern society. Smokers usually reframe any inconvenient evidence against smoking to justify their behavior and defend their presumably harmless habit. Instead of bridging the gap between the evidence and their belief, they spin the evidence to somehow normalize its immediate effect on their health, let alone its long-term consequences. This and similar scenarios play out with varying levels of drama. People feel twitchy and uncomfortable when they face evidence and facts that challenge their beliefs. This is Cognitive Dissonance. Similarly, if they mess up through a wrongful judgment, they feel guilt and discomfort, but their self-esteem kicks in and takes over, pushing them to justify the failure and spin the outcome one way or another. This is the High Road to Disaster.
Leon Festinger, a renowned and influential social psychologist, coined the term “cognitive dissonance” to describe the inner anxiety and discomfort one feels when one’s beliefs are proven flawed or otherwise challenged by facts and evidence, which pushes one to adjust the inconsistencies for the sake of restoring harmony among one’s thoughts. Surprisingly though, in his close study of a cult led by Mrs. Keech, who believed in the imminent end of the world as foretold by extraterrestrial aliens, Festinger showed that when firm believers are confronted with evidence and facts that contradict their beliefs, they, contrary to any rational expectation, become even more deeply entrenched in their position and even more confident in their beliefs. They twist facts, ignore evidence, and realign their objectives for one purpose only: to justify their poor judgment, maintain consistency between thoughts and behavior, and stay on track.
It is human nature to overestimate our capabilities and overrate our intelligence. As trained professionals with experience spanning decades, we reckon our actions are flawless and our decisions sound. When we face evidence contradicting our beliefs, we twist, spin, and normalize it. We focus on justifying the wrong rather than learning from the right. Ultimately, our ego and self-esteem do not permit otherwise, despite our agitated feelings.
But how can cognitive dissonance cause disastrous outcomes? To answer this question, we need to understand what leads to success and failure and dive deeper into the factors leading to and emerging from cognitive dissonance on the personal and social levels.
Recipe for Failure
Even though we all experience cognitive dissonance, the settings differ, and those differences create distinctions in the outcomes and consequences. For example, an organization that treats mistakes as sins punishable by the firing squad will evoke the complex interplay between cognitive dissonance, blame, and fear of punishment. Add to that the utterly primal nature of the human emotional response in such situations: cover up or draw attention away from errors and mistakes, blamestorm and preach, and, last but not least, normalize the deviance until it becomes the standard. Consequently, the organization loses the ability to learn and improve from what is supposed to be its “lessons learned.”
Take One
More likely than not, you have either experienced a big mess at work or seen one unfold due to poor judgment. Now, try to recall how senior management addressed the underlying causes of the failure. Chances are, you heard unjustified justifications, saw the hunt for scapegoats, witnessed the prosecution of the innocent, and took part in praising the uninvolved. The whole fiasco is simply a means of concealing the source of the problem while hoping for a solution, thereby preventing learning, improvement, and growth. Imagine how a senior manager with many years of experience would react upon messing up. Unless they are a true leader – a scarce commodity these days – they would probably preach, justify, blame the wrong cause, and prosecute the frontliners, while their self-esteem and inner fear of failure push them to conceal the mess. Nothing is more unnerving to professionals than questioning their rationality, criticizing their decisions, or threatening their professional identity. Yet they are who they are: self-confident and unlikely to admit to mistakes.
Take Two
There is a lot to be said about the healthcare sector. Unfortunately, to this day, medical errors continue to cause deaths and severe injuries despite technological advances. Some argue that medical practice is inherently risky, that healthcare involves intricate processes, and that the odds of mistakes are high, let alone the “Will of God.” Although that is true, the real issue has more to do with the prevailing culture and the lack of candor with which practitioners and governing bodies deal with mistakes. Research supported by hard evidence has shown that the healthcare systems in the United States and many other countries are plagued by concealment, back-covering, and evasion. Unlike in other sectors, healthcare governing bodies do not seem to have done enough to promote transparency and establish powerful, independent investigation frameworks. Part of the problem may lie in the litigation risks and professional liabilities doctors and practitioners face, especially if doors are left open for the public to push speculation and assumption far beyond the realm of facts. Irrespective of all this, one thing is almost certain: we shall keep hearing about fatal medical errors, with the “Will of God” offered as the justification, while the actual ingredients for improvement remain concealed. That is to say: a lack of acknowledgment prevents learning, which in turn inhibits progress.
Take Three
The importance of the judicial system can’t be overemphasized. It protects the rule of law and its supremacy, resolves disputes, safeguards people’s rights, and promotes social justice. In short, under a robust judicial system, we feel safe. Yet crucial mistakes happen under even the most sophisticated and advanced judicial systems. Captain Alfred Dreyfus, Archie Williams, Ron Williamson, Dennis Fritz, Juan Rivera, and Rubin “Hurricane” Carter all had one thing in common: they spent years behind bars for crimes they did not commit, convicted on junk evidence, unreliable testimony, and biased decisions rather than physical evidence.
The point under discussion here is not the mistakes committed by the prosecutors but how they addressed them when solid new evidence came to light. Putting innocent people on death row or behind bars for years is not a matter to be taken lightly. Nevertheless, what would one expect a prosecutor involved in such a case to say about ruining the life of an innocent person when they come face-to-face with evidence contradicting their original stance? Case research in the United States has shown that, despite DNA evidence presumably being indisputable, prosecutors more likely than not become dismissive, fearful of failure, defensive, and reputation-focused. In their attempt to justify their behavior, they twist the facts without letting go of their original beliefs. Even though such behavior is expected, it forecloses the opportunity to identify error traps and learn from them.
Take Four
Smartness and mental horsepower do not always guarantee success. In fact, the opposite is true in the presence of overconfidence fueled by arrogance. Arrogance restricts one’s ability to rethink options, adapt, and remain resilient in evolving situations. One becomes entrenched in one’s agenda no matter how far down the rabbit hole that agenda leads. History teaches us a lot, and we have heard strange stories of intelligent and successful people who ended up locked in prisons of their own making.
Chances are, you haven’t heard of Mike Lazaridis, who, like many others, has changed our lives in one way or another. At one point, Mr. Lazaridis was regarded as the unstoppable genius who created the BlackBerry and was on track to do much more. However, his run came to an end shortly after the introduction of the iPhone. Why? Because of his inability to see and weigh the evidence that would have allowed him to change his beliefs; cognitive dissonance was in full swing. Mr. Lazaridis never believed that physical keyboards would be trashed in favor of touch screens or that people wanted wireless devices for entertainment and social networking. Despite all the evidence to the contrary, he was confident that mobile devices would continue to thrive as business, email, and messaging devices. In his book Think Again, Adam Grant elaborates on this story and explains how Mr. Lazaridis and his team failed to adapt and recover from the setback even after acknowledging the new reality, because the failure to rethink options had spread beyond control.
Ingredients of Success
Take One
Citicorp Center, later renamed Citigroup Center, is regarded as one of the pioneering iconic buildings in New York. Its structural frame was an ingenious design and a major departure from conventional practice: the tower’s structural system rests on a central core and four piers placed at the center of the tower’s four faces. With its 59 stories and unique base, the building was among the tallest in midtown Manhattan when it was completed in 1977. While Hugh Stubbins was the prime architect, William LeMessurier, a renowned structural engineer, takes most of the credit for this marvel. After completion and occupation, an undergraduate student studying Citicorp Center’s structural system discovered a critical flaw in the design that left the building vulnerable, through weaknesses in its joints, to wind forces striking its corners. Through a series of communications, LeMessurier became aware of these findings, conducted several reviews and simulations, compared the results, and eventually acknowledged the magnitude of the issue and the danger it posed to the public. Faced with what can only be described as an engineer’s worst nightmare, LeMessurier stood at a turning point for his professional career and reputation: either sweep the issue under the carpet and hope for the best, or face the embarrassment, set aside his ego, and address the case on the spot. What came next was extraordinary. LeMessurier and his team drew up emergency plans, worked closely with various stakeholders, and coordinated and implemented a massive 24/7 repair scheme. To this day, the Citicorp Center case is studied in universities, and LeMessurier’s actions embody the ideal of professionalism, ethical behavior, leadership, and courage. Rather than hiding the facts for fear of professional execution, LeMessurier admitted his mistake, and in doing so he strengthened, rather than destroyed, his engineering practice.
At this point, the story might seem to have ended; however, there is a caveat. While LeMessurier did the right thing by acknowledging his mistake, informing the affected stakeholders, and deploying emergency rectification measures, his actions did not pass without controversy. To this day, many argue that LeMessurier and his associates kept the case secret within an inner circle until 1995, when it was revealed in The New Yorker, thereby withholding valuable knowledge from the engineering community for almost 20 years. Nevertheless, LeMessurier emerged as a hero by standing up to his ego and self-esteem, managing his cognitive dissonance, demonstrating wisdom, and acting responsibly.
Take Two
Over the years, we have heard about many air crashes that stunned the world. Yet the aviation industry is nowadays regarded as one of the safest. Why is that? Decision-makers in aviation promoted an environment structured around openness and transparency and were able to use failure as a mechanism for improvement. That does not mean the aviation industry is perfect, but rather that it is transformational. The black box has become not only a device that reveals secrets but also a motivator for pilots to report even minor near misses. In addition, the black box paved the way for the development of Crew Resource Management (CRM) and gave the industry every opportunity to learn and improve.
Besides rigorous testing and certification procedures, improved automation, and air traffic control, the aviation industry focused on the human factor. In aviation, reporting near misses is a crucial prerequisite for improvement. Nowadays, pilots are trained and encouraged to report errors irrespective of their severity, thanks not only to the black boxes but also to the pilots and air traffic controllers themselves.
Conclusion
The consensus is that learning from the outcomes of our actions is essential for growth and improvement. However, this learning process often encounters obstacles, with cognitive dissonance playing a significant role. In his acclaimed work “Black Box Thinking,” Matthew Syed discusses how the fear of blame and punishment and a personal dread of failure can stifle openness and inhibit learning. He further points out that threats to one’s ego and the potential damage to one’s self-worth can lead individuals to deny their errors, sometimes even to themselves. Consider a person who has long held and advocated a belief or stance, only to discover irrefutable evidence proving them wrong, with considerable repercussions. If this individual holds a leadership role within a team or organization, the response is predictable: to protect their self-esteem, they might alter facts, rationalize errors, and shift blame onto others. Remarkably, such individuals can come out seemingly unaffected, even more convinced of their stance than before.
Cognitive dissonance affects us all, and we each handle its discomfort in our own way. In his book “How Our Brains Betray Us,” Magnus Mcdaniels outlines three strategies to mitigate cognitive dissonance. The initial strategy involves modifying one’s beliefs, attitudes, or behaviors to align with new information. The second strategy focuses on emphasizing information that supports existing beliefs while downplaying contradictory evidence. The third strategy employs self-persuasion to deemphasize or disregard evidence that challenges one’s current beliefs. Of these, the first approach—adapting in response to irrefutable evidence—appears the most rational but is also the most difficult. It often means questioning and potentially changing long-held views, a process that demands a high degree of self-awareness, intelligence, bravery, and leadership. Simply put, it’s much easier said than done.
So, how do we deal with the fixed mindset? And how do we handle cognitive dissonance on the personal and, more importantly, the social level? The first step in reducing cognitive dissonance is to look at failure from a different perspective. As Matthew Syed put it, “When we see failure in a new light, success becomes a new and exhilarating concept.”
Recovering from a significant setback and adapting to a new reality while holding oneself together requires considerable effort and the bold choice to challenge the status quo. It requires an open mind and a mentality grounded in intellectual humility. Simply put, it requires a scientist’s thought pattern, not a preacher’s. It is worth noting here that Professor Adam Grant outlines four modes of thinking (otherwise called mindsets): preacher, prosecutor, politician, and scientist. Even though this article does not intend to elaborate in depth on all four, it is worthwhile to distinguish between the two most opposed, i.e., the preacher’s and the scientist’s mindsets.
Scientists seek the truth by exploring and analyzing options and are ready to expect and accept any potential outcome. The self-worth of these individuals isn’t influenced by being right or wrong about a particular topic. Instead, they are curious and open, and above all, they don’t fear failure but learn from it. They feel intrigued instead of defensive when they encounter information that contradicts their beliefs. As Grant put it, they “favor humility over pride and curiosity over conviction, …, look for reasons why they might be wrong, not just reasons why they must be right.”
Preachers, on the other hand, are hardcore believers, convinced they are right and others are wrong. They preach from intuition rather than evidence; they start with answers rather than questions; they let beliefs harden into ideologies. The opinions and views of individuals with a preacher’s mindset are so bound up with their loyalties that even the slightest questioning is considered shameful, or even profane. The self-worth of such individuals is driven by the degree to which people believe in and agree with their views. They consider themselves the unshaken kings of their strongholds.
Acknowledging and understanding the contrast between these two mindsets is essential to overcoming cognitive dissonance. Whether you are an engineer, doctor, pilot, lawyer, or anything else, you have probably fallen prey to your own dissonance trap. So take on the role of a scientist: be prepared for the unexpected, for disappointment, and for failure; ruminate; look for and analyze inconsistencies in your thinking; challenge your beliefs; scrutinize your actions; and seek out facts that conflict with your opinion rather than soothing your emotions.
As a final note, Antoine de Saint-Exupéry once said: “If you want to build a ship, don’t drum up people to collect wood and don’t assign them tasks and work, but rather teach them to yearn for the endless immensity of the sea.” In other words, people are not machines; don’t push them to do something before they have acquired a yearning for it. Their attitude needs to change: they need to accept diverse outcomes and to learn not to feel ashamed when they do something wrong. In short, there is more to it than being intelligent or knowledgeable; it is about having the right mindset.
So, the question I would leave you with is, what mindset are you in? A preacher’s mindset or a scientist’s mindset?