Why facts don’t always change our minds

This page last updated March 6th, 2023

The better informed we are, the better decisions we’ll make. Right?

Not so much!

It turns out that facts often don’t change our minds, especially if we already believe something different. There are many reasons for this.

On this page, we look at the reasons why humans are resistant to changing our minds. In a follow-up page, we’ll look at how we can do better for ourselves, and when talking to others.

Facts, beliefs and opinions

A fact is something that is real or true. But this means there are many unknown or yet-to-be-discovered facts. So sometimes a fact is defined as “something known to be true”. But then it depends on who knows, and who we trust to know.

An opinion, on the other hand, is “a view or judgement formed about something, not necessarily based on fact or knowledge”. Many opinions are personal (e.g. favourite music) but others are about important matters like politics, health or ethics.

Beliefs are often considered to be deeply held opinions about matters like religion, ethics, politics or personal identity. But as used by philosophers and psychologists, a belief is anything we hold to be true, whether it be based on fact or opinion.

We have all observed people holding beliefs contrary to fact, and being very resistant to changing those beliefs. Psychologists have studied this phenomenon and discovered aspects of the human brain, behaviour and identity that lead to this.

Ways we avoid facts

Confirmation bias

Confirmation bias occurs when we are selective about which information we accept – we most easily accept information that confirms our existing viewpoint and refuse to consider facts that oppose our beliefs. Sometimes called “myside bias”, it occurs most readily when an issue has clearly drawn sides, and we accept only information that comes from our own side.

Belief perseverance

We have a tendency to hold onto beliefs even when we are presented with new, contrary information.

Motivated reasoning

When confronted with information that threatens our existing viewpoint, we are unlikely to analyse and interpret it well. It is more likely that we’ll explain the information away than allow it to correct wrong thinking. It turns out that smart people are more likely to use “motivated reasoning”.

We think we understand more than we do

It is impossible to understand everything there is to know about any given subject, and we are quite comfortable using technology even when we don’t understand how it works. But this means we can easily forget how little we know, and form beliefs based on very little. This can show itself in various ways:

  • Illusion of Explanatory Depth: we mistakenly think we know enough to have a reasonable opinion.
  • Dunning-Kruger Effect: people who don’t know much are more likely to over-estimate what they know.
  • Ignorance Gap: we are often uncomfortable not knowing something, so we fill in the gaps with assumptions and intuition.
  • Communities of “knowledge”: if baseless opinions are shared with others, the effect is self-reinforcing.

Avoiding complexity

Difficult concepts require time and mental energy to work through, so simplistic explanations are sometimes preferred.

Groupthink

People in groups can find it easier to think the same way – sometimes called “groupthink” or “herd mentality”. There are many reasons for this, and there will be a separate page on groupthink soon.

Why we can want to avoid facts

Psychologists and neuroscientists can explain why we commonly behave this way.

In the brain

Different parts of our brain are involved when our beliefs are challenged. The prefrontal cortex is the part of the brain that controls the “executive functions” of reasoning, planning and decision-making. If we are considering our beliefs dispassionately, this is where our thought processes would be focused.

Holding firm in our beliefs releases positive hormones, including dopamine and adrenaline, and we feel good. But if our beliefs are challenged, we can become stressed. And when we are stressed, another part of the brain, the amygdala, can be activated.

The amygdala activates our more instinctive “fight or flight” response. When our beliefs are challenged, “fighting” might mean raising our voice, arguing and so on, while “flight” might mean refusing to consider the opposing viewpoint any longer.

At first, the prefrontal cortex can override the amygdala, and we may continue to respond rationally. But prefrontal cortex processes can be tiring, and the amygdala doesn’t deal well with complexity, so the emotional rejection of new ideas can take over. The stress hormone cortisol is released, executive functions are shut down and the amygdala responds instinctively.

If we end up facing an unwelcome change in our beliefs, we can feel threatened, uncertain and anxious. This activates another part of the brain, the insula, which supports the amygdala in trying to avoid making the change.

These processes are hard-wired into the brain, and affect all of us. They can be controlled and overcome, but they make it emotionally harder to think rationally, and so they often lead to the responses we have noted, such as motivated reasoning and confirmation bias.

In the mind

Psychologists have identified several ways of thinking that can lead to mistrusting facts:

  • Self-preserving bias: the brain tries to eliminate any thought that would lead to a negative perception of ourselves (which can include being wrong) – “identity trumps truth”. This is particularly so when we have held our beliefs since childhood.
  • The endowment effect: most of us are more afraid of losing something than we are eager to gain something of equal value. Our beliefs can be like possessions: we feel negative emotions at the thought of giving them up.
  • Ignorance gap: humans can be uncomfortable not knowing why certain events happen, and our brain is hard-wired to “fill in the gaps” intuitively and assign causality without sufficient reason. This means we don’t always think rationally about things we don’t understand well, and it can make us mistrust experts who have researched the matter and come to a different conclusion to the one we have intuited.
  • Anchoring bias: we have a tendency to believe the first piece of information we hear.
  • Believing what we want: it takes more information to convince us to believe what we don’t want to be true than to believe something we want to be true.
  • Emotional commitment: we find it easier to change our minds about neutral facts than about beliefs that we feel invested in.
  • How we feel: “individuals are motivated by fears, hopes, desires or prior beliefs rather than by facts alone”.
  • Backfire effect: hearing an alternative view may polarise and cause us to harden in our beliefs against that view.

It is clear, then, that brain chemistry, psychology and emotion can all have a large impact on how we respond to unwelcome information, and can easily lead us to reject true information.

Evolutionary explanations

I am a little sceptical of some evolutionary explanations of human behaviour. While I know biological evolution to be true, I am less sure about some explanations that seem a little speculative. Nevertheless, there are evolutionary explanations of these cognitive biases that may be helpful.

Evolutionary psychologists suggest that our reasoning ability evolved not so much to help us find truth but to help preserve collaborative social interactions in groups. One of the human race’s biggest evolutionary advantages, it is said, is our ability to work together as a team. So cognitive abilities that facilitate this will be advantageous.

So confirmation bias can help retain group harmony. People who support the group consensus are more likely to be successful in relationships. If I can win arguments and convince others, even if my arguments are wrong, I am more likely to be chosen as a leader. (We can all see examples of this in recent politics!)

The bottom line

So there are many reasons why we may reject facts that threaten our identity or don’t suit us. This may be particularly true in matters of politics and religious belief. We are all prone to this at times, but there are ways we can reduce our non-rational decisions and help others do the same. That will be the subject of another page.
