We often find that conversations can play out as follows.
Anti-Vaccine friend arguing that vaccines are bad:
“There was a scientific study that showed vaccines cause autism.”
You reply …
“Actually, the researcher in that study lost his medical license, and overwhelming research since then has shown no link between vaccines and autism.”
Their rebuttal to those facts is then …
“Well, regardless, it’s still my personal right as a parent to make decisions for my child.”
… and that leaves you completely stunned, wondering why on earth anybody would just ignore reality like that and instead embrace an idea that has clearly been shown to be untrue.
This is of course not specific to vaccines; it is a meta-pattern that holds almost universally across the human species. Whether the discussion is about climate change, aliens, bigfoot, ghosts, or religious beliefs, there is a strong likelihood that you will come across it.
Is there any understanding of what is going on here?
Yes, there is. Troy Campbell and Justin Friesen write in Scientific American about the research they have been doing on this …
Our new research, recently published in the Journal of Personality and Social Psychology, examined a slippery way by which people get away from facts that contradict their beliefs. Of course, sometimes people just dispute the validity of specific facts. But we find that people sometimes go one step further and, as in the opening example, they reframe an issue in untestable ways. This makes potential important facts and science ultimately irrelevant to the issue.
What they did was this: they presented a group of people with a set of facts regarding a contentious issue (same-sex marriage) and observed the responses. When the facts aligned with a participant’s held position, the participant stated that their position was clearly fact-based; but when the facts contradicted that position, they quickly retreated into an untestable stance and claimed that same-sex marriage is not about facts at all, but rather about morality.
Key point: this was not just a rejection of the facts, but a rejection of the idea that facts played any part at all in the decision-making process, and that is quite an interesting observation.
They tried the same with religion. A group was given a fact-based article that was critical of religion, and once again the researchers observed a retreat away from facts and into the untestable domain, with the deployment of words such as “faith”.
Facing the facts
There is a common assumption that any group holding a different view simply lacks a bit of information, and that if presented with the missing information they will change their minds. Here are a few examples …
- Those anti-vax believers simply don’t understand the science and the evidence; if I present them with the facts, they will quickly change their minds.
- I’m a religious Pentecostal, and as I sit here in my church and look out across the street at the Catholics, I’m thinking that they simply lack a bit of information. If I presented them with my facts, they would convert and come join us.
- Clearly the world is indeed warming up, humans are causing it, and so the facts are clear. If I just present the missing bits of information to those climate change deniers, then they will change their minds.
Now, in your own mind, rate the probable success of each of the above.
So what this new study reveals is this …
We find that when facts are injected into the conversation, the symptoms of bias become less severe. But, unfortunately, we’ve also learned that facts can only do so much. To avoid coming to undesirable conclusions, people can fly from the facts and use other tools in their deep belief protecting toolbox.
So how do we address this, especially when we are all potentially prone to it?
Well, here is one of the tests that they conducted …
In an experiment with 179 Americans, we reminded roughly half of participants that much of President Obama’s policy performance was empirically testable and did not remind the other half. Then participants rated President Obama’s performance on five domains (e.g., job creation). Comparing opponents and supporters of Obama, we found that the reminder of testability reduced the average polarized assessments of President Obama’s performance by about 40%.
So it turns out that being aware of this very natural bias, and then focusing on the testability of the issue, really does help.
Encouragement to reject the dogmatic embrace of ideology and engage in critical thinking does work, but only to a certain extent, and I personally suspect we will continue to observe the persistence of bias, because people become emotionally invested in specific ideas and cling to them. As this study reveals, one strategy is the retreat into an untestable position that is immune to facts; yet some gentle, friendly encouragement to test those ideas will indeed make a difference.
“we can become a people more free of ideology and less free of facts” – Troy Campbell and Justin Friesen
Links
- The research paper – The psychological advantage of unfalsifiability: The appeal of untestable religious and political ideologies – Journal of Personality and Social Psychology, Vol 108(3), Mar 2015, 515-529