People like to think they’re rational, in control, and that their beliefs make sense. I like to think so, too. As a journalist and a researcher, I’d like to think that I can follow the evidence and that my opinion will simply go where the proof leads.
That so many people hold beliefs in direct conflict with prevailing scientific wisdom casts doubt on that assumption of rationality. The Pew Research Center found in a report last week that while most tea-party Republicans doubt that climate change is happening, most non-tea-party Republicans say it is occurring.
It’s easy to mock people who deny that climate change is occurring (even when 97 percent of climatologists agree that it is) or those who place more value on the creation myth than on the Theory of Evolution. One is, by definition, a myth. And the other is based on 150-plus years of scientific research.
Clearly, the overwhelming majority of reliable evidence supports the Theory of Evolution and the existence of climate change. So have these people been brainwashed? Are they impaired somehow?
Yup. They all suffer from a very serious affliction that makes them susceptible to the confirmation (or myside) bias: they typically pay attention to evidence that confirms what they already believe while ignoring evidence that contradicts their preconceived beliefs.
If you are a human, you also suffer from this exact same condition. People believe what they want to believe. Unbiased rationality rarely plays a role when it comes to beliefs, especially deeply held moral views.
A classic social psychology experiment from 1979 demonstrated this phenomenon. Advocates for and against the death penalty were shown two pieces of evidence, one supporting and one negating the punishment’s deterrent effect on crime, and both equally strong. Like good little humans, the subjects played down the strength of the evidence that contradicted their beliefs while overemphasizing the evidence that confirmed them.
The same basic experiment has been replicated with other issues, from stereotypes about gays to a 2004 study on whether Iraq had weapons of mass destruction.
This seems like common sense. Of course people do that. People don’t like to be wrong, so naturally they’ll believe whatever information proves they’re right and disregard anything that suggests otherwise. And because you’re an enlightened individual who is aware of the myside bias and examines evidence from both sides of an argument, you’re immune to it, right?
Wrong. The strangest thing about many psychological and sociological concepts is that nearly everyone (including me) tends to attribute them to everyone else while failing to see that we ourselves are doing exactly what the theories and evidence predict. We have this weird tendency to assume that we’re above average, that we know better, so we won’t act that way. Oh, those silly little humans. When will they ever learn?
Nevertheless, this presents a serious problem for me. I like evidence. It’s useful for guiding an argument, but maybe I just reach for the facts and statistics that support my opinion. Truth be told, I almost certainly have done so (unintentionally).
There is no surefire cure for the myside bias other than being mindful of its universality. Every journalist, scientist, professor, doctor, lawyer, and your Great Aunt Jemima has it. Above all else, remember that the myside bias applies not just to everyone else but especially to you.