An important addition to any skeptic's toolkit is the ability to recognize cognitive biases. Cognitive biases are not so much specific kinds of mistakes as the motivating force behind entire patterns of mistakes. They are subtle tendencies of the human mind to diverge from rationality. One example is pattern recognition. Humans are good at recognizing patterns. Too good. We are more likely to see a pattern that isn't there than to miss a pattern that really is. This bias produces phenomena like pareidolia and the post hoc fallacy ("after, therefore because of"). We might trace it back to evolution: for our ancestors, missing a real pattern was much more dangerous than seeing a pattern that wasn't there.
But the most important cognitive bias of all is confirmation bias. Confirmation bias is the tendency to be more receptive to evidence and arguments that confirm one's existing views. The common description of confirmation bias is "remembering the hits and forgetting the misses." But it's not necessarily a matter of forgetting. More often, we simply underplay, undervalue, or wave away opposing evidence, while being especially receptive to confirming evidence.
It is important to understand that this does not always result in anything we might call a "fallacy." Cherry-picking evidence is a fallacy in that it is a misunderstanding of the nature of statistics. But what if we have various pieces of evidence, from many different sources, and we are unsure of the relative quality of each? What if we look into it and honestly reason our way to the conclusion that the confirming evidence is of better quality than the opposing evidence? Is that truly "irrational"? What if someone else does the same and draws the opposite conclusion? Cognitive biases allow the rational to blend into the irrational. Even as you use reason, it can betray you, perhaps even giving you greater conviction in your wrong beliefs.
Confirmation bias is an ever-present force. It is always in effect, for everyone. Some might even go so far as to say that confirmation bias accounts for all of human disagreement.
Eliminating confirmation bias is a difficult, perhaps impossible, task. In other people, you can only really point out the most obvious errors that result from confirmation bias. You can point out cherry-picking, or any double standards of evidence, but that's about it. In oneself, it's a bit easier, since, to some extent, we can control ourselves. To some extent, we can make ourselves into objective observers. But I wouldn't go too far down that path. You might just end up convincing yourself that you must always be right because you are always so darn objective. In your confidence, your own biases go unchecked.
If my thoughts may wander a moment: I once said that this was the problem with the Realist movement in art. The artists wanted to be more truthful and objective, just like science. Perhaps they succeeded to some extent, but in the end, they only created a better illusion of objectivity. Their approach was based on a misunderstanding of how science should work.
See, science's solution to confirmation bias isn't to make oneself into an objective, emotionless observer. Science's solution is the peer review process. A bunch of humans with knowledge of the facts come together, with all their different emotions, opinions, and biases, and together they produce truth. That is the best solution to confirmation bias, or about as good a one as we'll ever come up with.
Of course, outside of science, we don't always have the benefit of peer review. Ah, well, we just have to make do.
1 comment:
I had thought science's attempts to falsify theories were one of the strongest antidotes to confirmation bias, forgetting to consider the value of having people's biases balance one another, until you mentioned the peer review process as a solution to confirmation bias. Good point on that. So I borrowed your words when I wrote my own confirmation bias post. (http://www.thinktoomuch.net/2010/08/30/confirmation-bias/) Thanks! ;)