🧠 Psychology

Confirmation Bias: The Habit of Seeking Only What You Already Believe

Peter Wason's 2-4-6 task showed how the mind hunts for confirming evidence and avoids tests that could disprove it. Why intelligence often makes the bias worse, and what actually counteracts it.

May 5, 2026


In 1960, the British psychologist Peter Wason ran a deceptively simple experiment. He told participants the sequence "2, 4, 6" followed a rule, and asked them to figure out the rule by proposing more three-number sequences and being told whether each one fit. They could test as many sequences as they liked.

Most participants proposed something like "8, 10, 12," were told it fit, then "20, 22, 24," were told it fit, and confidently announced the rule: consecutive even numbers ascending by two.

The actual rule was much simpler: any three numbers in increasing order. "1, 2, 3" would have fit. "1, 17, 1,000,000" would have fit. The participants proposed only sequences they expected to fit and never tested ones that might fail. Once they had a hypothesis, they hunted for evidence that confirmed it.
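
For readers who like to see the trap concretely, here is a minimal sketch of the task, assuming a simple Python rendering. The function name hidden_rule and the probe values are illustrative; only the rule itself comes from Wason's study.

    # A minimal sketch of Wason's 2-4-6 task. Only the hidden rule comes
    # from the study; the function name and probe values are illustrative.
    def hidden_rule(seq):
        """Wason's actual rule: any three numbers in strictly increasing order."""
        a, b, c = seq
        return a < b < c

    # A participant who guesses "consecutive even numbers ascending by two"
    # and tests only sequences expected to fit sees nothing but confirmation:
    for probe in [(2, 4, 6), (8, 10, 12), (20, 22, 24)]:
        print(probe, hidden_rule(probe))  # True every time

    # Every test "passes," yet the guess is wrong: none of these probes
    # ever gives the hidden rule a chance to contradict the hypothesis.

Three confirmations feel like proof, yet they cannot distinguish the narrow guess from the real rule; only a probe expected to fail could do that.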

Wason called it confirmation bias. He may have named the most powerful and least visible cognitive habit human beings have.

What Confirmation Bias Actually Is

Confirmation bias is the tendency to seek, interpret, and remember information in ways that support what you already believe. It is not a single mistake but a family of related habits, all of them biased in the same direction:

  • Selective search. When investigating a question, we look for evidence that fits our existing view and rarely look for evidence that would refute it.
  • Selective interpretation. Ambiguous information gets read as supporting our position. The same data can confirm two opposing views to two different people.
  • Selective memory. We remember the cases that fit and forget the ones that did not.

The bias is unusually durable because all three failure modes reinforce each other. We notice supporting evidence more, interpret it more charitably, and remember it more vividly — three thumbs on the same scale.

Why Smart People Are Often Worse at It

A counterintuitive finding from cognitive psychology is that intelligence and education do not protect against confirmation bias — and may quietly worsen it. Dan Kahan and colleagues at Yale documented a phenomenon they called motivated numeracy: highly numerate people outperform less numerate people at evaluating data when the data is politically neutral, but lose that advantage, and sometimes become more polarized, when the same data threatens their political identity.

The bigger your tool kit, the more elaborate your defense of what you already believe.

The reason is straightforward. A clever person is better at constructing arguments. If they are motivated to defend a conclusion, they construct better defenses. The same intelligence that lets you reason your way to the truth also lets you reason your way to wherever you want to go.

Where It Shows Up in Ordinary Life

Confirmation bias is not an academic curiosity. It is everywhere in daily life:

  • Hiring. A manager who forms a quick impression in the first thirty seconds of an interview spends the rest of the conversation finding evidence to support that impression.
  • Medicine. Doctors who anchor on an initial diagnosis can miss critical symptoms that point elsewhere — a documented source of diagnostic error called anchoring bias, a close cousin of confirmation bias.
  • Relationships. Once you have decided someone is hostile or kind, ambiguous behavior gets coded to fit. The same neutral text message reads as warmth or coldness depending on the assumption you brought to it.
  • News consumption. Algorithmic feeds amplify what users already believe. The bias predates the algorithm — but the algorithm rewards it efficiently.
  • Christian discipleship. Believers are not exempt. Reading Scripture to find what we expect to find rather than what is actually there is a constant temptation. The Bereans were commended in Acts 17:11 precisely because they tested even the apostle Paul's teaching against the Scriptures rather than accepting it because it sounded right.

How to Fight It

Confirmation bias cannot be eliminated. The mind is not built to weigh evidence neutrally. But you can build practices that compensate:

Ask what would change your mind. If no possible evidence would alter your view, you are not holding a belief — you are holding a commitment. Both can be legitimate, but they are not the same.

Steelman the opposing position. Most people argue against the weakest version of what they disagree with. The discipline is the opposite: try to state the other side better than its strongest advocates. If you cannot, you do not yet understand it.

Seek disconfirming evidence on purpose. When you think you know how something works, look specifically for the cases where your model would fail. Wason's task can be solved by anyone willing to propose "1, 1, 1" or "5, 4, 3" and pay attention to the answer.
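
Those probes are exactly the kind the sketch of the task above rewards. A hypothetical rerun, with the rule repeated so the snippet stands alone:

    def hidden_rule(seq):
        a, b, c = seq
        return a < b < c  # any strictly increasing triple

    # Probes chosen because you expect them to fail your own hypothesis:
    for probe in [(1, 1, 1), (5, 4, 3), (1, 2, 3)]:
        print(probe, hidden_rule(probe))
    # (1, 1, 1) -> False, (5, 4, 3) -> False, (1, 2, 3) -> True.
    # The last answer is the decisive one: a sequence that breaks
    # "consecutive even numbers" still fits, so that hypothesis is dead.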

Slow down at moments of high certainty. The feeling of obvious rightness is exactly when confirmation bias is most active. Real thinking often feels less satisfying than confirmation does, because confirmation rewards us with a small dopamine hit of feeling right.

Cultivate trustworthy disagreement. Surround yourself with people who care about you enough to tell you when they think you are wrong. A small number of honest critics is worth more than a large number of cheerful agreers.

The Habit Beneath the Bias

The deepest version of confirmation bias is not intellectual. It is a habit of self-protection. We confirm what we already believe because changing our beliefs is expensive — socially, emotionally, sometimes spiritually. To admit you were wrong is to lose ground you have stood on. To stay wrong but confident is, in the short term, more comfortable.

Honesty about reality requires a willingness to be discomfited. The wisdom traditions called this humility. The cognitive scientists are simply pointing out that without it, your reasoning is mostly an elaborate machine for protecting what you already think.

The first step toward seeing clearly is admitting how much of what feels like seeing is actually defense.


References

Peter C. Wason, "On the Failure to Eliminate Hypotheses in a Conceptual Task," Quarterly Journal of Experimental Psychology, vol. 12, 1960, pp. 129-140.
Raymond S. Nickerson, "Confirmation Bias: A Ubiquitous Phenomenon in Many Guises," Review of General Psychology, vol. 2, 1998, pp. 175-220.
Dan M. Kahan et al., "Motivated Numeracy and Enlightened Self-Government," Behavioural Public Policy, vol. 1, 2017, pp. 54-86.
Daniel Kahneman, Thinking, Fast and Slow, Farrar, Straus and Giroux, 2011, chs. 7-8.
Hugo Mercier and Dan Sperber, The Enigma of Reason, Harvard University Press, 2017.
The Holy Bible, English Standard Version (Acts 17:11).