In 1988, a group of psychologists ran an experiment in which subjects were asked to evaluate arguments they had either written themselves or read passively. The arguments were the same. But subjects rated the ones they had written as significantly more convincing — not because they were better, but because they were theirs. The effect was not about confidence or familiarity. It was something more specific: we are drawn to our own ideas in a way that is genuinely difficult to override with evidence.
This is one of the better-replicated findings in social psychology, and it goes by several names depending on which facet researchers are studying. When it involves clinging to beliefs under pressure, it is called belief perseverance. When it involves seeking out confirming information, it is confirmation bias. When it involves interpreting ambiguous data in whatever direction favors our prior view, it is motivated reasoning. These are not the same process, but they share a common root: the human mind protects its existing models of the world more aggressively than we tend to assume.
Why Minds Resist Revision
The traditional model of belief change — what we might call the "information transfer" model — assumes that people update their beliefs when they receive good evidence. Give someone clear facts and logical arguments, and they will adjust accordingly. This model underlies much of journalism, education, and public health communication. It is also largely wrong, or at least deeply incomplete.
The cognitive science literature on this is extensive. When people encounter information that challenges a strongly held belief, the brain often treats it as a threat. The same neural systems involved in threat response — associated with anxiety, defensiveness, and fight-or-flight — activate when core beliefs are challenged. This is not metaphorical. Studies using neuroimaging have found that the anterior cingulate cortex, a region involved in conflict detection, shows heightened activity when people process belief-disconfirming information.
What this means in practice is that the more important a belief is to someone's identity, the harder it is to revise through argument alone. Political beliefs, religious commitments, deeply held moral convictions — these are not just data points in a mental model. They are structural. Changing them feels like damage to the self.
We do not hold beliefs the way we hold objects — we hold them the way we hold identities.
The Backfire Effect and Its Limits
For several years, a phenomenon called the backfire effect received significant popular attention: the idea that correcting misinformation could actually strengthen people's false beliefs by triggering defensiveness. The original 2010 study by Brendan Nyhan and Jason Reifler generated enormous interest.
Subsequent research, however, has complicated the picture considerably. Multiple replication attempts have found that the backfire effect is rare, context-dependent, and possibly an artifact of specific experimental conditions. The more robust finding is that corrections generally do work — people do update their beliefs when given accurate information — but the updating is often partial, slower than the information-transfer model predicts, and most effective when delivered in ways that don't feel threatening to identity.
This is a more nuanced and arguably more useful conclusion than "corrections backfire." It means that how you present information matters enormously. Framing, source credibility, emotional tone, and the social context of the exchange all affect whether information leads to genuine updating or triggers defensiveness.
When Beliefs Are Worth Protecting
It would be a mistake to conclude from all of this that belief perseverance is simply a malfunction. In many contexts, it serves a function. A mind that revised every belief at the first piece of contrary evidence would be epistemically unstable. Holding a belief steady under some degree of challenge is not stubbornness — it is prudence. The question is whether the resistance is proportionate and responsive to genuine evidence, or whether it has become insulated from evidence altogether.
Psychologists sometimes distinguish between reflective and reflexive resistance to belief change. Reflective resistance involves actually engaging with the evidence and finding it insufficient — a legitimate epistemic response. Reflexive resistance involves dismissing evidence without engaging it, because engaging feels dangerous. The second pattern is where motivated reasoning becomes a problem.
Intellectual humility — the disposition to take seriously the possibility that you are wrong — is consistently associated with better epistemic outcomes: more accurate beliefs, more willingness to update, better performance on prediction tasks. It is not the same as lacking convictions. A person can hold strong views and still approach disconfirming evidence as genuinely interesting rather than threatening. The combination is rarer than it should be.
Practical Implications for Changing Your Own Mind
If you want to be the kind of person whose beliefs track reality more closely over time, a few things seem to help.
Separate belief from identity selectively. Some of your beliefs are genuinely part of your identity and should be. Others are contingent positions you happened to form under particular circumstances. Practicing the distinction — asking yourself which beliefs you hold because they are you versus which you hold because they seem true — creates a little space for honest revision.
Steelman before you rebut. Before dismissing an opposing view, try to state it in its strongest form — the version a thoughtful proponent would actually endorse. This is harder than it sounds. It slows down the defensive reflex and occasionally reveals that the opposing view has more going for it than you assumed.
Track your predictions. People who write down their predictions and then review them — a practice promoted by researchers like Philip Tetlock in his work on forecasting accuracy — tend to improve their calibration over time. The feedback loop between prediction and outcome is what the mind needs in order to update, and the mind rarely builds that loop on its own.
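To make that feedback loop concrete, here is a minimal sketch, in Python, of what a personal prediction log might look like, scored with the Brier score, a standard calibration measure in forecasting research. The predictions, field names, and numbers below are illustrative assumptions, not data from any study or from Tetlock's work.

```python
# Minimal sketch of a personal prediction log scored with the Brier score.
# All predictions below are illustrative placeholders, not real data.

from dataclasses import dataclass

@dataclass
class Prediction:
    claim: str          # the statement being forecast
    probability: float  # your stated probability that it turns out true (0 to 1)
    outcome: bool       # what actually happened, filled in later

def brier_score(predictions: list[Prediction]) -> float:
    """Mean squared gap between stated probability and outcome (0 is perfect)."""
    return sum((p.probability - float(p.outcome)) ** 2 for p in predictions) / len(predictions)

log = [
    Prediction("Project ships by March", 0.9, False),
    Prediction("Candidate X wins the primary", 0.6, True),
    Prediction("Paper accepted on first submission", 0.3, False),
]

# Lower is better; reviewing the misses is where the actual updating happens.
print(f"Brier score: {brier_score(log):.3f}")
```

The point of the exercise is less the number than the review: a written record removes the option of quietly remembering yourself as having predicted whatever ended up happening.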
Expect the process to be uncomfortable. Genuine belief revision almost always involves some discomfort. If you are only comfortable updating on trivial matters, you are probably not updating on the things that matter. The willingness to sit with that discomfort — without either capitulating immediately or hardening into defensiveness — is the actual work of intellectual growth.
None of this makes the mind cooperative. It makes it slightly less resistant. That, it turns out, is enough to matter.
¹ Ziva Kunda — "The Case for Motivated Reasoning" (1990), Psychological Bulletin
² Jonathan Haidt — The Righteous Mind: Why Good People Are Divided by Politics and Religion (2012, Pantheon Books)
³ Peter Wason — "Reasoning about a rule" (1968), Quarterly Journal of Experimental Psychology



