A vivid claim made the rounds in the early 2010s: when you correct someone's misinformation, their belief in the false thing can actually grow stronger. The phenomenon was called the backfire effect. It became a fixture of popular psychology, repeated in op-eds, TED talks, and exasperated dinner conversations: "Don't bother arguing with people — corrections backfire."
The story is elegant, alarming, and mostly wrong. Or, more precisely, half-right in a way that has been overstated. The science of belief change is more interesting than the slogan, and worth knowing if you actually care about persuading anyone about anything.
Where the idea came from
The backfire effect was named in a 2010 study by political scientists Brendan Nyhan and Jason Reifler. In a series of experiments, they presented participants with mock news articles containing false claims (e.g., that Iraq had weapons of mass destruction at the time of the 2003 invasion) followed by corrections. They reported that, for some ideologically motivated participants, exposure to the correction increased belief in the original misinformation rather than decreasing it.
The finding was striking. It seemed to confirm what many already suspected: that people who hold beliefs strongly will not only resist correction but double down on the falsehood when challenged. The phrase "backfire effect" entered the popular vocabulary, often used as shorthand for "facts don't change minds."
What replication revealed
Subsequent research has not been kind to the strong version of the claim. Large, well-powered replication attempts have struggled to reproduce backfire effects. A 2019 paper by Thomas Wood and Ethan Porter, "The Elusive Backfire Effect," tested 52 issues across more than 10,000 subjects and found that corrections generally worked, moving beliefs toward accuracy, and produced backfire in only a single, narrow case (and even that did not always replicate).
Nyhan himself, one of the original authors, has since published work acknowledging that backfire is rare. In a 2018 piece for the American Journal of Political Science and elsewhere, he revised the conclusion: corrections typically reduce misperceptions, even among committed partisans and even when the corrections challenge identity-relevant beliefs.
The strong claim — that correcting misinformation predictably entrenches it — does not survive contact with the data.
What the data actually shows
The picture is more nuanced and, in some ways, more useful:
- Corrections generally work. Across most issues, people exposed to factual corrections update toward accuracy. The effect is small but real.
- Corrections rarely backfire. True backfire — moving belief in the wrong direction — is uncommon and usually small.
- But behavior often doesn't change. This is the more durable finding. People may update their factual beliefs while their attitudes, votes, and choices stay the same. A voter may concede that a politician's claim was inaccurate and continue supporting them. The disconnect between belief change and behavior change is the real challenge for fact-checkers.
- Identity-protective cognition is real. Yale legal scholar Dan Kahan's research shows that on issues tied to group identity (climate, vaccines, guns), people process evidence in ways shaped by what their tribe believes. They don't reject facts so much as filter them.
Why the slogan stuck
If the backfire effect is rare, why did "facts don't change minds" become so popular?
Three reasons stand out. First, it confirms a satisfying intuition: that the people who disagree with us are not just wrong but irrational, beyond reach, immune to evidence. The slogan provides a tidy explanation for why argument so often fails.
Second, it relieves us of responsibility. If correcting misinformation backfires, then trying to correct it is pointless, and giving up is morally excused.
Third, it has a kind of literary resonance. Boomerang effects, backlashes, and "the harder you push, the harder they push back" are familiar narrative shapes. The brain holds onto vivid framings.
But none of this makes it true.
What actually moves beliefs
The research that has held up — and there is a lot of it — points to several principles for changing minds well:
Source matters. Corrections are more effective when they come from a trusted source — and "trusted" is heavily shaped by group identity. A correction from a co-religionist, a fellow partisan, or a respected in-group member tends to land better than the same correction from an outsider.
Affirm before correcting. A long line of research (Kahan, Cohen, Sherman) shows that people are more open to threatening evidence when their broader self-worth has been affirmed. Briefly recalling a value or a success unrelated to the issue makes people more willing to update their beliefs about it.
Replace, don't just refute. Telling people what is not true is less effective than giving them a coherent alternative narrative. The brain is uncomfortable with informational vacuums; if you only debunk, the falsehood often stays in place because nothing has been put where it was.
Ask, don't tell. Motivational interviewing techniques — open questions, reflective listening — outperform direct confrontation in changing entrenched views. People persuade themselves more durably than they are persuaded by others.
Time matters. A single correction often produces a brief belief update that fades. Repeated, varied corrections from multiple sources produce more durable change.
What this means for honest argument
The popular adoption of "the backfire effect" produced an irony of its own. People who believed in it stopped trying to correct misinformation, on the grounds that doing so was counterproductive. The research, however, suggests the opposite: corrections do work, on average, even when they don't transform the listener.
This doesn't make persuasion easy. Belief change is slow, partial, and often invisible to the person doing the convincing. But it does mean that the case for honest, careful, persistent correction is stronger than the slogan suggests.
The next time you hear "you can't change anyone's mind with facts," it's worth remembering: the science you'd be appealing to has itself changed its mind on that question. The evidence won. Quietly, undramatically, but unmistakably.
That, in itself, is a small confirmation that minds can change after all.



