The Backfire Effect — It's why you can't change a Christian's or Trump follower's mind...and they can't change yours

Have you ever argued anyone out of their Christianity or their support of The Trumpster?

Neither have I.

The backfire effect occurs when, in the face of contradictory evidence, established beliefs do not change but actually get stronger. The effect has been demonstrated experimentally in psychological tests, where subjects are given data that either reinforces or goes against their existing biases - and in most cases people can be shown to increase their confidence in their prior position regardless of the evidence they were faced with.

In a pessimistic sense, this makes most refutations useless. (source)


Replies to This Discussion

I was a college student at the time, working the graveyard shift as a security guard, and this nut job accosted me every day with his religious shit. I got ultra pissed, and we sat down in an empty caf and had a heated debate. He left crying and admitted it was all nonsense. But I would not be surprised if he later resumed his faith.

I just think there are exceptions. 

I also witnessed my cousin, at the time a devout believer, ask my dad how he could possibly be an atheist. It literally took only a few minutes to deconvert my cousin. In his case, though, he had been of the opinion that there was a solid foundation for religious belief. He has been a strong atheist ever since.

I think your contention is mainly accurate. But there are exceptions.

I just checked. I never said there are no exceptions.

BTW, I find it annoying when you make a general statement and someone chimes in with either an exception or the statement "there are exceptions." 

It was a general idea, not proposed as a Newtonian law.

Are you always so needlessly contentious? 

Your words follow.

Have you ever argued anyone out of their Christianity or their support of The Trumpster?

Neither have I.


I have run across many people who think they are special or interesting because they believe in wacky stuff. I find them dull and boring, and they can probably tell.

Yes. Dull and boring...regardless of how animated and passionate they are about their untruth. It's no coincidence that almost all of my friends are, for the most part, reasonably rational people. Christopher Hitchens was at least partly right in his comment: there's nothing worse than being boring.

I don't think he meant the naturally shy or the not-so-socially-skilled...but instead those who blab about shit no one cares about, talk about shit they cannot defend, or needlessly pick at and contest every sentence you make. That...and buzz killers.

The point of debating isn't to change your opponent's mind; it's to change the minds of the audience. Obviously, debating vaccines with an anti-vaxxer isn't going to change their mind... but it may change the mind of someone searching for information on which side of the debate they want to be on.

As an example, I wasn't sure what to make of the debate around the minimum wage. On the one hand, working a full week ought to pay enough to provide for yourself and your children. But, on the other hand, it means employers expect a certain amount of productivity from their workers, meaning people unable to perform at that level are essentially doomed, never able to gain experience and earn any money. I watched some debates on YouTube and decided I'd rather be employed for peanuts at the start of my life/career than be doomed to never enter the job market. So the debates helped me decide which side of the debate I wanted to support.

It's been a long time. I left for Facebook groups and found a local community of atheists. I just signed on here to try to get an old email and found this thread. But people seriously misunderstand the backfire effect; there are ways around it.

The source I would most recommend looking at is Lewandowsky et al. But we have been talking about this in my Facebook group really recently.

Wow, long time no see, John!

Yeah, totally on Facebook these days. Got more into scientific skepticism. How have you been?

This, that and the other.  Pretty much the same.

In general I think unseen has understood the backfire effect, though perhaps the statement

In a pessimistic sense, this makes most refutations useless.

could be seen as a broad claim that most refutations are useless in general, rather than useless only for those whose worldview is threatened...but I highly doubt unseen meant the former; far more likely the latter.

I haven't read Lewandowsky though I have read Nyhan and Reifler:

Steele (1988) offers one potentially important psychological explanation for this phenomenon. According to his account, individuals are motivated to protect their general self-integrity from threat, including unwelcome information that calls their beliefs and attitudes into question. As such, they tend to reject such information or interpret it in a favorable manner. In this view, individuals who encounter threatening information are motivated to restore their feelings of self-worth; resolving dissonance is one way to accomplish this goal.

While it is true that some tests have not dealt with the "misunderstanding" factor or the "inability to integrate the new information," this group used examples where the information given was very unlikely to be misunderstood by the average person, and ones that would be more likely to conflict not just with their worldview...but with their sense of self-integrity:

For instance, the persistence of the belief that President Obama was not born in the U.S. cannot at this point plausibly be attributed to a lack of information. It seems more likely that conceding the validity of Obama’s account of his birth would require accepting the president’s legitimacy, which would be threatening to so-called “birthers.” We therefore hypothesize that it is threatening for people to concede the validity of potentially uncomfortable facts about controversial political issues, which hinders them from expressing belief in those facts even if they are at least tacitly aware of the validity of the claims in question.

[Footnotes from the paper: 1. The concept of self-worth or self-integrity, which is central to self-affirmation theory, is theoretically and empirically distinct from self-esteem (see, e.g., Cohen et al. 2004). We focus instead on Steele’s theory of the motive to protect feelings of one’s self-worth (1988). 2. People’s feelings of self-worth may be increased both by the affirmation itself and by the change it induces in the importance of the threatened aspect of the self (Critcher and Dunning 2015). 3. Our psychological focus on the effects of self-affirmation complements Bullock et al. (2015) and Prior et al. (2015), who find that polarization in factual beliefs decreases when respondents are instead provided financial incentives for correct answers. (However, only one of four studies reported in the two articles finds greater belief accuracy as a result of incentives.)]

They carried out other tests, including ones about troops in Iraq and climate change. Their methodology is sound, and the example survey they provide at the end of the study seems quite fine as well. Their argument that minimizing conflicting information was not necessarily done to protect a worldview, but instead one's self-integrity, is fascinating. Given the opportunity for self-affirmation, a respondent may be less likely to resist contradictory information and possibly more receptive to new contradictory information.

Our interpretation of these results is that Affirmation can make it easier to cope with dissonant information that one has already encountered about controversial misperceptions, relaxing people’s need to reject facts that could otherwise be threatening. We also find some evidence that Affirmation increases receptivity to new information (Graph) that might otherwise be resisted.

Interestingly, there was a difference in response between the first two examples (Iraq and Obama), where the question is a rather easy one with easily obtained, explained and verified information, and climate change (which can still be explained and understood by most people, though it is a more complicated problem). Also, self-identified Republicans tended to resist dissonant information when the question wasn't as simple as, say, where Obama was born (not because of any difficulty in understanding the information presented but because of the complexity of the problem). For this test they compared the likelihood of resisting dissonant information with how much the subject identified with the GOP.
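
The comparison described here (likelihood of resisting dissonant information versus strength of party identification) can be sketched with a toy simulation. Everything below is an illustrative assumption — the 0–6 identification scale, the baseline resistance rate, and the effect size are made up for the sketch, not taken from the study's actual data or model:

```python
# Toy sketch of the study-design comparison described above: does stronger
# party identification predict a higher rate of resisting a correction?
# All numbers here are assumed for illustration, not from Nyhan & Reifler.
import random

random.seed(42)

def simulate_respondent(party_id):
    """party_id: 0-6 scale of identification strength (an assumed coding).
    Returns True if this simulated respondent resists the correction."""
    # Assumed model: 20% baseline resistance, +10 points per scale step,
    # capped so the probability stays below 1.
    p_resist = min(0.2 + 0.1 * party_id, 0.95)
    return random.random() < p_resist

# Simulate 1,000 respondents at each identification level and tabulate
# the observed resistance rate per level.
rates = {}
for party_id in range(7):
    resisted = sum(simulate_respondent(party_id) for _ in range(1000))
    rates[party_id] = resisted / 1000

for party_id, rate in rates.items():
    print(f"identification {party_id}: {rate:.1%} resisted the correction")
```

With the assumed effect, the tabulated rates rise with identification strength — the same qualitative pattern the posters describe the study testing, here produced by construction rather than by real survey data.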


© 2022   Created by Rebel.