Beliefs are hard to change when they are part of our self-image or world view

Marc Resnick
(note – Updated 1/6/12)
As usual, I read journal articles in groups.  Here is a related paper (ungated) from Dartmouth that looks at why our political views resist evidence to the contrary.  Once we get an idea in our head, we don’t like to change our minds.  The paper uses political views because these tend to be strongly held.  This research, and the prior work it reviews, finds that the effect is strongest with opinions that are strongly held, important to our self-image, or important to our world view.
This has implications for every change management initiative you implement.  When employees get used to doing something in a particular way, changing that mindset is very hard, even when you have factual evidence to back the new method.  The connection between the current way “we do things around here” and the employees’ self-image is remarkably strong.
This study looks at political issues, so these examples reflect its method.  But for each one, insert a situation from your own experience where you tried to change someone’s mind and you will see how similar they are.  The issues included the Iraq surge, Obama’s jobs plan, and global warming.  If you were against the war, you were less likely to give any credit to the surge (and if you were for the war, you gave less credit to evidence of other explanations).  If you didn’t like Obama, you wouldn’t give credit to his plan for creating jobs (and if you liked Obama, you wouldn’t give credit to alternative explanations).  If you don’t want there to be global warming, you don’t give credit to evidence for it (and if you believe in it, you don’t give credit to contradicting evidence).  The authors cite previous work that found similar effects with abortion, the death penalty, and other issues.  You can see that this is independent of politics; they find it equally on both sides of every issue they looked at.
There are many reasons for this.  We are more likely to seek out evidence that supports our opinion (conservatives watch Fox, liberals watch MSNBC, “the new job assignment you gave me will never work”).  We also counterargue against information we disagree with (how many times have you yelled at a talking head on TV?), but gladly accept information that we agree with (the one time the new job assignment actually didn’t work).
This study tested two interventions to even out the score.  The first dealt with the challenge to our self-image if we turn out to be wrong.  The researchers had participants engage in a task that made them focus on their own good qualities.  In that state, participants were less biased against a totally unrelated view.  So next time you are trying to implement a change, start by giving your target a self-affirmation: “Look, you are one of the smartest people I know, so I am sure you will see how effective this new process will be.”
The second intervention was intended to prevent counterarguing.  The researchers used graphic, visual ways of presenting evidence whose basic meaning jumps out at you from the design.  But the graphic doesn’t give you any specific facts, so you can’t argue against them.  In this condition, people were more receptive to information that contradicted their opinions, even strongly held ones, and more likely to believe it.
So don’t hedge what you say with “I know you really like the old way” or “the old way has been great for years, but . . . ”  The person will just fondly focus on what was good about the old way.  Stick with the best, hardest, least questionable evidence you have for the new method, and then quit while you are ahead.