Allocation of blame

By Marc Resnick:

Even more fascinating than the top-down, rationalizing decision-making process I talked about in my previous post is something I have been studying recently: moral and ethical decisions and the allocation of blame and punishment.  There are a lot of findings I would love to share with you.  For today, let’s start with this one.

There is an order effect on moral judgments.  In other words, if you say one person is guilty, you are more likely to say the next person is guilty, even if the two cases involve totally different crimes.  This is partly a “priming” effect: our “guilty” verdict is already activated, and it biases our subsequent decisions.

It is also because we have an unconscious desire for internal consistency.  Calling the first person guilty gives us a certain opinion of ourselves (a weak version of thinking of ourselves as a “hanging judge”).  Calling the first person innocent gives us a different opinion of ourselves (“Joe the Merciful”).  In order to stay consistent, we are biased toward finding the same way on the next case, regardless of the evidence.  It is not a guarantee, but a subtle unconscious influence.

If you ask people afterwards whether the first decision affected the second in any way, they will honestly deny it, because they don’t realize it is happening.  But when asked why we make moral decisions the way we do, we construct a post-hoc rationalization that explains the decision in terms of whatever ethical or legal principles we know, even if the decision was really a “gut” call based on no real principle or law at all.

It gets worse.  The more expertise you have in making these decisions (philosophers, ethicists, lawyers), the more likely you are to exhibit this order effect.  The researchers suspect this is because experts are even better at constructing post-hoc rationalizations: they have more principles to choose from.  This is similar to the top-down rationalizing I talked about in the previous post.  The more information you have, the more ammunition you have to support your biased preferences.

Ironic, huh?