Last time I promised to write about Mistakes Were Made (but not by me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts, by Carol Tavris and Elliot Aronson (2007). I know people say this sort of thing a lot when they’re enthused about a book, but if I could make it required reading… you would all be very unhappy with me (but only until you read the book).
Though the book mostly isn’t about politics, the title comes from a phrase we’ve all heard time and again from political leaders whenever they’re backed into a corner and have to admit that things have gone badly wrong. They trot out the passive voice: mistakes were made. Made by whom? They don’t say. They’ll do anything to evade responsibility and justify their own behavior.
But it isn’t just politicians. This book’s thesis, stated on p. 2, is that “Most people, when directly confronted with proof that they are wrong, do not change their point of view or course of action but justify it even more tenaciously.”
The scope of this book is broad: politics, law enforcement, personal relationships, psychotherapy, memory. But the thread that runs throughout is the power of self-justification. Because of this blog’s emphasis on learning, let me state that learning involves getting things wrong, and the only way you can grow intellectually (or grow, more broadly, as a human being) is to admit when your ideas are wrong and exchange them for better ideas. And if you’re truly committed to a particular theory, ideology, or belief system, that transition can be a painful process.
But first, back to the book’s broad focus on the idea of self-justification. What do the authors mean by that?
We stay in an unhappy relationship or merely one that is going nowhere because, after all, we invested so much time in making it work. We stay in a deadening job way too long because we look for all the reasons to justify staying and are unable to clearly assess the benefits of leaving. We buy a lemon of a car because it looks gorgeous, spend thousands of dollars to keep the damn thing running, and then we spend even more to justify that investment. We self-righteously create a rift with a friend or relative over some real or imagined slight, yet see ourselves as the pursuers of peace–if only the other side would apologize and make amends. (p. 4)
“Self-justification,” the authors continue, “is not the same thing as lying or making excuses.” It’s more about lying to yourself. “That is why self-justification is more powerful and more dangerous than the explicit lie… it is also the reason that everyone can see a hypocrite in action except the hypocrite.”
The psychology underlying this process is explained by cognitive dissonance, a term that psychologist Leon Festinger coined in the 1950s. A great deal of research has been done on this idea over the past half-century, but the basic idea is pretty straightforward. “Cognitive dissonance is a state of tension that occurs whenever a person holds two cognitions (ideas, attitudes, beliefs, opinions) that are psychologically inconsistent, such as ‘Smoking is a dumb thing to do because it could kill me’ and ‘I smoke two packs a day.'” (p. 13)
Dissonance produces discomfort, and so we try to reduce it, usually by convincing ourselves that one of the cognitions is untrue or at least unimportant. Our remarkable ability to do this, in the face of evidence or what we might generously call ‘common sense,’ leads to all sorts of behavior that otherwise seems to make little sense.
For example, in studying group initiations, researchers have found consistently that “severe initiations increase a member’s liking for the group… if a person voluntarily goes through a difficult or painful experience in order to attain some goal or object, that goal or object becomes more attractive.” (p. 17) Once you’ve invested in something, especially if you’ve invested heavily, admitting that it was a bad investment of time, money, or, in this case, suffering, creates painful dissonance.
This appears to have a neurological basis. For example, using MRI scans of people watching political debates, neuroscientists found that “the reasoning areas of the brain virtually shut down when participants were confronted with dissonant information, and the emotion circuits of the brain lit up happily when consonance was restored.” (p. 19)
Having introduced the book’s main idea, I’ll stop here. Next time I’ll describe what I think is one of the book’s most useful concepts. And by then I hope you’ll be ready to buy or borrow the book.