Episode 20: The Cognitive Dissonance Episode

In 1957, Leon Festinger published A Theory of Cognitive Dissonance. Along with a collection of compelling experiments, Festinger changed the landscape of social psychology. The theory, now referenced constantly both in and outside of academic circles, has taken on a life of its own. And it’s still informing new research and analysis more than 60 years later.

For the grand 20th episode of Opinion Science, I want to give you an insider’s look at the theory–its inspiration, the people involved, the classic studies, and the remaining controversies.

Throughout the show you’ll hear from people who have studied cognitive dissonance and who knew the infamous Leon Festinger: Elliot Aronson, Joel Cooper, Jeff Stone, April McGrath, and Mike Gazzaniga.

To learn more about cognitive dissonance, check out these two books written by two of our guests: Cognitive Dissonance: 50 Years of a Classic Theory and Mistakes Were Made (but Not by Me).


Download a PDF version of this episode’s transcript.

Open: Bihar Earthquake

It’s 1934. Mid-January. People in India and Nepal were going about their day when, out of nowhere, an 8.1 magnitude earthquake rumbled six miles south of Mt. Everest. It was one of the worst earthquakes in India’s history. 12,000 people died, and cities were destroyed. More than 80,000 houses were damaged. Most of the destruction was in northern India and Nepal, but people also felt the quake from more than a thousand miles away, even though those areas were largely spared any damage.

In the aftermath of the earthquake, a psychologist at Patna College—hundreds of miles from the epicenter—started recording the rumors that began spreading. The interesting thing was that during this time of panic and anxiety, people were spreading rumors that seemed only to compound their fear. One widely believed rumor was that there would be another severe earthquake on the next lunar eclipse day. Other rumors were that the water in the Ganges River had disappeared and that dangerous tornadoes and other calamities were on their way.

What sense did it make that people would be spreading anxiety-provoking rumors at a time when there was already so much fear?

Years later, this same question vexed the American psychologist Leon Festinger when he discovered a report of these old rumors. And his explanation grew into the theory of cognitive dissonance.


You’re listening to Opinion Science, the show about opinions, where they come from and how they change. I’m Andy Luttrell. And this week we’re doing something a little different. Most of the time on this show I present a long interview with a social scientist or professional communicator. But this week I want to go all in to tell the story of cognitive dissonance theory. 

In 1976—almost 20 years after the theory of cognitive dissonance was first published—psychology historian Ned Jones wrote: “the dissonance research ‘movement’ has been the most important development in social psychology to date.”  

Fast forward to today, and psychologists are still studying and debating the finer points of dissonance. Thousands of experiments have been conducted, and new books updating the scientific record are still being published.

And it’s one of the concepts from the dungeons of academic psychology that’s gone mainstream. People constantly refer to it in the media and everyday conversations. But of course, when a scientific idea gets loose, it’s bound to take on a life of its own. So, today we’re gonna go deep and get some clarity on this influential and appealing theory. We’ll look at what dissonance is, what it does, and why psychologists have been fighting about it for the last 60 years.

Let’s start at the beginning.

Leon Festinger and the Origins of Dissonance

We wouldn’t have the theory of cognitive dissonance without Leon Festinger. Festinger grew up in Brooklyn in the 20s and graduated from the City College of New York in 1939. Even though he’s known for dissonance theory, he did just about everything else, too. After college, he went to the University of Iowa to get his PhD, where he did experiments on taste preferences in rats, developed new statistical tests, and even wrote a mathematical decision theory. When he finished his PhD, World War II was ramping up, and as part of the war effort, he became senior statistician for the Committee on the Selection and Training of Aircraft Pilots at the University of Rochester. After the war, he spent time at MIT, then the University of Michigan, then the University of Minnesota, all the while studying how people’s beliefs and opinions adapt to their social surroundings. But our story really kicks into gear in 1955, when Festinger moves to Stanford University.

ELLIOT ARONSON: He was a really very bright guy, 39 years old, full professor, one of the highest-paid faculty members on campus at age 39.

That’s Elliot Aronson. He was one of Festinger’s first graduate students at Stanford.

ELLIOT ARONSON: He and I arrived at the same time. He was just developing the theory of cognitive dissonance, at that time. And Festinger also had a reputation for being an extremely harsh, critical, impatient, angry young man, which he was. And he also had the capacity to be warm and encouraging, but in order to get to the warmth, you had to go through an awful lot. There was a very high barrier there because he, he did not suffer fools gladly.

Professor Mike Gazzaniga knew Festinger from a different perspective. As a new professor at the University of California, Santa Barbara, Gazzaniga met Festinger on a visit to Stanford. The two became friends, and when both of them later ended up in New York City, they made a habit of getting lunch together every week.

MICHAEL GAZZANIGA: I skipped around. I was first at NYU. Then I went to Stony Brook. Then I went to Cornell. Then I was up to Dartmouth and all that time, the peg in the ground was lunch in New York with Leon.

And they’d talk about everything.

MICHAEL GAZZANIGA:  I mean, occasionally there would be an interlude where he’d talk about a recipe for potato pancakes, but normally it was about stuff, you know, intellectual stuff, and social stuff going on around us and new scientific findings. He had just such an analytical, challenging mind, you know? I mean, it’s just, you surely know people like that. I mean, he was a polymath. I mean, he knew stuff. I mean, from ancient history to the latest statistical variation to mathematics to political process science and of course, obviously social science. He just knew it all and you couldn’t find him thin in any, any area. And if he was thin, he wouldn’t be thin long. If it was of interest to him, he would, you know, Oh, I better go look that up.

But Festinger didn’t pull punches.

MICHAEL GAZZANIGA: I had written a new book and I, and I gave him a call. I was up at Dartmouth at the time, I think. And I said, so Leon, what’s new? You know, one of these conversational starters. He was on the phone in New York. And I remember his quip. He says, Well, I’m reading your new book and apparently not much. So that’s how you got it going with him and, uh, it was just sort of dry humor, you know, irony humor kind of stuff. And he just loved that kind of stuff and he’d call out the logic, whatever logical sequence you were in, he would call out the absurdity if he took a few steps further. He just was a very, very close and dear friend who was everything that people think he was in terms of his creative power, his smartness and all the rest of it. But, uh, I was lucky enough to fall into the category of being a friend too. So that was great.

So how did this feared and revered genius come up with the idea that reshaped social psychology?

Festinger had received a grant from the Ford Foundation, which was started by Henry Ford’s son—yes, that Henry Ford. The guy who invented the Model T car…and I guess my first car, which was a Ford Taurus? Anyhow, the Ford Foundation had reached out to Festinger to see if he was interested in summarizing all the research that had been done on the psychology of communication and social influence. The first topic his team tackled was the spreading of rumors, which led Festinger to the rumors following that giant earthquake.

To remind you where we left off with that, Festinger writes that the thing about these rumors that was so bizarre was that after this earthquake, “the vast majority of the rumors that were widely circulated predicted even worse disasters to come in the very near future.” Why add insult to injury by combining widespread fear with more widespread fear?

The key, Festinger figured, was that these were rumors being spread “among people who felt the shock of the earthquake but who didn’t see any damage or destruction.” So these are people who were still feeling upset and afraid and anxious following their experience and news of the earthquake, but when they looked around, nothing was out of place—no destruction, no clear threat to their safety. “Something’s not right,” they might be thinking. “Why am I still feeling so anxious when there’s nothing scary here?” These pieces just don’t fit.

“But…if I can believe that something bad is about to happen, then the fear I’m feeling makes sense.” Festinger said that these rumors weren’t fear-provoking rumors after all. Instead, we might call them “fear-justifying” rumors. Beliefs that help explain why we’re still afraid.

This was the seed of cognitive dissonance.

Defining Dissonance

Okay, so what is cognitive dissonance, exactly? Well, swimming around in your head are all sorts of thoughts, opinions, and beliefs…Festinger called those cognitions. Just a fancy way of saying “little bits of knowledge.”

And cognitive dissonance occurs whenever those little bits of knowledge are inconsistent with each other. When they don’t fit together. Specifically, it’s when one thought doesn’t logically follow from the other. Like in the case of the earthquake rumors, people’s feelings of fear didn’t logically follow from the fact that their community wasn’t any different than it was before.

Or how about another example? If you care about the environment but you fly internationally five times a year, that’s likely to produce dissonance. We can map it out by defining the two thoughts in question.

Thought #1: “It’s essential that we slow the pace of climate change.”

Thought #2: “I take one of the most environmentally-unfriendly modes of transportation constantly.”

Yikes. The second thought doesn’t logically follow from the first. If you really care about slowing down climate change, then you should not be acting in ways that contribute to it. That’s inconsistent. That’s dissonant.

And the whole idea at the center of the theory of cognitive dissonance is that when we face inconsistent bits of knowledge, it’s unpleasant, even physically uncomfortable. And we feel driven to do something about it.

Reducing Dissonance

JOEL COOPER: People just don’t let it pass. We can’t let it pass any more than we can let you know, let it go when we’re thirsty and we need to get, we need a drink of water. We’ve got to find the water. 

That’s Joel Cooper. He’s a professor of psychology at Princeton University and author of the book: Cognitive Dissonance: 50 Years of a Classic Theory.

JOEL COOPER: Festinger’s saying when you are in a situation of inconsistency, you’ve got to find the proverbial water. You’ve got to do the resolution. There are really so many things people can do to resolve the inconsistency. The point is we have to find something to do.

So what do we do? How do we feel okay again when we’re hit with a bad case of dissonance? I talked to April McGrath, psychology professor at Mount Royal University. A few years ago she published an article on all the ways in which people deal with dissonance.

She notes that Festinger proposed three ways people address dissonance when they experience it.

APRIL MCGRATH: So, change a cognition or behavior. Add a cognition. Or alter the importance of the cognition.

So let’s explore these three modes of dissonance reduction. And as we encounter them, let’s keep our example in mind: someone who’s struck with the realization that they’re an environmentalist who often travels on airplanes.

Option #1. Change a cognition or a behavior. If two thoughts don’t fit together, you can update one of them to bring everything into balance. Like if an app stops working on your phone, you can update the software to make it compatible again. Same with your thoughts—if one thought isn’t compatible with the other, you might need to update one of them. In this case you could back away from your environmentalist values, I don’t know—become a climate change denier. Then your international trips are fine.

But maybe you’ve invested too much in that belief—you’re the chairperson of the Earth Day committee, you’ve written books on the horrors of climate change. It’s going to be near impossible to actually change that part of the equation.

So if you can’t change the belief…

APRIL MCGRATH: I mean going forward, I guess you could try to change your behavior and you could try to fly less and do more teleconferencing or something like that.

To stay true to your environmental values, you could update your actions so that your thoughts and behaviors are in harmony.

At least in this example, both changing beliefs and changing behaviors sound too hard. Fortunately, we’ve got options!

Option #2: Add a cognition. This just means introducing a new thought to the equation that helps you justify the inconsistency.

APRIL MCGRATH: So one reduction mode that I think would fall under adding a cognition is denying responsibility.

If I didn’t cause the inconsistency, it’s not my job to do anything about it. So when it comes to the flying environmentalist…

APRIL MCGRATH: If you thought of it, as you know, it was a necessary part of perhaps your job or what you have to do, then you really don’t have any choice and you have to take these flights. So you’re totally justified in partaking in those behaviors.

But let’s say you’re definitely responsible. You could try to deal with the dissonance by bringing in a different kind of rationalization. Maybe you try to claim that air travel actually doesn’t harm the environment at all. Then your behavior is fine.

But that might be a little far-fetched.

APRIL MCGRATH: The different ways to reduce dissonance are constrained by reality.

At this point it would be a little hard to deny flying’s environmental footprint. It’s too well-established, and most people aren’t completely delusional—they still need to reduce dissonance within the bounds of reason.

The third option that Festinger outlined was trivializing the inconsistency—minimizing its importance. The amount of dissonance we experience is based on how important the inconsistent thoughts are. When two cherished beliefs clash, it makes for lots of dissonance. But if the pieces are trivial, then we don’t really care that they conflict. This means that when we’re faced with dissonance, rather than changing our beliefs and finding a way to justify the inconsistency, we could just say: “Eh, this one’s not a big deal.”

APRIL MCGRATH: Yeah, you know what, flying isn’t great for the environment, but there’s also lots of other behaviors that negatively affect the environment. And maybe, you know, you haven’t engaged in those recently. So by comparison, doing some flights, isn’t, isn’t quite so bad.

So any of these three strategies—changing one of the inconsistent pieces, adding a new thought that justifies the inconsistency, or just rendering the inconsistency trivial—they all serve to combat the bad feeling we get when we experience cognitive dissonance.

APRIL MCGRATH: And then, well, another option, would be distraction or forgetting. You don’t actually have to do something to directly deal with the discrepancy. And lots of us experience distractions throughout our days. And so we can easily distract ourselves with our phones or other devices. And we can also just, you know, forget about it and move on to something else that’s going to take over our attention.

Faced with all of these ways to reduce dissonance, which ones will people choose? Unfortunately, that’s one of the less developed aspects of dissonance theory, but there’s been some work on it. Although all of these strategies could address the unpleasant feeling that comes with cognitive dissonance, they differ both in how useful they are in the moment and in how difficult they are. And if you’ve ever resigned yourself to keep watching something on TV because you couldn’t reach the remote…you know people are good at avoiding things that take effort. With dissonance, it can be hard to change our behavior or let go of something we’ve believed for a long time. If the belief is pretty minor, we can rethink it, but otherwise it’s easier to simply justify the inconsistency.

Granted, this seems to paint a pretty pathetic picture of human beings… 

APRIL MCGRATH: I think sometimes, you know, when people learn about cognitive dissonance theory, it’s brought up as this example of how irrational people are. And it’s like, Oh, look at how silly we are. And we do all these things to justify our behavior. But, you know, lots of dissonance researchers have made the argument that this is actually an adaptive process. There’s actually a good function served by this, which is that well, you know, we have to make sense of our world. And we also want to be able to act effectively. That discomfort that you feel when you recognize a discrepancy indicates, Oh, something is off here. I need to address something. And you have to figure out what to do in order to go forward. And alleviating dissonance, I think this is something that we do on a daily basis, and it’s important that we’re able to do that because we don’t want to get stuck in these dilemmas. We want to be able to proceed with our day, go on to other things and act appropriately. So as much as the theory is sometimes talked about in a way to say like, Oh, look how ridiculous we are. I think there’s actually a really adaptive process here. And it’s important to be able to alleviate dissonance.


Hey everyone. Just butting in to say thanks for listening. I also want to ask—if you like and support the show, please take a minute to leave a review on iTunes or Apple Podcasts. Share with your friends, social media networks, and that one uncle who seems to think he understands psychology but doesn’t. You can keep up with the show by subscribing and following @OpinionSciPod on Facebook or Twitter. Thanks so much for your support. Back to the show!

Experiment #1: Saying Something You Don’t Believe

Okay, the first example of dissonance we’ll consider is a now-classic experiment by Leon Festinger and Merrill Carlsmith. The study came out in 1959—after Festinger’s book on dissonance and after some of the original studies, but this, this is the study that put dissonance on the map. As Joel Cooper puts it—

JOEL COOPER: my view is that Festinger and Carlsmith–that study changed the landscape

Or as Elliot Aronson calls it: 

ELLIOT ARONSON: the single most important experiment ever done in social psychology

So we know about Festinger—but who’s Merrill Carlsmith? He was actually a college senior who took a class with Festinger and came up with the idea for the study as his final paper. Can you imagine being a college senior and coming up with a study that rocks an entire academic discipline? I have a PhD and I’m still working on that.

ELLIOT ARONSON: I actually worked with Merrill Carlsmith to bring him up to speed as an experimenter, because he was a pretty stiff guy when he was an undergraduate. He got a lot better. He became my graduate student at Harvard. So, you know, we stayed together, but as an undergraduate, I knew him and liked him at Stanford when I was a first year graduate student. And so I worked with him, I trained him to be an effective experimenter, and we knew it at the time. I wasn’t sure that experiment would work, but I understood the theory. And when those results were coming out, it was very exciting.

So what is this study, and why did it change the game?

Let’s go back to Stanford University in the late 50s. 

Students in the intro to psychology course had the chance to sign up for all sorts of psychology studies. One of them was listed as a 2-hour study on “Measures of Performance.” Why anyone would choose that is beyond me, but they got a bunch of students to sign up.

When a student would arrive, he’d sit down at a table and be given a bunch of spools. His task? Take 12 spools and put them on a tray, one at a time, using just one hand. Then empty the tray. And do it all over again…for half an hour! Thirty minutes of putting spools on a tray, emptying the tray, putting spools back on the tray, emptying the tray, putting spools back on the tray, emptying the tray…

Oh, and the fun had only begun! The experimenter took the spools away and then gave the student a board with 48 square pegs—and these pegs were like knobs you could turn. So for the next half hour, the student would turn each of these 48 pegs a quarter turn, then start back at the beginning and rotate them all a quarter turn again, then back to the beginning…another set of quarter turns…

And what was the point of all this spool organizing and peg turning? The point was just to give everyone an unpleasant experience. After spending an hour silently doing monotonous, repetitive tasks for no reason, anyone would have to be thinking, “This has to be the most boring hour of my life.”

But the experiment had only just started cooking. Because when the student finishes, the experimenter has one more favor to ask.

EXPERIMENTER: O.K. Well, that’s all we have in the experiment itself. I’d like to explain what this has been all about so you’ll have some idea of why you were doing this. Well, the way the experiment is set up is this…

The experimenter explains that they’re studying how people’s expectations affect their performance on mundane activities. So sometimes they have an actor pretend to be another student who just took the experiment, and this actor tells the next participant that the experiment is actually really fun and interesting…leading them to expect to have a good time.

EXPERIMENTER: Now, I have a sort of strange thing to ask you. The thing is this…The fellow who normally does this for us couldn’t do it today—he just phoned in, and something or other came up for him—so we’ve been looking around for someone that we could hire to do it for us. We’ve got someone waiting who is supposed to be in that other condition, and we were wondering if you could be our actor today. And if you’re willing to do this for us, we’d like to hire you to do it now and then be on call in the future, if something like this should ever happen again. Do you think you could do that for us?

So just to be clear, the idea is that the person who just spent an hour doing a mind-numbing, painfully boring study is about to go out and tell the next participant that it was actually really fun. And that’s going to make for some dissonance!

In other words, thought #1 is that they just did the world’s most boring study.

Thought #2 is that they told someone it was actually fun. These are inconsistent. Telling someone the study is fun doesn’t logically follow from believing the study is actually boring.

But hang on—there’s one more detail I haven’t told you yet. When the experimenter’s asking the student to be their substitute actor, sometimes he sweetens the deal a bit, saying:

EXPERIMENTER: We can pay you a dollar for doing this for us. 

But sometimes he says: 

EXPERIMENTER: We can pay you twenty dollars for doing this for us.

Now, remember this is 1959. If we adjust for inflation, $1 then is like $9 now. 

But $20 then is more like $180 now.

So these students are either getting paid a piddly amount of money to lie to another student, to tell them that they had an amazing time in the study, or they’re making bank, pulling in a ton of money to tell the same lie. 

Okay, so the student got a dollar or $20, told the next participant that he had a ball putting spools on a tray, and as far as he’s concerned, the study’s over. But the ruse kept going. The experimenter said that the psychology department was interviewing a handful of students who participated in research studies apparently just to see how the research projects were going. So after everything seemed to be done, the student walked to another office to answer some questions about the study.

This interviewer—who apparently had no connection to the boring study—asked a few simple questions about how interesting the activities were and whether they’d be interested in signing up again.

If we go back to our dissonant thoughts—believing the study’s boring but also having told someone it was fun—how might people reduce that dissonance and feel okay?

Well, if you got a ton of money to do it, you could rationalize the inconsistency, thinking, “Sure, I lied, but I got paid a lot to do it.” And so when they got to the interviewer’s office, the students who had been paid $20 were happy to say: “Oh yeah, that study? Garbage. Not at all interesting or fun.”

But what if you only got a measly dollar? That’s not enough to justify the fact that you lied. You’re still stuck with the dissonance of saying one thing but believing another. How can you bring some consistency back into your brain? Well, you can’t take back what you said. But if you could convince yourself that the study was actually pretty interesting, then there’s no more dissonance! What you said to that poor student is actually what you believed—you would have said it even if you got nothing! And sure enough, when they got to the interviewer’s office, the students who only got paid a dollar said the study was interesting and they’d happily participate in another one like it.

Okay, so this study is a powerful test of cognitive dissonance theory—it created dissonance for the participants by making them believe one thing but say another and showed that when dissonance can’t be reduced any other way, people will change their own beliefs to make the dissonance go away.

But why is it…

ELLIOT ARONSON: the single most important experiment ever done in social psychology

Well, at the time, these findings challenged psychology’s primary way of understanding things. Back then, psychologists were focused on how people’s behavior is reinforced by rewards. The bigger the reward, the better we learn.

Probably the most famous psychologist who talked about reward and reinforcement was B. F. Skinner. He designed these simple tools to give pigeons some food as a reward for performing some action. And after rewarding the right behavior over and over again, he could train pigeons to do all sorts of things. I’ve tried the same with my cat, but I’m not having as much luck.

But the basic idea championed by Skinner and his pigeons was that we learn to do things we’re rewarded for. We call it reinforcement theory.

JOEL COOPER: Everybody believes that rewards are critical to motivating behavior. Now Festinger and Carlsmith come and say, no rewards actually can reduce motivation. … And then you, you find major players in the field going after the theory, trying to show you why it was wrong.

The problem was that it was the people who got the smallest rewards—just $1—who ended up liking the whole study the most. This was not okay with reinforcement theory. But in hindsight, it’s clear that Festinger was onto something, and he was bold enough to confront psychology’s assumptions with data to prove it.

ELLIOT ARONSON: And I actually talked to Fred Skinner about those results a few years later when I was teaching at Harvard, and he couldn’t really explain it, he really couldn’t explain it. He tried. We would have lunch together, and he thought, well, I’ll come up with an answer in terms of reinforcement theory, and I’ll give you a call. And he never did. But it was an amazing experiment.

Experiment #2: A Worthless Initiation

Okay—so the $1/$20 study showed us that we feel dissonance when we say something we don’t believe, and we can convince ourselves that we actually believe what we said as a way to bring balance back to our thoughts.

Let’s look at another kind of dissonance.

Have you ever put a lot of effort into something that ended up not being worth it? Or you spent a bunch of money on something that turned out to be basically garbage? Like imagine a big touring Broadway show is coming to your town. You shell out $100 for a ticket, go to a fancy dinner beforehand, arrange for a babysitter, the curtain rises…and the show is just not very good. The story’s boring, the music isn’t quite to your taste, the acting is…fine. That would make for some dissonance!

Think about it– Thought #1: I just spent a lot of my hard-earned money and devoted my whole evening to seeing this show.

Thought #2: This show sucks.

Well, you can’t get your money or your evening back, so you might try to resolve your dissonance by thinking…hey, you know, the music was actually very well performed, and the story did have some interesting turns that I didn’t see coming. You know what? The show wasn’t that bad. I’m glad we went.

Ta-da! Dissonance averted.

Another of the classic studies in cognitive dissonance tested this sort of scenario to see if people would actually convince themselves that a bad experience was worth the effort. It was a study by our friend Elliot Aronson and his colleague, Judson Mills.

ELLIOT ARONSON: I had been reading for another course some stuff by John Whiting, the anthropologist, and he was talking about how different indigenous groups usually have initiations into adulthood, and some of the initiations were extremely severe. And it dawned on me, because I was also taking Festinger’s course in which he was talking about dissonance theory. And I was thinking, well I wonder if the people end up being more patriotic and really liking being a member of their tribe, much more than people who go through a mild initiation.

And so he constructed an initiation experience for student research participants. They first asked college women to volunteer to participate in a discussion group about the psychology of sex, figuring this would be a topic of interest for college students.

When a student would arrive to participate, the experimenter would say:

EXPERIMENTER: You’re going to join a group that’s been meeting for a few weeks already. They’ve already started this week’s discussion, but we need to make sure you’re ready to join them. Although most people are interested in sex, they tend to be a little shy when it comes to discussing it, which is a problem for our study. So it’s extremely important to arrange things so that the members of the discussion group can talk as freely and frankly as possible. We recently decided to screen new people before they join the group using what we call an “embarrassment test.” We’ll just ask you to read some sexually oriented material out loud so we can see if you’re comfortable discussing the topic.

At this point, the experimenter gives the young women a stack of index cards with words printed on them for her to read out loud.

For some of the women in the study, the words were pretty mild: prostitute, virgin, petting.

But for other women, the words were more explicit: [bleeped]. And they were asked to read a passage from a romance novel where the characters get it on.

The more severe screening test might be embarrassing even today, but this was the 1950s, when it would have been even more trying for young women to say these things out loud to a male experimenter.

So everyone passes the test; they’ve been thoroughly initiated into this discussion group. Since that day’s discussion was already in progress, the women who had just passed the initiation would only be allowed to listen in on the group’s discussion over an intercom. And what lewd and stimulating topics was the group discussing?

DISCUSSANT: So, the book was saying that for mating rituals, birds have, like, feathers—I forget what color, but like…blue? Or something? But that was supposed to be part of the mating stuff for birds—or one kind of bird.

It was a boring, bumbling, goes-nowhere discussion of the secondary sex behaviors in lower animals. Not what these women had signed up for! For the women who did the mild initiation, this was no big deal. But think of the dissonance for the women who went through a more severe initiation!

Thought #1: I just underwent a mild trauma, saying [bleeped] in front of this stranger to prove I’m worthy of the group.

Thought #2: This group sucks.

So when participants filled in a survey at the end of the session, women who did the mild screening test were happy to say, “Yeah, that was a boring discussion and those other women don’t seem that interesting either.” 

But the women who did the more severely embarrassing initiation? They had more dissonance to resolve. Lo and behold, they were more likely to say, “Well, the discussion touched on some interesting topics and those other women made some good points.” In other words—it was worth the effort it took to qualify for this group.

And these women weren’t just putting on an act. They seemed to really believe it.

ELLIOT ARONSON: a lot of the debriefing we did after these experiments really showed us how deeply the subjects were involved in the process of dissonance reduction, to the point where they would argue with us when we gave them the explanation of the experiment after it was over. You know, in my experiment, for example, when I explained our theory to them afterwards, most of the people in the severe initiation condition would say: Well, that’s a very interesting theory.

So you’re saying that people who go through a lot of effort to get into the group, they ask themselves, why did I do all this work to get into such a boring group? Well, actually, you know, it wasn’t so boring. There was a lot of stuff in there that was interesting. You know, some things that the guys would say to me, “Well that’s an interesting theory and I’m sure it’s right for some people, but that’s not what happened to me. I really liked the group. I liked it from the outset!”

Experiment #3. Caught in a State of Hypocrisy

We’ve seen how cognitive dissonance can spring from saying something you don’t believe or putting lots of effort into something that isn’t that great. Let’s look at one last example of dissonance and the way that people deal with it.

When people tell us to do something that they themselves don’t do, we call them hypocrites. If you’ve ever heard someone tell you: “Do as I say, not as I do,” you’ve heard them basically admit to being a hypocrite.

But if we notice ourselves being hypocrites, that creates dissonance. Imagine I tell you how important it is to eat healthy foods, but then I realize that my last five meals were basically French fries and apple pie. Those two things—what I told you to do and what I do—are inconsistent. So how do I fix my dissonance?

Jeff Stone is a psychology professor at the University of Arizona, and he studies how hypocrisy can actually make people take better care of themselves.

JEFF STONE: We defined hypocrisy as a situation where people are advocating a course of action, like a health behavior. We all know we should engage in, uh, a good example for where I live in Arizona is, people need to practice some sun-protective behaviors.

We have a lot of sun here. Skin cancer is a concern. And so it’s well known and well accepted that people should use sunscreen, spend less time in the sun if they can, if they’re going to be out there, wear sun-protective clothing. These are all things that people know they should do, but interestingly, they don’t do them.

And so we can create hypocrisy in a situation like this, where we get people to advocate, “Hey, we should all use sunscreen. We should all practice sun protective behaviors.” That in itself doesn’t cause a lot of dissonance, because there’s nothing really inconsistent about it. But if you then remind them of the fact that they themselves don’t do what they just told everybody else to do, now you’ve created an inconsistency for them. And what we find in our research is that when we put people in those kinds of situations, where we get them to advocate, uh, a health behavior, that’s well accepted and then we make them mindful for the fact that they don’t do what they just told everybody else to do, that induces that inconsistency, that leads to dissonance. We find in our research that that can motivate people to say, Oh man, you know, I believe in this thing, I don’t do it. I should really make a change. And they’re much more likely at that point to adopt the health behavior. In this case, they’re more likely to acquire a sample of sunscreen, as a way of, of resolving the inconsistency between what they’ve said and what they know they do.

So, faced with an inconsistency between what you do and what you tell others to do, you can bring some balance back by changing your own behavior. Practice what you preach. After all, you can’t easily take back what you said, but you can adjust your behavior going forward.

This has some intriguing possibilities for getting people to act the way they know they should be acting. One study in Australia tested whether this kind of hypocrisy-related dissonance could be used to encourage people to conserve electricity. In the hot part of the year, researchers got a list of addresses from air conditioning companies, and they mailed surveys to those addresses. The survey included a question asking people how much they believed it was their personal duty as a responsible citizen to save as much electricity as possible. Pretty much everybody agreed that saving electricity was important, and so a few weeks later, they sent out a follow-up letter to just some of the people who completed the original questionnaire. This letter basically said: “In that questionnaire from a few weeks ago, you told us that you strongly believed in the importance of saving electricity, but our records show that your house actually uses a lot of electricity. So, here’s a pamphlet with some tips for conserving energy. Take care!”

That letter was written to highlight those people’s hypocrisy and inject some dissonance into their lives. When struck with the inconsistency between the values they openly endorsed and their own personal behavior, how could they reconcile that dissonance? 

The researchers were able to get readings from the electricity meters at everyone’s house to see how much energy they were actually using. And over the course of four weeks, the people who got a letter pointing out their hypocrisy actually used less electricity than people who never got a letter like that. The idea is that those households compensated for their hypocrisy by actually living up to the values they proclaimed in that initial survey.

JEFF STONE: When people ask me, you know, how to use this or how it works, I always say that it works really great for things that people know they should do, but they don’t. The fact that it’s something they know they should do, you’ve got the cognitions heading in the direction of, this is the behavior change that we want to accomplish. And then it’s really just, how do you make them aware that they don’t do what they’re telling everybody else to do? And that’s, that’s what kind of sets the stage for the dissonance to occur and motivate them to want to change their behavior. So any, any topic that sort of fits that formula? I think it has a chance of working with.

New Perspectives

It’s pretty clear by now that cognitive dissonance occurs when people grapple with two inconsistent thoughts, and they try to deal with their dissonance using a variety of tools, which can make them update their beliefs, change their behavior, etc. Seems pretty straightforward. Well, this is social science…nothing is straightforward.

Even in the earliest days of dissonance theory, a small war broke out between the dissonance camp and people studying reinforcement theory. Here’s Joel Cooper again:

JOEL COOPER: So you get, Janis and Gilmore. You get Rosenberg, you get, you know, people well established in social psychology, trying to indicate where Festinger and Carlsmith were wrong. And then it gets propelled, right? When people take you seriously and try to dispel your theory, then you’re– then, and especially if subsequent data support your theory, you know, you’ve really changed the game and that’s really what happened.

But while Festinger may have won the war with reinforcement theory, the storm had only started brewing. Psychologists started to pick at dissonance theory, asking whether it was as simple as Festinger said it was, whether it was really just the existence of inconsistent thoughts or whether the story might be a little different.

The New Look

One of the major attempts to tweak dissonance theory was spearheaded by Joel Cooper and his colleagues. And it started in response to a new study that challenged dissonance theory. A psychologist named Milton Rosenberg had been critical of dissonance, and he ran an experiment that apparently should have confirmed dissonance theory…but it didn’t. It was a lot like the study we heard about earlier where students had to lie about how much they liked a study. Except in this case, he gave Ohio State students either 50 cents or $5 to write an essay arguing that the OSU football team shouldn’t be allowed to play in the Rose Bowl. So just like the other study, people were paid a lot or a little to say something they didn’t believe. But when Rosenberg did the study, he got the opposite results—the students who got paid more were the ones who went on to actually agree that OSU shouldn’t participate. It was as if reinforcement theory was right!

But Cooper thought there was something fishy about Rosenberg’s version of the study.

JOEL COOPER: I had the feeling that actually Rosenberg, he did something different and that difference might be important. And I think that what Rosenberg did was take away people’s choice, and maybe choice matters, maybe choice matters more than Festinger ever thought that it did.

The idea is that Rosenberg got people to agree to do something in exchange for 50 cents or $5, but the students didn’t actually realize that they were agreeing to betray their own beliefs until it was too late. So they never actually had a choice about writing an essay they didn’t agree with. They got stuck doing it.

So Cooper and his team ran a new experiment. Sometimes they’d make sure that people understood that they were being asked to write an essay they didn’t agree with and assured them they had the choice to opt out. Other times they did like Rosenberg and forced people to write an essay they didn’t agree with.

Give people a choice, and they feel dissonance. But—

JOEL COOPER: If you take away that choice, then dissonance simply doesn’t apply. And reinforcement works and people change their attitudes as a direct function of the magnitude of reinforcement. So choice is really important.

And it’s not just choice. Cooper started to think that Festinger ignored some other things, too.

JOEL COOPER: What would happen if you did Festinger and Carlsmith’s study, but nobody believes you? You know, what, if you were saying something you didn’t believe but you were saying it in a closet where nobody overheard you. Does that create dissonance? Well, in some ways– and you have choice and you could put in reward or no reward, but does anybody care if nobody hears?

So they would do studies where they asked people to write something they didn’t believe…

JOEL COOPER: … and they, get lost, you throw them away and nobody’s going to read them… and they don’t create dissonance. And so, you know, that’s, that was another piece of the puzzle.

So he started amassing all these findings where people who acted inconsistently with their beliefs didn’t really experience dissonance unless certain conditions were met. Unless they had a choice in their behavior. Unless they were committed to the behavior. Unless the behavior had unwanted consequences. Unless they could foresee those consequences. This all informed Cooper’s new outlook on dissonance, which he appropriately called: The New Look Model of Dissonance.

JOEL COOPER: What we said in the New Look is that dissonance seems to be created when people feel responsible for creating an aversive state of affairs. And by aversive, we mean any state that people would rather not have brought about.

So something you bring about that’s aversive might be convincing somebody to believe in a position that you don’t believe in, like who the next president should be. Then you say something, let us say, if I can be a little political, pro-Donald Trump and somebody hears that and you say, Oh my God. They might believe that. Or they do believe that … like, God, I can’t, that’s an aversive state. Or I suffered to get into this group. And the group isn’t even any good. That’s an aversive state.

Are you responsible for that? Did you cause it? If you take responsibility, now we have cognitive dissonance. Anything that absolves you from responsibility eliminates cognitive dissonance. Anything that turns the state into a positive state eliminates the cognitive dissonance.

And we thought with that simple statement– not quite as simple as inconsistency leads to dissonance, but the notion of being responsible for having brought about an aversive state that leads to dissonance. And that encompasses just about all the research we could think of in cognitive dissonance. There are no longer exceptions to the rule. That was the rule.

The Importance of the Self

Another point of contention in the early days was: do all varieties of thoughts that are inconsistent lead to dissonance? Elliot Aronson recalls his time as a more senior student in Leon Festinger’s lab…

ELLIOT ARONSON: I was his majordomo and training the younger students to be able to do the research and stuff like that. And when someone would come to me with a hypothesis and say, well, Elliot, do you think that’s dissonant? And I would say, you know, I’m really not sure. If you really want to know what’s dissonant with what, ask Leon. And Leon got so, he got so angry at me for doing that because I did it as a joke, you know, but I, but in reality, Leon and I would argue about these things all the time.

And eventually Aronson came to the idea that dissonance is really about how we think of ourselves, who we are as a person. What really matters isn’t that you said something you don’t believe or you tried hard to get into a group that’s actually boring. Instead, the dissonance is really a conflict between what you’ve done and the kind of person you think you are.

ELLIOT ARONSON: So I could recast, for example, the initiation experiment as: I am a very smart person and I’ve done a very stupid thing. I went through hell and high water in order to get into a group that turned out to be boring and worthless. Okay? So those are the two cognitions. One of them is my self-concept. If my self-concept was that I was a dumb guy who always did stupid things, there would be no dissonance. You see?

We can apply the same thinking to other studies we’ve seen. With hypocrisy, the dissonance is not just a discrepancy between what you told people to do and what you yourself do, the dissonance is that you think of yourself as a person of integrity but you don’t practice what you preach! Or with the studies where people say things they don’t believe, the dissonance isn’t the inconsistency between what you said and what you believe. It’s the inconsistency between seeing yourself as an honest person and knowing that you lied to someone. 

But was this a radical revision of the original theory?

ELLIOT ARONSON: I remember seeing Leon and saying, okay, I got it all figured out. And Leon said, you’re right, we argued about whether I should be stating it that way, because he thought that limited the scope of the theory. And he was right that it did limit the scope of the theory, but sometimes you have to limit the scope in order to make it more accurate. And so my idea was not to say– it wasn’t a new theory. It was saying at the center of the theory, dissonance is most painful when it comes into contact with an aspect of our self-concept.

The Legacy of Dissonance Theory

So Cooper is saying things like choice and unwanted consequences matter. Aronson is saying this is about the self. But these are just two examples. A host of new theories have emerged to chip away at how dissonance works: the self-standards model, the radical dissonance theory, the action-based model, self-perception theory, the meaning maintenance model. Psychologists have shown that dissonance works differently in different cultures, they’ve scanned people’s brains as they experience dissonance, they’ve created computer models of dissonance, they’ve tested dissonance in children and capuchin monkeys.

We just can’t get dissonance theory out of our heads. What Ned Jones wrote in 1976 still rings true: that “the cumulative reach of dissonance research is remarkable and that the dissonance movement has been the most important development in social psychology to date.”

More than sixty years ago, Leon Festinger had a brilliant insight and called it the theory of cognitive dissonance: that human beings are driven toward a coherent mental life, to hold beliefs that match their actions, to treat others the way they think is right, to hold opinions that fit together. It was an audacious proposal at the time, but now it’s commonplace.

And after all these years, all the experiments, all the findings, what lesson does dissonance have for us?

Jeff Stone said this:

JEFF STONE: The value is that it’s one of the ways that we know people fail to learn from their mistakes. And the importance of it is that dissonance shows us that people are not rational. They’re rationalizing. That’s so important to understand, that when we make mistakes, we do something that we wish we hadn’t, or, or just, you know, dealing with failure in one way or another. We set out to achieve some goal and it just doesn’t work out. It blows up in our face. We have a real tendency to experience dissonance under those conditions, but look for some way out of it that we don’t really realize the mistake that led to the problem in the first place.  What we don’t do is, you know, go ahead and, and realize, man, I’m somebody who’s capable of making a mistake. I can do something stupid once in a while. I might do something immoral once in a while and, you know, rather than rationalize that and find a way to justify it, what really we need to be able to do so that we don’t make those mistakes again, is sort of confront that head on. But if all we rely on to reduce our dissonance is around rationalizations and our justifications, well we’re never really going to realize that it was us, um, and, and that was our mistake. And so we’ll never, we’ll never really improve. We’ll never avoid that mistake again, in the future. To me, that’s one of the things I hope when I teach this topic in my classes, as students walk away with is their appreciation for that. And it takes some ego strength as I like to say, to kind of look at yourself in the mirror and go, I made a bad mistake here, you know? I’m responsible. And so I have to take responsibility and make it better next time.


Thank you so much to Elliot Aronson, Joel Cooper, Jeff Stone, April McGrath, and Mike Gazzaniga for agreeing to be on the show. When I started Opinion Science, I had the idea to do a special episode like this someday, and I’m hoping to do more deep dives like this in the future. I’m arbitrarily calling this the end of the first season of this podcast, but before it wraps up, I’ll be releasing the full interviews with Joel Cooper and Elliot Aronson over the next couple weeks. We’ll resume regularly scheduled programming soon after that!

Keep up with new episodes of the podcast by following @OpinionSciPod on Facebook and Twitter, or hop on Apple Podcasts, Spotify, etc. to catch up on past episodes. And of course, leaving a review of the show will help this little podcast grow. Thanks for all your support so far.

For a full transcript and references for this episode, you can go to the episode page on OpinionSciencePodcast.com.

Okeedoke—I’m exhausted. I made this episode over the course of a couple months during a global pandemic, while we bought and moved into our first house, and while preparing for an uncertain new academic year. So I’m gonna go take a nap, but I’ll see you again soon for more Opinion Science. Buh bye…

