Richard E. Nisbett has spent his career studying how people think. He is an emeritus professor of Psychology at the University of Michigan, and his research has influenced how psychologists think about reasoning, introspection, culture, and intelligence. He has written several important books over his career, including The Geography of Thought: How Asians and Westerners Think Differently…and Why and Mindware: Tools for Smart Thinking.
His newest book is Thinking: A Memoir.
In this episode, Nisbett shares samples of his work relating to our inability to know the inner workings of our own minds, whether we can call various cognitive biases “errors” in reasoning, and how culture shapes the way we interact with the world.
Some things that come up in this episode:
- Nisbett’s favorite study: Norman R. F. Maier’s finding that people fail to understand where their insights come from (Maier, 1931)
- The classic set of studies by Richard Nisbett and Tim Wilson on our failure to introspect on cognitive processes (Nisbett & Wilson, 1977)
- The study where a goat entered a classroom (but that was really about intrinsic versus extrinsic motivation; Lepper, Greene, & Nisbett, 1973)
- Nisbett’s work on errors in reasoning (Nisbett & Ross, 1980; Nisbett, 1992)
- Early work by Hazel Markus and Shinobu Kitayama on the effects of culture on how we think about ourselves (Markus & Kitayama, 1991; also see Markus’ book Clash!)
- The “Culture of Honor” (Nisbett & Cohen, 1996)
- Cross-cultural differences in analytic versus holistic thinking (see Nisbett’s The Geography of Thought for a summary)
Transcript
Download a PDF version of this episode’s transcript.
Andy Luttrell:
You’re listening to Opinion Science, the show about our opinions, where they come from, and how they change. I’m Andy Luttrell. Now, normally I start things off with more of a monologue or story related to the main idea underlying the episode, but this week’s interview covers so much ground about such fundamental stuff that I figure we can just get right to it, because this week I’m talking to the esteemed social psychologist, Richard Nisbett. Earlier this year, he released a memoir titled Thinking. It’s a personal history and traces the trajectory of his research on the psychology of reasoning…and how it can go wrong.
When I started this show, I never expected to be hearing from book publicists, but nowadays I’ll get a fair number of inquiries from publicists and authors looking to see if I’m interested in having them on the show. Oftentimes it’s not a great fit for whatever reason, but a couple months ago, I heard from a publicist asking if I’d be interested in interviewing Richard Nisbett, and I was like, “Oh, yes. Of course.” I was already well aware of Nisbett’s work. And I’m not alone. Malcolm Gladwell once said that “the most influential thinker, in my life, has been the psychologist Richard Nisbett. He basically gave me my view of the world.”
Here’s why. He got his PhD in 1966 at Columbia working with Stanley Schachter, who’s a major player in the history of social psychology. After graduating from Columbia, he began as a professor at Yale before moving to the University of Michigan, where he’s been since 1971. Along the way, his research has shaped the way psychologists think about introspection, errors in reasoning, how to improve reasoning, the deep role of culture in how we think, and intelligence.
In our conversation, we covered all of that ground and more. I’m excited to share it. I do want to mention that we had to jump through a couple technical hoops in this recording, so the audio isn’t quite to the standard I shoot for in this show, but I think you’ll enjoy our conversation all the same.
One quick note, though, about Nisbett’s long-time collaborator, Lee Ross. Although he didn’t come up in this interview, in the memoir, Nisbett’s affection for Lee is so clear. They had this really productive collaborative career, publishing several influential books together. Unfortunately, just a few days after we recorded this interview, Lee passed away. His work was great and influenced generations of social psychologists. So it just seemed prudent to mention this because Lee was such a prominent contributor to social psychology and a dear friend of today’s guest.
So, raise a glass to Lee Ross and settle in for my interview with Richard Nisbett on his long career spent…thinking.
Andy Luttrell:
So, I think we can start with the Nisbett and Wilson paper, and I call it that because to me that’s just the Nisbett and Wilson Effect, right? That’s just… That citation just stands out almost as the concept itself. And so, the notion is about lacking introspective awareness about cognitive processes, which on its own, just that is a little bit of a mouthful, so maybe you could give people some sense of what that means and how you happened across that idea.
Richard Nisbett:
Well, if you ask me how I voted on the new school millage proposal, I’d say, “Well, I value education, and I thought that salary should be higher here.” If you then as an opinion expert ask me, “Oh, do you suppose that you are influenced in that vote by the fact that the vote was held in a school as opposed to a church or anyplace else?” I would say, “You gotta be kidding. You think I would be influenced by something like that?” But I probably was. I just don’t know it. I don’t have a representation of that in my decision-making process.
I mean, judgment processes are like decisions in that you don’t want to see how they’re made, because it’s embarrassing, the range of things that can influence a given judgment or behavior. We don’t have a representation of those things in such a way that we can tell how they’re represented in the cognitive process. We just don’t have it. And many people say, “I’m okay.” But they don’t believe that for one minute. They believe, like I do when I tell people why I did something, that I’m telling the truth and that I’m doing it in part on the basis of inspecting what I was thinking about at the time. But I can’t see the actual cognitive processes.
Now, we don’t have any problem with being told that we can’t see memory processes or perception processes. Of course, we can’t. I can’t tell you the process that went on in my retina and in my visual cortex that allowed me to see something, but the fact that we do have lots of cognitive representation allows us to think, “Well, we have complete cognitive representation of cognitive processes,” which we absolutely do not have.
Andy Luttrell:
What stands out to you as the best evidence of that? In some ways, it makes some intuitive sense that… Yeah, well, how could I know exactly where I reached this conclusion from? But I don’t know, maybe I’m right. Maybe some people are right. How do we know that people just really can’t crack open the process that led to some outcome, judgment, decision, or whatever?
Richard Nisbett:
Well, my favorite experiment ever was done almost 100 years ago by N.R.F. Maier at the University of Michigan. He hung two cords from the ceiling and told the subject that his task is to tie them together, but they’re too far apart. You can grab one and you can’t quite get to the other. The room is strewn with things that can be used to do that, like an extension cord and so on, and subjects try a lot of these things, and then after the subject has been stumped for a few minutes, Maier, who’s been strolling around the room, hits one of the ropes and sets it to swinging. Typically, then, within 45 seconds, the subject ties an object to one of the ropes, sets it to swinging, runs and grabs the other rope, and then grabs the swinging rope and ties them together. At which point Maier says, “That’s great. How did you happen to come up with that?” And they typically answer, “I don’t know. It just occurred to me.”
He had… A lot of his subjects were psychologists who came up with answers like, “Well, I had imagery of monkeys swinging through the jungle. The imagery occurred simultaneously with the solution.” And if he asked, “Do you think the fact that I set the rope into motion had anything to do with it?” a lot of them said, “I didn’t see that, so that couldn’t have had an influence.” And so, there’s something going on in their heads, a cognitive process, the experimenter knows what was going on, the subject is totally blind to it and denies it when the experimenter asks them. And of course, we did many much simpler experiments. We showed a line of nylon stockings to women in a mall and asked them to judge the quality, and they were four times as likely to say that the last one they examined had the highest quality as they were to say that about the first one they examined.
And if you ask, which we did, just, “Do you suppose your judgment of quality was influenced by the position of the pantyhose, the nylon stockings in the array?” They look at you as if one of you is crazy. Either you’re crazy or I’m crazy because it couldn’t possibly have been it.
So, as you know, we had dozens of experiments like that, and interestingly we were not very good at predicting what would be the influencing thing. We knew in advance, “Oh, well…” The nylon stocking study was actually done because we had the stockings imbued with different aromas. And we thought the aroma would influence the judgments. The aroma had no impact. We had no guess that the position thing was gonna have any impact. And in general, in our experiments where we thought we had these clever expectations, which most people wouldn’t guess and most psychologists would get wrong, we were wrong; and where we were right, we weren’t original, because everybody thought what would happen would happen.
So, anyways, that’s the basic evidence that we drew on for the conclusion. But introspectively it’s… I mean, if you do your introspection right, you’ll realize you’re missing all kinds of things that are influencing you.
Andy Luttrell:
Yeah. I’m glad that you mentioned the stocking study because I was gonna ask specifically about that one, because that is the one that stands out to me for whatever reason. And in part, I think it’s because I’ve just been so curious for all these years where the thought came from that, “Oh yeah, people will pick based on the order that these stockings are laid out.” And actually, I just revisited the paper today to see like do they even say like, “Oh, there’s this theory of why,” but it seems like no. You were trying to do something else, like you said with the aromas, and you discover, right? Talk about lack of knowing what goes on under the hood.
Richard Nisbett:
Exactly.
Andy Luttrell:
And so, were you, this was at a mall, like hundreds of people coming to a table at a mall, and were you one of the ones who was coaching people through this or were these students?
Richard Nisbett:
Early on, and then Tim was the experimenter for most of it.
Andy Luttrell:
Okay. I just wondered if you had any firsthand witnessing people flabbergasted that they could possibly be influenced by the order?
Richard Nisbett:
We didn’t press our luck. We just asked, “Do you suppose you could have been influenced by that?” And you know, they gave us a funny look and said no. But we didn’t ask everybody.
Andy Luttrell:
And so, it also makes me think of in terms of doing research on psychology, oftentimes this starts with this introspective process, or at least the process we think is introspection, right? We go, “Huh. Why is it that I just did this?” I was just talking to Bob Cialdini a little bit ago and he talks about how he was influenced to buy candy bars at his front door, and he had to take a moment to go, “Well, why is it that I did that,” and concoct some sort of naïve theory. But until you test it, you don’t really know. So, in some ways, the enterprise of being a social psychologist opens you up to say, “Yeah, I don’t know that we do have this kind of awareness.”
Richard Nisbett:
That’s right. No, I mean sometimes people will ask me why I did this or why I think that, and if it’s a social psychologist I say, “Remember that I’m Nisbett of Nisbett and Wilson, so what you’re about to hear should not be given much credence.” You can’t say this to your ordinary [inaudible]. Don’t trust what I say. I mean, they think you’re crazy, so it’s just social psychologists that I can say that sort of thing to.
Andy Luttrell:
I have one other question about… This is just a one-off thing from the annals of the literature, which is the paper on intrinsic and extrinsic motivations for kids wanting to engage in some activity, either because they truly want to or they were rewarded, and the only reason I bring it up is because this one methodological detail has haunted me since I came across it, which is that some of the data had to be dropped or some of the class sessions had to be skipped because of “an unanticipated arrival of a goat in the classroom.” Do you have any memory of this at all?
Richard Nisbett:
Yeah. I don’t… I certainly didn’t have a spontaneous memory. I recognize it when you say that. Oh yeah, that’s in the paper. I was not there for the goat incident. It was Mark Lepper who was there.
Andy Luttrell:
Gotcha. Yeah, the results and conclusions are one thing and important, but I always just wonder. I need to get to the bottom of who this goat was and how he got in the classroom. Okay, so moving along a little bit, one of the things that was striking that I’d never quite taken the time to consider is the reaction to the work that you and others have done on errors in judgment and decision making. So, you and others have famously documented these errors that people make when they reason that they’re swayed by some information that ought not to sway them or they overlook information that they ought to be incorporating in their judgment. And you write in the book about this reaction to that that I wouldn’t have seen coming, which is not to say that these biases don’t exist, but that we shouldn’t be calling them errors and that to do so is to be judging people for faulty reasoning when they think that that’s not the case.
So, I’m curious if you could reflect a little bit on what that was like to be in the middle of and how you would characterize this particular kind of resistance to the work that you were doing?
Richard Nisbett:
Well, people were taught in graduate school we don’t judge people’s behavior. We don’t say what’s good or bad. That’s just… We just work here. We just report what happened. So, when somebody comes along saying people make errors in their reasoning, they say, “Well, that’s for philosophers and preachers. That’s not the job of psychologists.” But the reaction from philosophers was not sunny. I mean, these smart people, there were Oxford philosophers saying you can’t assert that people’s reasoning is wrong. It’s correct by definition. So, that was… It was shocking to see the resistance, and it went on for like a year or two, until this one Oxford philosopher wrote this completely crazy paper. There was a tremendous amount of commentary from philosophers and psychologists, and I assumed that psychologists would side with us. I wasn’t so sure about philosophers. In the end, all psychologists sided with us. People do make reasoning errors. And the great majority of philosophers did, as well.
Andy Luttrell:
In one sense, it seems like yeah, it’s in whose domain is it to say that this is right or wrong? But they were almost saying it’s our domain and we’re saying that this is totally fine reasoning. We shouldn’t call it erroneous. What… Could you sort of talk about why, like what they mean by that? What would that mean, to say we cannot call these errors in judgment if people are doing them?
Richard Nisbett:
Well, they made a false analogy to grammar. They said, “You know, people, they can’t tell you what their grammar is,” and that’s true. I mean, if you start asking me, “What are the rules of English grammar,” I just start producing gibberish. But I don’t make mistakes. I mean, very often. So, it’s that same way with reasoning. I mean, we can’t necessarily say what the rules are that we use to reason, but we don’t make mistakes except rarely, and when it is, it’s a glitch, and we can be corrected, and we understand. But it’s just a terrible analogy. That’s all.
Andy Luttrell:
Hey, at this point in the interview I’m about to ask for an example of judgment errors and Nisbett goes on to give some great examples, but they rest on two concepts that may not be super clear or familiar, so I just wanted to quickly fill you in before we get back to the interview. The first idea is the law of large numbers. It’s an idea from probability and statistics. Basically, this is the notion that individual observations are volatile but averaging across many observations gets you closer to a stable estimate of some probability. So, we all know that a coin has a 50% chance of coming up heads and a 50% chance of coming up tails. If I flip a coin just three or four times, I might be way off from that 50% probability. Maybe the coin comes up heads three out of four times, or maybe even all four times. Based on this limited set of observations, you might think I have some extraordinary coin that comes up heads more often than not. But keep flipping the coin 10 times, 100 times, 1,000 times, and you’ll find that the probability of it coming up heads ends up being a very snug 50%.
Nothing really changed about the coin over time that made it go from 100% heads to 50% heads. We just amassed more and more data that got us closer to the truth. But as our guest explains, people’s intuitions often fail to appreciate that many observations will naturally temper what can otherwise seem like extremes.
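If it helps to see the law of large numbers in action, here is a minimal simulation sketch in plain Python (standard library only; the function name and the sample sizes are just illustrative): with a handful of flips the proportion of heads can land far from 50%, but as the number of flips grows it settles close to the true probability.

```python
import random

def proportion_of_heads(num_flips, seed=0):
    """Flip a fair simulated coin num_flips times and return the share of heads."""
    rng = random.Random(seed)
    heads = sum(rng.random() < 0.5 for _ in range(num_flips))
    return heads / num_flips

# Small samples are volatile; large samples hug the true 50% probability.
for n in (4, 10, 100, 1_000, 100_000):
    print(f"{n:>7} flips -> proportion of heads = {proportion_of_heads(n):.3f}")
```

The same logic is what makes extreme early-season batting averages unsurprising, as the interview discusses below: a few at-bats behave like the four-flip case, not the hundred-thousand-flip case.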
Second is the concept of the sunk cost fallacy. Here, the idea is that people continue to do something even when it’s not working out just because they’ve already invested time, money, or effort. So, imagine you go to a movie theater and halfway through the movie you realize that you hate this movie. Should you keep watching it? Well, you might think, “I already paid for the ticket, I drove all the way out here, I should just sit here and finish the movie.” But the ticket cost and the driving time are sunk costs. You already invested them. You’re not gonna get them back. So, it’s irrational to use those things as an excuse to keep watching a movie that you know you hate. But we do this kind of thing all the time. That’s the sunk cost fallacy.
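To make the movie example explicit, here is a small illustrative sketch (the payoff numbers are hypothetical, chosen only to show the logic): a sound choice compares only what still lies ahead, and since the ticket price is the same whether you stay or leave, it can’t tip the decision either way.

```python
# Hypothetical payoffs, in arbitrary "satisfaction" units, for the movie example.
TICKET_PRICE = 12       # already spent: a sunk cost in both options
VALUE_OF_STAYING = -5   # two more hours of a movie you hate
VALUE_OF_LEAVING = 3    # doing something you'd rather do instead

# Subtracting the sunk ticket price from both options shifts the numbers
# but cannot change which option comes out ahead.
net_stay = VALUE_OF_STAYING - TICKET_PRICE
net_leave = VALUE_OF_LEAVING - TICKET_PRICE

better = "leave" if VALUE_OF_LEAVING > VALUE_OF_STAYING else "stay"
better_with_sunk_cost = "leave" if net_leave > net_stay else "stay"
print(f"Ignoring the ticket: {better}; including it anyway: {better_with_sunk_cost}")
# Both comparisons agree: the sunk cost is irrelevant to the choice.
```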
Okay. Hopefully, that gets us all on the same page and we can jump right back into my conversation with Richard Nisbett.
Could you give an example of an error in judgment, just for folks who may not really understand what this means that people… It might seem like of course we can’t make errors in grammar, but how could we make errors in judgment? According to the work that you and others have done.
Richard Nisbett:
Okay. Great question. If you ask college freshmen early on in the term, you say, “As you may know, there are frequently a lot of people early in the season, in the baseball season, with batting averages of .350 or higher. Why do you suppose that is? They’re very high, but nobody ever finishes the season with such a high batting average.” And they’ll say, “Well, the pitchers make the necessary adjustments,” or, “Maybe the guys get cocky and start screwing up.” If you ask them at the end of college that same question, they’ll say, “Well, you know, early in the season there haven’t been that many at bats and your first at bat your batting average is either zero or 1, so extreme scores are more likely if the sample size is lower.” Not everybody says that, but the majority do, and the great majority of people who’ve had a statistics course give you that answer.
So, it’s an error to fail to notice that the question has embedded in it a problem in the law of large numbers. And I like that example because it shows that our reasoning is correctable. Which, by the way, early on I did not believe. I mean, I used to say not only are we stupid, but you can’t make us smarter. And that again came from something that somebody told me in graduate school, which is that you can’t teach abstract rules of any kind. It’s just… So, I started teaching the law of large numbers, correlation principles, cost benefit principles, and to my astonishment, we could teach people in just a very few minutes an abstract rule like the law of large numbers, and they would apply it across a huge, indefinitely large number of domains and problems. Which was completely beyond what I thought was possible and I don’t know why it turned out to be so easy.
Like the sunk cost principle or opportunity cost concept, these are tremendously important rules of reasoning to understand, and people don’t understand them in general, but you can teach them like a couple of examples, and you can change people’s lives. I mean, it’s really important not to be trapped by sunk costs. And I mean, people, they go to an expensive restaurant, they order a meal, they think it’s gonna be great and it’s lousy, and it’s a lot of food, and they finish it anyway because it would be wasteful to not finish the meal. Well, that’s wrong. I mean, you know, that… You suffer twice. Once for the expensive meal and twice for eating it. You don’t have to do that. You can just say, “To hell with it,” you know.
Sometimes you win, sometimes you lose. And just this… Not much more than this much instruction would change the way people think. And we know because we call them up in the guise of an opinion researcher and give them some problem, like one in government, and if they’ve been through our experiment, they’ll recognize the sunk cost situation. Not always, of course, but with greater probability than if they haven’t had that instruction.
Andy Luttrell:
And that’s important, too. It’s not like a test at the end of the instruction, right? It’s a separate domain. You would never think that, “Oh, this is those people calling me.”
Richard Nisbett:
That’s right. And you know, I had this quarrel. I still have this quarrel with Danny Kahneman, who says, “Oh no. You can’t…” Somebody told him in graduate school you can’t teach people abstract rules, and probability and statistics and so on are such rules. I say, “But Danny, look.” No, no, it’s like it’s in an academic setting. They’re in an academic setting. That’s the only way they would… So, that’s when we started pretending to be opinion researchers. And that didn’t convince Danny. Don’t ask me why.
Andy Luttrell:
I was about to.
Richard Nisbett:
So, I don’t know. It’s very strange. Why is that? And the point I make in the book… Danny, I mean, he’s fabulous. He’s probably the psychologist whose work I respect most of any living person. But he can be wrong. I mean, it’s interesting. We’re wrong about a lot of things.
Andy Luttrell:
Especially as someone who studies errors in judgment biases.
Richard Nisbett:
That’s right. That’s right.
Andy Luttrell:
So, the last of your work that I wanted to talk about was your work on culture, which is actually kind of what I know you most for, so in my first year of grad school, for my winter break I got a copy of The Geography of Thought and that was my winter break reading. And it just… That was the thing that unlocked for me what culture can do, right? And how it reshapes even the parts of psychology that you’d go, “Oh. Well, those are the things that are just like part of how the brain works, right?” Processing information and that sort of thing.
And it has become now like anytime I teach, there’s always a little module on and here’s where cultural research says that this doesn’t always happen or it works somewhat differently. And you were kind of at the forefront, as far as I can tell, of looking to culture within social psychology as an important variable. Can you talk a little bit about what that turning point was like? My guess is sort of around the late ‘80s, early ‘90s is where some of that really started to take hold. You know, what was happening with culture before then and what does it look like now?
Richard Nisbett:
Well, the stuff that was going on with culture that I knew about was not very interesting. I mean, it just… Somebody would go do a dissonance experiment in France and it worked, or they’d do it in Japan, and it didn’t work, and I would say, “Oh. Well, the Japanese were not trained as social psychologists, so they just don’t know how to do these experiments, so that’s not a very interesting finding.” Or, you know, showing that opinions differ. Oh yeah, we know that. That attitudes and beliefs are very different across cultures, it wasn’t very interesting to just demonstrate that.
But I don’t know actually what phenomena about culture I was convinced would show big differences in reasoning or perception. I don’t recall what I first thought. But I decided I was gonna do research on it and I walked into Hazel Markus’s office one day and said, “Guess what, Hazel? I’m gonna teach a seminar on culture and psychology.” And she said, “No, you’re not. I am.” So, of course we taught it together and it is the most exciting educational thing I’ve ever been involved in. But-
Andy Luttrell:
And at the time, though, what did a course like that look like? It had to look different than it looks now, like now we have so much of that work.
Richard Nisbett:
That’s right. Well, Hazel I think was beginning to get some stuff showing that there were cultural differences in self schemas, the way people thought about themselves, and I had started my work on the culture of honor, showing that southern males were more likely to commit murder than northern males, and I restricted that finding to whites because I didn’t want the complication of race in there. And I started with that research because I knew I was gonna do research on culture and that was a dangerous thing for a white male to do. So, I said, “Well, actually I know some bad stuff about white males, so I’ll start with that.” Because I grew up in El Paso, Texas, where middle class people shot each other. I have a relative who shot her husband.
And one of the things that struck me when I went to the north is there are a lot of things I didn’t like about the north. I thought people tended to be rude there. I still think that, for that matter. But it became clear to me that middle class people didn’t shoot each other. I thought, “Well, that’s interesting.” So, 30 years later I demonstrated that that was the case. There was this culture and difference. It isn’t really a bad thing about southerners. I mean, it’s too bad, but they’re not bad people. It’s just they’re stuck with this attitude about how you have to deal with insults. You have to respond with force. Even I have it. I mean, I just… You’re stuck with it. It’s our culture that made us that way.
Andy Luttrell:
And it stands out interestingly to me, especially for its time, that the cultural lens was sort of within the U.S., different regions, and how regional differences within a country can give rise to cultural differences. Which contrasts a little bit with probably the stereotype of cross-cultural research, which is that it’s all about East versus West, and I know that has evolved and that it’s never really been entirely the case that it’s all that kind of comparison, but it does strike me that that’s often what cross-cultural research boils down to in textbooks and the way that it’s introduced. Do you have a sense? You talk about it a little bit, hint at why that might be the case, but is there a reason why that particular comparison took off in the research sphere?
Richard Nisbett:
Yeah. Shinobu Kitayama got his degree at Michigan, and he got to… struck up a lifelong relationship with Hazel and shortly after he graduated, they start… They had been having these conversations about differences between Japanese and Americans, and I had the same kind of experience. I went just by accident, I gave lectures on social psychology at Peking University in the early ‘80s, and there was a student there who could hardly speak English, but it was clear this guy was really, really smart, and several years later he came to get a degree in psychology at Michigan. And one day he says, “You know, Dick, you and I think very differently.” I said, “Yeah? Tell me more.” So, he essentially told me the whole deal.
Andy Luttrell:
Right there.
Richard Nisbett:
Right there. He says, “You know, you think linearly. I think dialectically. You incorporate very few elements in your thought for coming to a certain conclusion. I look at a huge range of things.” So, I said, “Well, okay, let’s…” I didn’t really exactly believe that, but I said let’s test it. We started testing and then it was astonishing how many differences. So, it started… Hazel’s work and my work were based on East Asians versus Westerners, and now of course most of that, the brand of research that we did and all kinds of others, as well, have been extended to various countries. And now, I would say that the difference is between the WEIRD countries and the non-WEIRD countries. WEIRD refers to Western, educated, industrialized, rich, and democratic peoples, and the great majority of other people are quite different. I mean, and different in different ways, but there’s a sort of standard Western way of thinking that’s quite different from other ways of thinking.
But you can show differences. We have a student we’re working with, Shinobu and me, who did a wonderful study looking at people in a particular Turkish village. There were three basic occupations there: farming, fishing on the open sea, and being a shepherd. Now, the first two of those occupations require substantial cooperation with other people, and I had reached the conclusion that it was this cooperative, harmonious stance that’s producing the attention to the social environment, and beyond that to the physical environment, and the particular forms of reasoning that were characteristic of interdependent cultures, and shepherds are out there by themselves.
So, she showed, sure enough, the shepherds look like Westerners and the farmers and fishers look like Easterners. I mean, it’s with respect to reasoning and perception. North Chinese are different from South Chinese, it turns out. North Chinese do a form of agriculture which doesn’t revolve much around cooperation. Rice agriculture is massively dependent on tremendous cooperation.
Andy Luttrell:
Yeah. I find that wave of work to be really compelling. It’s nice to see it given empirical attention, but this idea of like, “Where did these differences come from? Why would they be there?” And when you look historically and say, “Well, you know, the pressures that are on this group for hundreds of years, thousands of years, are likely to result in having to adapt to those pressures.” But by way of wrapping up, just to think big picture about some of these things, I’m curious where even the impetus to write sort of a retrospective, “Here’s all the stuff I’ve done and how it happened,” book came from. Was there a moment where you thought, “This is what I should be doing right now?” What was the moment and why did that seem like an important project?
Richard Nisbett:
Well, there’s been such a… Leaving out the culture work, especially the culture of honor work, everything else has been… It’s been about thinking, and reasoning, and what goes on in people’s heads. So, there’s a coherent theme in my work and I thought there’s a coherent theme there, and I wanted my work to get known to a wider public, because it takes a long time for anything to leak out of academic psychology into the public if it’s ever going to. But the other thing that made me want to do it, I wanted to see if I could write prose that would qualify as adding some literary value. And I succeeded in that I believe.
At any rate, Michael Lewis, the author, you probably know his work. He’s the one who wrote Moneyball, and who wrote a book explaining what happened in the 2008 crash, told me he had written a New York Book Review asking to review my book there because he really liked it very much. I said, “Oh, well. Okay. I succeeded in convincing a good writer that I’d written something of some value.” So, those were the two motives.
Andy Luttrell:
And particularly he’s someone who also… Michael Lewis has a book on Kahneman and Tversky, right? So, he knows that goal that you have of using social science in a broader sense. And Malcolm Gladwell’s name on the front of the book, another person who does that. So, following in footsteps trying to bring the work that we do as social scientists to a more general audience.
Richard Nisbett:
Right. Well, both of those guys were sitting on my shoulder when I wrote… Different shoulders while I was writing that book.
Andy Luttrell:
And so, having taken the time to look back on your career in sort of like a… from start to finish, day one to today, what advice might you have for someone either who is… think about a social psych grad student or think about someone who just will never go to grad school but is curious about understanding reasoning, right? How would you encourage someone to think about questions like this?
Richard Nisbett:
Well, if you’re gonna do research, just use the techniques of social psychology, which are incredibly powerful. And it’s been interesting to me over the course of my career, I learned these techniques of social psychological research, and it was so clear to me that the other sciences could use those techniques. I mean, economists have not a clue on how to assess… And you might be interested to know that Dick Thaler of Nudge, he got his ideas from reading the text written by Tom Gilovich and Dacher Keltner and me about Lewin, and he said, “God, this Lewin was just… He was… He really knew a lot of stuff.”
And he drew on Lewin’s ideas and Lewin’s methodology and comes up with behavioral economics. People sometimes say, “Oh, social psychology is sort of like behavioral economics.” Yes. Behavioral economics is social psychology with a name change for business reasons.
Andy Luttrell:
Well, I appreciate you taking the time to talk about this work and share it with a broader audience. It was great to meet you.
Richard Nisbett:
Good to meet you. I enjoyed it.
Andy Luttrell:
That’ll do it for another episode of Opinion Science. Thank you to Richard Nisbett for taking the time to share his work with us. His book, again, is Thinking: A Memoir. You’ll find a link to the book in the show notes along with links to some of the research we talked about and a full transcript of this episode.
To learn more about this podcast, go to OpinionSciencePodcast.com. Leave a nice review on Apple Podcasts, read the other nice reviews on Apple Podcasts, follow the show on Twitter, Facebook, LinkedIn. It’s nice to have you around.
Alright, I think that’s all I have to say. Thanks for listening and I’ll see you in a couple weeks for more Opinion Science. Buh bye…