Laura Wallace is a social psychologist who studies what happens when people perceive a communicator as biased. In this episode, we talk about why bias is different from trustworthiness, how perceived bias affects a person’s ability to be persuasive, and how we think about biased communicators in general.

Episode Transcript

Andy Luttrell:

It’s hard to trust a salesperson. We bought a car last year, and the sales associate seemed to love every car on the lot, and he loved the expensive ones even more. So, were all those cars actually great, or was he just trying to make a sale? And was he flat out lying to us about how great these cars were, or was he just biased because he’s aligned with the Toyota family? We’re often suspicious of each other’s biases, and these concerns seem to have ramped up in the political sphere, especially related to how people view news media.

In a recent Gallup survey, US adults estimated that 62% of the news they read in newspapers, saw on television, or heard on the radio was biased. In fact, Elizabeth Jensen, public editor for NPR, wrote in 2017 that when they labeled the complaints that came in via email, bias was the label they used the most. She shared a few examples. “It is clear,” one listener wrote, “that NPR’s nationally syndicated shows, and in particular Morning Edition, continue to push a conservative viewpoint.” But another listener wrote, “I am writing to see why, objectively, there’s a clear anti-Trump, liberal-oriented bias against President Trump.” Yet another listener concluded their email by saying, “I can no longer listen to NPR because of its now blatant bias.”

Now, I’m not here to weigh in on whether a particular media source is actually biased or not, but it sure seems like people care a lot about each other’s potential biases. So, I’m curious, when do people come to perceive a communicator as biased? And does it change how much they believe the information itself? You’re listening to Opinion Science, the show about the science of our opinions, where they come from, and how they change. I’m Andy Luttrell, and today I’m excited to talk to my friend Laura Wallace. She’s a social psychologist currently finishing as a postdoctoral scholar at Ohio State University, and about to start a new postdoc at George Mason University. We talked about how bias is different from trustworthiness and her research on the roles that these two perceptions play in persuasion.

Andy Luttrell:

So, if you want to just start by giving a background on what do we know about source bias and what don’t we know yet about source bias?

Laura Wallace:

Sure. Yeah, so a lot of my work looks at when we see other people as biased and what are the consequences when we see another person as biased. And when I say source bias or seeing someone as biased, I mean that people perceive that person as having some sort of slanted opinion, usually because they’re motivated to view things in a particular way. So, you can think about grandparents being motivated to view their grandchildren positively. You could think about people with political biases being motivated to view things consistently with their ideology. You could also think about salespeople being motivated to view their product particularly favorably.

The work we have done on perceived bias has shown that perceived bias is different from trustworthiness or expertise, so one thing I think is really interesting and important is that you can actually perceive someone as biased without necessarily thinking that they are dishonest or inexpert. It’s just that they are viewing things a bit differently than what you think the truth is. So, you can think about the grandparents example again, where we tend to think of grandparents as truthful. They’re certainly highly expert. They know a lot about their grandchildren, and yet if you ask them how’s little Johnny at soccer, you know that you’re probably going to get a skewed view of Johnny’s soccer performance.

What’s interesting about source bias is that sometimes when we see someone as biased, it can make them less persuasive. But sometimes, it can actually make them more persuasive. Most of the time, in fact, it probably makes them less persuasive, but the cases where it makes them more persuasive have to do with when people switch positions. For example, with everything going on with coronavirus, you might think about these airlines or these restaurants that are now having to close. People who own those places might be motivated to view coronavirus as not a big deal, because they might want to be able to keep their businesses open, so you might view them as having a bias towards wanting to keep their businesses open around this time, so they might come out and say, “It’s not that big a deal. We should be able to continue operating.”

However, if they switched and said, “Oh, coronavirus is very concerning. We should now be closing this restaurant, or we should not be having so many flights,” that would be quite surprising, and it might make them particularly persuasive on that new position, because they would only take that new position if the evidence for it was particularly strong.

Andy Luttrell:

So, I know that you’ve had trouble convincing the world about bias and trustworthiness being separate things, so this is a nice chance to go on record and make the claim. So, in that case there, with someone who switches their position, what is it about that that signals bias that doesn’t necessarily have to do with trustworthiness?

Laura Wallace:

Okay. Yeah, so what I’m actually talking about is not an inference of bias based on switching, but what happens if you have the perception ahead of time, so a consequence of perceived bias versus perceived untrustworthiness. If I initially think you’re biased on the issue of coronavirus, and you start out saying it’s not a big deal, but then you switch, then I’m particularly persuaded. What you’re getting at, or what you’re helping me point out, is that initially perceiving someone as untrustworthy in this context does not have that same consequence. So, if I think you’re a dishonest person, it doesn’t matter if you switch positions, because dishonest people switch positions all the time, and it has nothing to do with whether there are good reasons for that position or not.

But if a biased person switches, it could only be because there is particularly strong evidence for that new position.

Andy Luttrell:

And so, that’s what people are thinking, that if they go, “I already thought you were biased in favor of whatever position you were saying originally and now you’re saying something else, I’m inferring that there’s a really good reason for you to have switched.”

Laura Wallace:

Yep.

Andy Luttrell:

Right, and in some ways, I take that reason with more faith that you have a good reason to switch than I would have if I felt you were dishonest in the first place.

Laura Wallace:

Yep. That’s exactly right.

Andy Luttrell:

In that way, do you think that trustworthiness feels more like it’s a trait? Like I’m picturing you as an untrustworthy person, and if I think you’re untrustworthy now, you’re probably always gonna be untrustworthy. But if I think you’re biased, it’s specific to this claim you’re making.

Laura Wallace:

Yeah, so to some extent, there may be truth in that on average, that people think about trustworthiness as more of a trait-like feature, and bias as more situationally specific. However, in other work we’ve done, we’ve shown that when people initially perceive someone as biased on one topic, they can carry that perception over even to unrelated topics. So, for example, someone might say that they are pro-gun control, and then later we ask people if they think that person has a biased position on chocolate, and they say yes if they thought the person was biased on the issue of gun control.

Now, this only happens if we don’t kind of remind them that those topics are kind of unrelated. Once we sort of remind people that gun control and chocolate don’t have that much to do with each other, they don’t carry the bias over, but without that reminder, they seem to. So, it does seem that at times, people do form this kind of general impression of bias, like you’re just the kind of person who doesn’t seem to need a lot of evidence for your positions. You’re the kind of person who lets your motivations really color your perspective on things, rather than trying to take a more objective kind of stance. And you may be able to think of somebody who you think just always has a bit of a slanted view of things, like they’re always letting their motivations color their view more than other people who you might think are really trying to be reasoned, and thoughtful, and correct for their own biases.

Andy Luttrell:

So, in that way, if I think of someone as a biased person, it’s not necessarily that I’m thinking they’re always being biased by the same motivation. It’s just that I think they’re the kind of person who lets their wants and needs color how they view the world.

Laura Wallace:

Yep, that’s right. And you can even think about like people probably vary in how much they try to correct for their own biases, so if I know that… For example, if I had a liberal kind of bias, I might be aware that that might make me more favorable to information that supported a liberal position than a conservative position, and when I encountered it, I might try to correct for a bias that I know I have. Whereas other people may not be worried about trying to do that bias correction at all.

Andy Luttrell:

I think, so bias is one of those terms that when I talk about it with students, for example, I realize psychologists have it kind of baked into their bones what bias means, and we kind of throw it around, and it totally communicates what we think we’re talking about. But I think the general use of the term maybe isn’t always consistent with that, so just kind of to unpack what bias is about and to see if we’re on the same page, it’s just, kind of like I said before, if any goal I have, or want or need I have is changing the way I view something. So, probably most of the time you’re talking about bias in terms of opinion bias, right? So, if I’m hearing someone’s opinion, for me to perceive that person as biased is me saying, “Well, the reason they have that opinion is really about something else that this person wants out of the exchange.” Is that… Would you say that?

Laura Wallace:

Yeah. I would say that’s similar to how I think about it. The most straightforward way to think about it is that there is some truth value. There’s some objectively correct thing, and we all have some sense of what that is. Our own sense of it may be accurate or not, but if I have an idea of what’s true, and you say something that is very different from that, I’m likely to perceive you as biased, because we’re not agreeing on what I think the truth is. To make it really concrete, think about distances, right? If I think the length of the park that we’re standing at is 10 feet and you think it’s 15 feet, I will perceive you as biased towards viewing the park as too long.

What’s tricky is we don’t often have truth values. We can’t often go get a ruler and figure out what is the actual length of the park. These are often matters of opinion, so we’re sort of guessing based on what our own opinions are, and I think to some extent, we’re also looking at how people weigh information. So, if I judge one study that supports my view more favorably, and a study that doesn’t support my view unfavorably, even if they have similar methodological flaws, I might think that person is biased. Even if I don’t know what the truth is, I can see that they are not taking a fair approach to arriving at some conclusion.

Andy Luttrell:

So, how common do you think these perceptions are? Like are people constantly… Any time someone voices their opinion, they go, “Hmm, is this a biased one or an unbiased one?” Because in some ways, what you get to catch in a survey is when you directly ask people like, “Hey, think about what this person says. Does this seem biased or not?” Is there any evidence we have, or even just some inner feeling about whether this is something that people go around going, “Hmm, bias, no bias? Bias, no bias?”

Laura Wallace:

Yeah. I mean, this is sort of personal speculation, but my sense is there are some domains where bias is really common and we’re kind of testing for it. So, politics, I think it’s quite common for people to perceive others as biased. Other interactions, like going in to buy a car or other big items, we’re probably testing to see whether a salesperson is biased. But especially if we’re sort of going about with our friends who generally agree with us, we might be less likely to be sort of mentally testing whether our conversation partner is biased or not.

Andy Luttrell:

You talk, too, that by and large, bias gets in the way of persuasion. Perceived bias gets in the way of persuasion. So, you described a time where, sure, if I think you’re biased, so long as you then change your mind, then you’re more persuasive.

Laura Wallace:

Yeah.

Andy Luttrell:

But typically, it seems like if I think you’re biased, your message is gonna carry less weight in changing my mind. Is that fair?

Laura Wallace:

Yep. That’s exactly right.

Andy Luttrell:

And so, why? Is it just that people don’t like… They want to punish you for being biased? Or they don’t accept that what you’re saying is valid or applies to them? I don’t have the same bias you do, so this doesn’t speak to me specifically? What’s going on?

Laura Wallace:

Yeah. So, I don’t think it has to do with liking. So, we’ve shown that at least in some circumstances, perceiving others as biased doesn’t necessarily affect how much you like them. So, for example, in one study that we’ve done, we told participants about aid workers trying to figure out how to respond at the start of an Ebola crisis. And we had one aid worker who had done his Peace Corps service in a particular region, and was then advocating to send resources to that region. People tended to see that aid worker as more biased than aid workers who did not have personal experiences with the particular regions that they were advocating get aid. Nevertheless, they liked all of the aid workers an equal amount.

But also, that aid worker who had the experience in the region he was advocating for was less persuasive, and I think that is because people are sort of saying, “I don’t think your opinion represents reality.” So, in that case, that aid worker’s motivation to help this region that he has a personal connection to is making him view that region as needing more aid than it actually does. And so, I think people are saying his perception is different from reality and I need to adjust for that person’s opinion.

Andy Luttrell:

So, I’m trying to think of… How do you take this and give a reassuring message to people who are advocates for things they care about, right? The people who are really strong advocates for an issue are probably the ones others would think are biased when it comes to that issue, right? And so, those just seem inextricably linked. Is there a ray of hope for people? I guess one idea I had was if you just say, “Listen, I may be biased, but…” I know there’s some work on that in prejudice, so if people say, “Listen, I know I have biases. We all have biases.” Even though that feels like it’s doing good and enhancing some sense of trust, it can actually backfire. People go, “Oh, you really do have a bias? I’m not gonna listen to you at all then.”

And so, in your sense, from like a public advocacy perspective, would it help to admit to your biases? Or would it hurt to admit to your biases?

Laura Wallace:

Yeah. That’s such a good question. It’s something that we’re testing right now, so I don’t have an empirical answer for you right now, but the hypothesis is that if people admit that they’re biased, it might signal that they are trying to correct for it in their advocacy, and so that bias might not be such a problem if people are willing to admit it.

There are other things people can do to try to mitigate their perceived bias. So, one is acknowledging another side of the issue, so if you just give an advocacy for a position and you only present your side, you’re more likely to be viewed as biased than if you acknowledge that there are some arguments for the other side, as well.

Andy Luttrell: 

In the work that you’ve done on bias, the persuasive messages that you’re showing people, do they tend to be ones that the audience already is amenable to? Or that would challenge their views? Because there’s part of that, when you said admit the other side, I think, “Well, if you’re on my side, I’m happy for you to be biased, because I’m already all in.” And for you to now start saying, “Well, maybe the other side has some…” I go, “No, now you’re biased, because now you’re caving to the other side.” So, just kind of the bird’s eye view of the research that you’ve done or that exists on perceived bias, are we mostly talking about preaching to the choir effects, or challenging people, or talking about totally new stuff?

Laura Wallace:

Yeah. So, there is a lot of work suggesting that people tend to see other people as biased when they disagree, so that effect is certainly there and has been well established for many decades at this point. The work that we’ve done has primarily used novel topics, so topics for which people probably don’t really have attitudes. So, in the study I was talking about with the Ebola outbreak, that was a novel situation, where people wouldn’t have had attitudes about how to allocate aid prior to coming into the study. We’ve used other topics, like university policies of various sorts, that people wouldn’t come into the study with opinions about.

I think it’s a really important question to try to understand more about how does perceived bias of the source play a role when people are coming into the study with previously held attitudes, and either really strongly agree with the source or really strongly disagree with the source. But we don’t have data on that right now, so that’s a really important open question.

Andy Luttrell:

Because you mentioned political domains before as being a place in which these perceptions of bias can be pretty rampant, but those are so juiced up with prior opinions that people are coming into it. You’re either with me or you’re against me.

Laura Wallace:

Yes.

Andy Luttrell:

And so, I guess maybe you could speak too to like in the wild, surely there… This is all over the place, these perceptions of bias, and as you’ve been doing this research, have you become more aware of accusations of bias in the world?

Laura Wallace:

Well, I do have a Google alert for biased, so I get lots of updates in my inbox of anytime someone perceives another person as biased, which it turns out is a lot. So yeah, I think it’s made me aware of… I think it’s made me more aware that this is quite common, although I also got interested in studying this because I thought these perceptions would be quite common. Related to the political domain, there is some work suggesting that if we think about it, politics is a domain that’s often moralized. There is some evidence suggesting that people are particularly likely to view others that disagree with them as biased when they see the issue as a moral one. So, if politics is a domain that people think of as particularly morally relevant, that could be one reason why perceptions of bias are so rampant in that domain.

Andy Luttrell:

What is it about morality that inspires these bias perceptions? We can be Coke people and Pepsi people, and I could still think you’re biased if you buy Pepsi instead of Coke. So if I moralize it, if it’s part of my moral identity, why am I more attentive to your bias?

Laura Wallace:

Yeah. I think it may have to do with the idea that bias is really about are we seeing the world the same, and if I think this is a moral issue, things just become much more black and white. And the degrees of what is correct and not correct are just… are sort of narrowed in some sense, and so if you disagree with me on an issue that I think is moral, we just are not seeing the world the same. And of course, my way of seeing the world is the correct one, and so yours must be the wrong one, which you could only have reached through biased thinking.

Andy Luttrell:

So, is there any hope? Because, like the premise of social psychology for 100 years has been people are biased, right? We see the world the way we want to see it and so there’s no getting around that, and you said that… You framed the conversation about politics, not being able to see eye to eye through perceived bias, what’s the hopeful message there for people?

Laura Wallace:

Yeah, so I think it’s twofold. One is as you point out, we all have biases, and that is just sort of a part of being human, is that we’re prone to biases. So, of all the perceptions you have of others, seeing them as biased is really not as bad as seeing them as dishonest or even as stupid. Unfortunately, when people disagree with one another, we have some evidence that they not only see disagreeing others as biased, but also as dishonest, and lacking expertise, and stupid, and dislikable, and I think those perceptions also feed people’s inability to have conversations when they really disagree. And so, the fact that we have shown that people can think about, can perceive others as biased without perceiving them as untrustworthy, or lacking expertise, or dislikable, I think is hopeful.

I’m interested in seeing if we can remind people like, “Hey, just because someone is biased doesn’t make them dishonest. It doesn’t make them inexpert. It doesn’t make them a bad person.” And we all have biases. That’s just a part of being a human, so if we can separate the perceptions and remind people that biases are part of being human, I wonder if that is a way that we can try to open up dialogue across difficult conversations.

Andy Luttrell:

I get the impression, too, that in the last 50 years of research on persuasion, we’ve spent a ton of time thinking about how expert a source seems, or about how trustworthy a source seems, but relatively little about whether we have a perception of bias. So, that’s my impression looking back over all the research. Having been steeped in this a little more, does that seem a fair characterization, or how would you explain sort of the trajectory of why we care about bias more now than we did before?

Laura Wallace:

Yeah, so I would say that’s exactly right and that’s exactly why I did this research, is for 60 years in persuasion research, researchers have primarily cared about what makes a source credible or believable, what makes them the kind of person that you’re gonna believe what they have to say, and researchers have identified two factors. They need to be trustworthy or truthful, and expert or knowledgeable. And the work that I’ve done is to try to say, “Well, also perceiving someone as biased can undermine their credibility in addition to these other two.”

And we hadn’t identified that bias could undermine credibility because persuasion researchers had not really studied it, so why would that be the case? There are some things that can lead people to perceive others as both dishonest and biased, specifically vested interest. So, when someone has a vested interest, it means they can get something out of successfully persuading you. So, for example, you can think of a salesperson, if they sell you something, they get commission. And the fact that they have this vested interest might lead people to view them as dishonest, right? They might be willing to lie in order to sell you something. It might also lead people to perceive them as biased, that is even if they are the most honest salesperson in the world, the fact that they would benefit from selling you something might motivate them to view their product especially favorably, and in fact there’s some psychological research suggesting that yes, in fact, once you own something or you want to sell something, it makes you view it more favorably than if you’re not in that situation.

However, for 60 years it was very common to use vested interest as a means of studying trustworthiness, because it does affect dishonesty, and so researchers hadn’t thought about bias as separate from untrustworthiness or dishonesty and had lumped them together. And so, we have been kind of missing that piece of the puzzle, and so what my work has done is try to show that in fact there are lots of times when we think someone is very trustworthy, and yet we also think of them as biased.

Andy Luttrell: 

What, just if I just back pedal a little bit, why all this? What is it about this topic that makes you interested in it? I like to unpeel why… If it’s not bias, even just like persuasion, what… I mean, you’re now a dyed in the wool persuasion psychologist, so what got you here? And was it where you thought you were gonna go?

Laura Wallace:

Yeah. Good question. So, in college I was doing a lot of work around sustainability issues, but also other social justice issues, trying to address race and gender gaps in a variety of domains. And I was not very successful at persuading others, particularly around the work I was doing on sustainability and climate change, and I thought, “Boy, it would be really helpful if I knew how to persuade other people better.” And I found out, “Oh, there are in fact people who do empirically study persuasion.” I think the perceived bias… I didn’t go into grad school necessarily thinking like, “I’m gonna study perceived bias.” But I do think it naturally fell out of my interests, because as you mentioned earlier, when people really care passionately about something, they can be perceived as biased, and I think I had lots of experiences of feeling like someone dismissed me as biased, and that got me interested in this topic, as well as persuasion more generally.

Andy Luttrell:

Have you thought of doing any work on what it feels like to be dismissed as biased? Because I feel like that is a very… I guess it’s an ever-present experience, but I feel like online now, when you see accusations of bias flying, we’ve been looking at one side of it, or you’ve been looking at one side, right? Why are people flinging accusations of bias? But how do people react to being called biased?

Laura Wallace:

Great idea! Let’s do that study. I mean, so there is… Kathleen Kennedy and Emily Pronin have some work on the spiral of conflict, where they do talk about… It’s a little different, but it’s related, where they talk about if I see you as biased, I don’t think you’re quite as rational, and so if we’re in conflict, I will respond more competitively than cooperatively with you. And when I respond competitively with you, that makes you perceive me as more biased, and makes you respond more competitively to me. And so, then we have this nasty spiral going.

You can imagine that if I think someone has just dismissed me as biased, that would play into that, where that doesn’t make me feel warm and fuzzy towards that other person. It makes me feel defensive.

Andy Luttrell:

And I wonder if we go back to trustworthiness and bias being different, if you call me untrustworthy versus if you call me biased, that does feel a little different, right? Because in some ways… Well, they both make you reflect on a different part of yourself, and either you feel secure that you’re trustworthy, or you feel secure that you’re evenhanded, and both of which are probably not always true for everyone. We probably all feel like we’re a little less biased and more trustworthy than we actually are.

Laura Wallace:

Yeah. I definitely… So, I also think this is a very interesting future research direction, but my own intuition is the same, that being called untrustworthy is much worse than being called biased. And in fact, I think there are times when we really ourselves want to be biased and when we want others to be biased. So, if I ask my partner, “Do I look fat in this?” I’m hoping that he sees a better version of me than might be there, right? I want him to be biased and I want him to be honest about it, right? I don’t want him to lie, I just want him to view me through rose-colored glasses.

Andy Luttrell:

I could see that also in a political climate, too, where you’re like, “I would love to just live in this place where everybody is all leaning in the direction, and that would be great.” And I want everyone to buy into it, or like you say, “I want them to be trustworthy about it, but I’m happy to have people lean in my direction so that I can sort of feel some validation that truly this is how the world is around me.”

Laura Wallace:

Yep. I think that’s right.

Andy Luttrell:

So, have you taken your initial interest in public advocacy back, like now that you’re an expert, I think we can… There are documents that say you’re an expert in psychology. Has this shifted any way in which you engage in public advocacy?

Laura Wallace:

I think to some extent. Maybe this is an ironic effect. I think it has made me less engaged in advocacy than I used to be, because I’m very concerned about being perceived as biased, and I want people to take my science seriously. So, I think that may be a bit of an ironic effect, but it does also I think make me try to be more measured when I do advocate for something. So, acknowledging that there is another side, acknowledging that when I support something, I understand that there are often tradeoffs that come with whatever position I support. So, yeah, that maybe is an ironic effect, but I think it has just made me more careful about what I’m saying publicly.

Andy Luttrell:

Nice. Well, I think that’s the time we have together, so thanks so much for talking about bias, and we’ll keep in touch after this, I know.

Laura Wallace:

Very good. Thanks, Andy.

Andy Luttrell: That’ll do it for this episode of Opinion Science. Thanks to Laura for coming on, and to learn more about her work, check out the show notes for a link to her website. For more about this show, visit OpinionSciencePodcast.com or follow us @opinionscipod on Twitter or Facebook, and this is officially episode five, which if you’re keeping track means that there are episodes one, two, three and four available, so go check those out if you haven’t already. And you know, while you’re already on Apple Podcasts downloading the archives, might as well rate and review the show. Help people find us. Okay, I’ve run out of things to say, so I’ll be seeing you here next time. Bye-bye!
