Daniel Stone is a behavioral economist exploring partisan divisions in America.
I’m Isaac Saul, and this is Tangle: an independent, nonpartisan, subscriber-supported politics newsletter that summarizes the best arguments from across the political spectrum on the news of the day — then “my take.”
Today's newsletter is a subscribers-only Friday edition.
Why do we hate each other?
That question is at the center of a new book called Undue Hate, written by Daniel Stone, who explores the ways our own biases, psychology, and belief formation are driving us apart not just in the political realm, but in life more generally.
Stone’s thesis is that partisans in America dislike people they disagree with excessively, for a variety of reasons, but that dislike is often driven by mistaken beliefs and incorrect assumptions. To find evidence for his thesis, he reviewed studies on the accuracy of people’s beliefs about opinions held by members of the other political party. And what he found might surprise you: We are not particularly good at understanding our opposition.
We’ve touched on this issue in the past in pieces that comment on the “perception gap” — the difference between what we think the other side believes and what they actually believe — which Stone references in our interview below. For example, if you ask Republicans whether they agree with the statement “properly controlled immigration can be good for America,” about 80% say they agree. But if you ask Democrats to estimate how many Republicans agree with that, they guess about 50%.
Likewise, if you present Democrats with the statement “most police are bad people,” about 80% say they disagree. But if you ask Republicans to guess how many Democrats disagree with that statement, they think it’s less than 50%.
In our interview with Stone, we discuss not just this phenomenon but its implications. What happens when we think people are worse than they are? How do we react? How does it further shape our beliefs? And how does it stop us from having dialogue with people on the other side?
Our conversation has been lightly edited for clarity and length. I hope you enjoy it.
Isaac Saul: I came across your work by reading a piece you wrote in The Conversation, which I thought was fascinating. It was about the way we think about our political opposition and some of our misconceptions about them. But I'd love to just start with some of the basics. I don't know if I know what a “behavioral economist” is [laughs]. Maybe you could start by explaining that to me and a little bit about how your work touches politics?
Daniel Stone: Yeah, sure. I'll take a shot at that. Behavioral economics is the combination of psychology and economics. It incorporates more realistic psychology into the study of decision making and belief formation. So standard or neoclassical economics is sort of famous or infamous for assuming that people are sort of robots, rational maximizers or optimizers — in other words, that people make perfect decisions all the time. And also that we're great with statistics, that we incorporate new information in a statistically optimal way in forming beliefs under uncertainty. And behavioral economics is the attempt to model and understand the way people actually form beliefs and make decisions in a more psychologically realistic way.
Isaac Saul: Are you telling me that people don't take evidence into account when they're making decisions about how they feel on certain issues? [Laughs]
Daniel Stone: Yeah, right. So sometimes we ignore evidence more than we should, and sometimes we overreact to evidence. And sometimes we make both mistakes. So it's kind of obvious that people screw up, but figuring out the specific patterns and the ways we screw up is not so obvious. We're still working on that.
Isaac Saul: One of the things that you wrote about in this piece, and I know is central to your book, is the “affective polarization bias,” which even as a political reporter and somebody who writes and thinks a lot about political biases and polarization was a term that was new to me. And I'm wondering if you could maybe explain what the affective polarization bias is and how it works.
Daniel Stone: No shame in that term being new to you, because it's a new term. It's a term I introduced in my book. Everyone's hearing about it for the first time now, for better or for worse. The term that you probably are familiar with is affective polarization. That's the term for emotional polarization. So rather than parties or people or whoever being polarized in terms of what they believe about an issue, affective polarization refers to polarization of feelings — people just growing to dislike or feel hostility toward those they disagree with. So affective polarization bias is a term that I suggest for the idea that we actually tend to become excessively affectively polarized as compared to how much we should be by some objective standard.
So in the past, it has been implied that affective polarization is inherently irrational. But it's not clear that it's necessarily irrational to dislike another person, right? And in fact, some people might think that interpersonal feelings are something that can't be evaluated with respect to rationality, right? Feelings are just feelings, so we can't say if they're right or wrong. So people seemed to think affective polarization was bad, but that we couldn't deem it to be objectively wrong and objectively biased.
But in the book, I argue that actually, yes, we can say that sometimes we are objectively too affectively polarized. Because our feelings toward other people are based on our beliefs about who they are, about the actions they take, and the opinions they hold. And if those beliefs are factually wrong — and they can be, right? — then we can be too polarized.
I might believe that you like to kick dogs when you walk by them for no reason at all. And perhaps that’s not true at all. But if I believe that, that might cause me to dislike you. And if it’s a false belief, I’ll dislike you more than I should.
So given that we can have false beliefs driving our feelings, we can dislike people more than we should. We can like people more than we should. Affective polarization bias refers to the bias toward disliking people more than we objectively should — more than we should if we had accurate beliefs about their character traits and actions. So I claim that we generally have this bias toward people that we disagree with about political issues, but it also arises from disagreements on non-political issues.
Isaac Saul: So when you guys go about doing this research, it seems like there's kind of a fundamental underpinning here, which is that people misunderstand the folks whose politics they disagree with. Is that a fair reading of what you've sussed out?
Daniel Stone: That's one of the major findings which supports the claim that affective polarization bias is a real phenomenon. So I think you're aware of a lot of this research. We tend to overestimate the extremism of the other side. We overestimate their consistency and homogeneity. We think they're all one crazy bad type and they're actually more diverse than we realize. We also misunderstand and overestimate how much they hate us. They don't hate us as much as we think. We don't like people that hate us, and so if we overestimate their hate, we like them less.
Another type of evidence I introduced in the book is that we overestimate how selfishly they act in experiments where people face choices between actions that benefit themselves but hurt other people, and actions which are more prosocial and help everybody in the experiment, but maybe don't help the individual making the decision quite as much.
There have been a decent number of social science experiments on this type of thing where people are brought into the experiment and we ask them if they are a Democrat or Republican. We tell them, ‘Make choice A or B; if you choose A, you get a real payment in dollars and other people get another payment, and choice B gives different payments.’ And some experiments have asked people, well, what do you expect people in the other party to do in these experiments? If you're a Democrat, do you expect Republicans to do the selfish thing or the prosocial thing? If you're a Republican, do you expect Democrats to do the selfish thing or the prosocial thing? And there's evidence that in these experiments, people also misunderstand how unselfishly the other side acts, even when people are given bonus payments for guessing this accurately.
So in the experiment, you can't just badmouth the other side for fun. I'm going to pay you a little bit more if you accurately guess the choices made by the other party in this experiment. And even when people are paid for accuracy, they're too pessimistic about the other side, which shows part of our hostility toward the other side is based on a genuine misunderstanding of who they are. So there are a bunch of types of data that support this point, which I think is intuitive to most people. Most people think, “Yeah, of course we misunderstand the other side.” But the range of data verifying it is pretty impressive.
Isaac Saul: How wide is that gap? When you talk about something like prosocial versus antisocial behavior, are we talking about 5% to 10%? Or are we talking about 30%, 40%, 50%? What's the range of misunderstanding that you guys see in this research?