How partisans see facts through different eyes
Those who want to restrict travel from Muslim countries or sales of assault weapons use one rationale to buttress their arguments and a different one to dismiss their opponents', according to new research from the University of Colorado Boulder
In our politically polarized country, we often hear this common refrain: If people on both sides of the aisle could simply look at the same facts, they'd be able to see eye to eye, have measured discussions and enact reasonable laws and policies.
But new research from the University of Colorado Boulder, published in March in the journal Cognition, suggests that when people with differing political views are provided with the same statistics, and they believe that those facts are accurate, they prioritize the information differently, based on their existing opinions.
The findings provide one explanation for the sharply diverging opinions people have when it comes to polarizing policies: we're literally perceiving the facts differently.
"We often assume when dealing with the partisan divide that if we could give everyone the same information and get them to believe in the accuracy of that information, we would reduce partisan conflict," said Leaf Van Boven, a CU Boulder professor of psychology and neuroscience. "But what this research shows is that even when you give people the same information, they can have very different partisan reactions to that underlying information."
Van Boven and a team of co-authors set out to study our reasoning using "conditional probabilities," or the likelihood that something will occur based on one or more conditions having occurred.
More specifically, they looked at what statistics people considered to be the most important when contemplating policies that restrict broad categories of people or actions to lower the risk of rare events, such as a travel ban for immigrants from majority-Muslim countries to reduce terrorist attacks and a ban on the sale of assault weapons to curb mass shootings.
The researchers were inspired, in part, by how politicians and pundits used conditional probabilities to defend restrictive policies. After the Sept. 11 terrorist attacks, for example, conservative commentator Ann Coulter advocated for the expulsion of Muslim immigrants from the country, writing that "all terrorists are Muslims."
"We're particularly interested in these policies involving rare events because policy makers often use conditional probabilities to explain why a policy is useful, and there's a lot of research suggesting that people actually have a tough time thinking about conditional probabilities," said Jairo Ramos, a CU Boulder graduate student in social psychology and one of the study's co-authors.
When evaluating the effectiveness of a policy intended to reduce terrorism, for instance, it鈥檚 more relevant to consider the vanishingly small fraction of Muslim immigrants who commit terrorist attacks, rather than the fraction of immigrant terrorists who come from Muslim countries, the researchers write.
In essence, because the percentage of Muslim immigrants who are terrorists is extremely small, banning all Muslims from entering the country would reduce an extremely small threat to a somewhat smaller threat, the researchers explain. And yet, many people are motivated by statements like the one made by Coulter, that the proportion of immigrant terrorist attacks committed by Muslims is relatively high.
The researchers used a less-polarizing example to clarify this point: professional basketball players. While a majority of NBA players are African American, only a small fraction of African American males play in the NBA.
"You would never try to look for potential NBA recruits by starting with all African American males, because that would be an incredibly wasteful strategy," Van Boven said. "But it's really the same thing we do when we first look at Muslim immigrants."
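The reversal is easy to see with a few lines of arithmetic. The counts below are rough, assumed figures chosen only to illustrate the gap between the two conditional probabilities; they are not from the study:

```python
# A minimal numeric sketch of the base-rate reversal Van Boven describes.
# All counts are rough, assumed figures for illustration, not from the study.

nba_players = 450                        # assumed: ~30 teams x 15 roster spots
nba_african_american = 330               # assumed: roughly three-quarters of players
us_african_american_males = 20_000_000   # assumed order of magnitude

# P(African American | NBA player): a large majority
p_aa_given_nba = nba_african_american / nba_players

# P(NBA player | African American male): vanishingly small
p_nba_given_aa = nba_african_american / us_african_american_males

print(f"P(African American | NBA player)       = {p_aa_given_nba:.0%}")
print(f"P(NBA player | African American male)  = {p_nba_given_aa:.5%}")
```

Whatever exact counts you plug in, the first probability is large while the second is tiny, which is why "start with all African American males" is a hopeless recruiting strategy even though most NBA players are African American.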
Considering probabilities
To explore this phenomenon in the context of politically polarized policies, the researchers asked more than 500 American adults to review a list of statistics related to terrorism or mass shootings, then select which statistic they considered the most important for evaluating policies meant to reduce the risk of those events. Participants also considered this question from the perspective of an unbiased expert and someone with an opposing viewpoint to their own. (Co-authors in Israel ran a similar study using a policy to expel asylum seekers from Tel Aviv to reduce crime.)
As suspected, participants selected the probability that supported their existing stance on the policy.
For example, when considering a Muslim travel ban, a supporter of the policy was more likely to point to the fact that 72 percent of immigrants who commit terrorist attacks come from Muslim countries. But an opponent of that same policy was more likely to prioritize the fact that the probability an immigrant from a Muslim country is a terrorist is just 0.00004 percent.
Similarly, someone who supported an assault weapons ban placed more value on the fact that two-thirds of mass shootings were committed by people who owned assault weapons. An opponent of that same policy pointed to the probability that of the 12 million American adults who own assault weapons, just four committed a mass shooting.
Adopting the perspective of an unbiased expert mitigated some of this polarization, though not completely.
Importantly, the researchers found that both Democrats and Republicans placed more emphasis on probabilities that aligned with their existing views. That's because people tend to approach these questions like "intuitive politicians" rather than statisticians, Van Boven said.
"People are basically starting with the outcome they would like and then looking for evidence to support that outcome," he said. "When they stop and think like an expert, they can interrupt that process a little bit. When you think like an expert, we speculate that you first ask what the evidence shows: you start with the data and then reason from that perspective."
Another important takeaway from the experiment is that people can agree about the relevance of probabilities and still have different policy stances, based on their own underlying values. For instance, even if someone recognizes that the majority of assault weapons owners do not commit mass shootings, they might still want to ban assault weapons.
"Someone might say that the value of reducing mass shootings even by the smallest amount is worth requiring all assault weapons owners to give up their weapons, or that any reduction in the number of terrorist attacks is worth banning all Muslim immigrants from entering the country. Those might still seem like worthwhile tradeoffs to someone," Van Boven said.
Acknowledging biases
Van Boven suggests ways to apply these new findings to our own lives: For starters, if you're trying to be more open-minded and less biased during political discussions, try mentally stepping into the shoes of an unbiased expert or statistician. Look at the numbers and see where they lead you.
"Even in these very emotional, high-conflict partisan topics, we can and should approach it through the lens of statistical reasoning," Van Boven said. "We really should start by asking what is the likelihood of these kinds of risks?"
Another real-world takeaway: Acknowledge that we all process statistics in this biased way, not just people on the other side of the table.
"On these worrisome, pressing issues of the day, we end up stuck in inaction because of the tendencies we all have," Van Boven said. "It's very easy to blame the other side and say, 'They're not thinking carefully and they're being irrational and unreasonable and that's why we can't have sensible policies.' But really, it's all of us."