
The Cognitive Bias That May Undo Democracy

We think we’re less biased than others. Here's why that matters.

Key points

  • We fall prey to the bias blind spot, believing our own judgments are less biased than the judgments of others.
  • We believe that others are biased by their experiences, whereas we are enlightened by ours.
  • The bias blind spot can help explain why partisans view the other party as not committed to democracy.

When you make decisions at work, how impartial are you? If you had to decide on a promotion between a friend and a mere colleague, would you be objective? Do you think others would be as fair as you?

Chances are, you just fell prey to the bias blind spot: the tendency to believe that our own judgments are less susceptible to bias than the judgments of others.

The Bias Blind Spot in Action

Finding fault with others but virtue in oneself runs strong. Consider that in one study, even highly educated, self-aware mental health professionals rated their own susceptibility to four specific cognitive biases in forensic evaluations as lower than that of their peers. In the corporate world, managers from Forbes 2000 companies and Swiss human resources employees rated their colleagues as more susceptible to bias in decision-making than they rated themselves. This propensity begins early in life: 7-year-olds say they would be less likely than their peers to pick an undeserving friend for an award just to make that friend happy.


The bias blind spot is especially problematic for judges, who are expected to decide on their own impartiality and whether they should disqualify themselves if they cannot be independent. Do we really expect powerful jurists to realize how their judgments may be compromised when, universally, we all have trouble seeing this in ourselves?

One contributing factor in the loss of confidence in the higher courts is the series of high-profile examples of judges failing to recuse themselves. Justices, though, seem unwilling or unable to acknowledge that they may be biased.

In one brazen case, West Virginia Supreme Court Justice Brent Benjamin failed to recuse himself in an appeal in which the losing party’s CEO had donated $3 million to his election campaign. Was it a coincidence that the justice then voted with the new majority to overturn the earlier decision, sparing the company a $50 million judgment? For his part, Justice Benjamin claimed that it was the evidence against him that was biased and that motions to disqualify him were filled with “surmise, conjecture, and political rhetoric.”

And who could forget the Supreme Court decision in Bush v. Gore that determined the 2000 presidential election in favor of George W. Bush? Sixty-six percent of Al Gore supporters, but only 31 percent of George Bush supporters, thought that members of the Supreme Court had been influenced by their “personal political views.” Put another way, almost all Bush supporters thought the court had acted fairly, while very few Gore supporters did.


The tendency to view our own identity-based connections as illuminating but others’ backgrounds as biasing can clearly be seen in a powerful experiment. College students were asked to evaluate various candidates for a campus-wide affirmative action committee. The white students believed they could be objective in considering matters of race, but generally wanted to exclude ethnic minorities from serving on the committee because they worried that minority backgrounds would make those candidates too biased, likely too pro-affirmative action. The ethnic minority students showed the reverse pattern: They believed their experiences uniquely qualified them to be enlightened members of the committee but considered white students too biased, reasoning that white privilege would disqualify them from accurately assessing matters of diversity.

A real-world example can be seen in the 1989 Chicago mayoral election, in which 94 percent of voters cast a ballot for the candidate of their own race. More than half of the voters believed that race was the determining factor for people supporting the other candidate, but only 8 percent would admit it as a factor influencing themselves. Recall, too, how partisans viewed Congressional behavior during the impeachment hearings of President Trump. Each side thought its team was nobly defending something (the Constitution, or a president from a partisan witch hunt) while the other team was clearly biased in its attacks on, or defense of, the president.

These studies begin to explain the increasing polarization and division experienced so strongly in contemporary America: fundamentally, we don’t trust those who are different from us, assuming that their judgments are dubious and subject to corrupting influences.

The Increasing Danger of the Bias Blind Spot

I suspect that the effects of the bias blind spot may be increasingly dangerous in an era when social media and specialized media proliferate and encourage information silos. Consider the following studies published within the last few months:

  • Susceptibility to false information: We believe that we are less likely to be duped by false information than others are. This is a general extension of the notion that while our own beliefs are well-founded, other people’s are shaped by inappropriate factors. Participants in one study were asked to indicate whether various statements (“Perceptual reality transcends subtle truth”; “Your teacher can open the door, but you must enter by yourself”) were profound or not. Some declarations were randomly generated from pseudo-profound buzzwords, while others were profound quotes from historical figures. Individuals who were especially poor at “bullsh*t” detection were highly overconfident in their ability; the lowest performers also believed that they were better at detecting BS than the average person. Individuals also believed that they were less likely than others to fall prey to the “truth effect,” which leads us to judge repeated statements as more true than new ones.

These studies suggest that overconfidence in one’s ability to detect misleading information could increase resistance to engaging in fact-checking, make people less willing to revise beliefs in the face of new information, and increase the rate at which they spread false information to others.

  • Perception of democratic values: Research with 2,200 Americans demonstrated the bias blind spot in political perceptions. Across two studies, Democrats estimated that the average Democrat values democratic characteristics 56 percent and 77 percent more than the average Republican, while Republicans estimated that the average Republican values democratic characteristics 82 percent and 88 percent more than the average Democrat.

Partisans most prone to this bias were especially likely to support anti-democratic practices, such as illegally redrawing districts to win more seats, using the Federal Communications Commission to restrict or shut down rival news programming, making it hard for the opposing party to run the government effectively, and harming the other party even at short-term cost to the nation. This was particularly true of Republicans.

In other words, believing that the other party does not adhere to democratic values emboldens individuals to sacrifice their own democratic ideals to counter the perceived threat—subverting democracy in the name of saving it.

Similar effects have been observed among state legislators. These elected officials vastly overestimated support for undemocratic practices among voters of the other party, but not among their own. When they received corrective information about the actual beliefs of the other party, the legislators were less likely to endorse undemocratic practices themselves.

The Bottom Line

To reverse the prevailing trend that sees American democratic norms under assault, we may need to look in the mirror and examine our own perceptions. Often, our own support for trampling democracy is rooted in misguided beliefs about the other party.

References

Druckman, J. N., Kang, S., Chu, J., Stagnaro, M. N., Voelkel, J. G., Mernyk, J. S., ... & Willer, R. (2023). Correcting misperceptions of out-partisans decreases American legislators’ support for undemocratic practices. Proceedings of the National Academy of Sciences, 120(23), e2301836120.

Ehrlinger, J., Gilovich, T., & Ross, L. (2005). Peering into the bias blind spot: People’s assessments of bias in themselves and others. Personality and Social Psychology Bulletin, 31(5), 680–692.

Littrell, S., & Fugelsang, J. A. (2023). Bullshit blind spots: The roles of miscalibration and information processing in bullshit detection. Thinking & Reasoning, 1–30.

Mattavelli, S., Béna, J., Corneille, O., & Unkelbach, C. (2023). People underestimate the influence of repetition on truth judgments (and more so for themselves than for others) [Preprint]. PsyArXiv. https://doi.org/10.31234/osf.io/5mwpz

Pasek, M. H., Ankori-Karlinsky, L. O., Levy-Vene, A., & Moore-Berg, S. L. (2022). Misperceptions about out-partisans’ democratic values may erode democracy. Scientific Reports, 12(1), 16284.
