Why We Often Stick to Our Beliefs, Even When We're Wrong

Introduction

We all like to think we’re open-minded and rational, but psychology suggests otherwise. A recent paper published in Perspectives on Psychological Science explores how many of our thinking errors—called cognitive biases—may actually share one simple cause: we believe in something first, then look for information that confirms it.

How Beliefs Shape What We See

Beliefs aren’t just strong opinions. They’re our personal versions of reality, shaped by experience, culture, and emotions. Once formed, beliefs act like filters. We tend to notice what fits them and ignore what doesn’t. This is known as “belief-consistent information processing.” For example, if we believe someone is lazy, we’ll likely notice every time they rest, but miss moments when they work hard.

Many Biases, One Root Cause

The paper suggests that different biases—like the spotlight effect (thinking others notice us more than they do), the false consensus effect (believing others agree with us), or in-group favoritism (thinking our group is better)—aren’t all separate problems. They often come from one belief: “My experience or opinion is the right one.” Because of this, we interpret the world in ways that match our views and expect others to do the same.

Why We Trust Ourselves More Than Others

Another common belief is "I make correct judgments." This belief leads to biases like the "bias blind spot," where we see others as biased but consider ourselves neutral. It also explains the hostile media effect: people with strong political views may see neutral news coverage as unfairly critical of their side, because they assume their own view is the accurate one.

Motivation Isn’t Always to Blame

It might seem like we do this because we want to be right or to protect our egos. But the research argues that motivation isn't always the reason. People can be biased even when they don't care about the outcome or when they genuinely try to be fair. Existing beliefs shape which information comes to mind and how ambiguous evidence gets interpreted, so biased processing can happen without any motive at all.

Can We Think More Clearly?

The paper suggests one promising solution: challenge our beliefs on purpose. Instead of only looking for information that supports what we think, we should also search for facts that might prove us wrong. This strategy—“consider the opposite”—can reduce bias in how we judge people, news, or decisions.

Conclusion

Beliefs are part of being human. They help us make sense of the world. But when we let them guide how we process all new information, they can lead us astray. By understanding how our minds lean toward belief-confirming patterns, we can start to question our thinking and see the world with more clarity—and maybe a bit more humility.

Reference: Oeberst, A., & Imhoff, R. (2023). Toward parsimony in bias research: A proposed common framework of belief-consistent information processing for a set of biases. Perspectives on Psychological Science. https://journals.sagepub.com/doi/10.1177/17456916221148147
