Let me give you a scenario you’ll recognise immediately. Your team has spent three months on a strategy. The data was clear, everyone agreed. Six months later, results disappoint. In hindsight, the warning signals were there the whole time. But nobody saw them. Because nobody was looking.

This is confirmation bias at work. And it’s one of the most destructive thinking errors I know - precisely because it’s so invisible. You don’t notice it while it’s sabotaging your decisions. You feel the opposite: certainty, alignment, confidence. Everything seems to check out. Until it doesn’t.

Confirmation bias is the tendency to selectively seek, interpret and remember information that confirms existing beliefs. In the workplace, it sabotages hiring, strategy and team decisions by making warning signals invisible. The solution isn’t awareness - it’s redesigning decision-making processes: structured interviews, pre-mortems and the SUE Influence Framework.

What is confirmation bias?

Confirmation bias is what happens when your brain takes a shortcut. Instead of weighing all available information objectively, it filters for evidence that confirms what you already believe. It’s a classic System 1 process: fast, automatic and completely below the radar of conscious thought.[1]

Psychologist Peter Wason showed back in the 1960s that people systematically try to confirm their hypotheses rather than refute them: in his famous 2-4-6 task, participants almost exclusively proposed number sequences that fit the rule they had in mind, and rarely tested sequences that could disprove it. But the implications reach far beyond the lab. In the workplace, it shows up everywhere.

You search selectively. You Google “why remote work is productive” instead of “is remote work actually productive?” You read the article that proves you right and skip the one that challenges you.

You interpret with a filter. When a candidate gives a vague answer in an interview, you read it as thoughtfulness if you already like them - and as incompetence if you don’t. Same information, different conclusion.

You remember selectively. After a meeting, you remember the three arguments that supported your position. That one pushback from your colleague? Gone.

Everyone in your organisation - from intern to CEO - is susceptible. The question isn’t whether confirmation bias influences your decisions. It does. The question is whether you’re willing to design your decision-making environment so it can do less damage.

People don’t reject new ideas because they’re wrong. They reject them because they don’t confirm what they already believe.

Three situations where it does the most damage

The hiring decision you make in 30 seconds

At SUE we receive applications every day. I know from experience how easy it is to fall into the trap. A candidate walks in, they’re funny, have an interesting background and - oh - worked at a company I admire. Within 30 seconds I’ve formed a positive impression. The rest of the conversation becomes an unconscious confirmation exercise.

This isn’t a personal weakness. This is how the brain works. Research shows most interviewers form their judgement within four minutes. The remaining time isn’t used to evaluate. It’s used to confirm.[2]

The result? You hire on affinity, not competence. The candidate you like gets softer questions. The one you’re less drawn to gets harder ones. And you’ve convinced yourself you conducted an objective interview.

The fix is surprisingly simple: structure the interview. Same questions, same order, for every candidate. Score answers against pre-defined criteria. And withhold your final judgement until all the data is in. This isn’t about eliminating intuition. It’s about stopping intuition from hijacking the process before you have the facts.
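To make the mechanics concrete, here is a minimal sketch of what such a scorecard could look like in Python. The criteria names, weights and 1-5 scale are purely illustrative assumptions, not an actual hiring rubric:

```python
# A minimal sketch of a structured interview scorecard.
# The criteria, weights and rating scale below are hypothetical
# examples, not a real rubric.

CRITERIA = {                 # criterion -> weight (weights sum to 1.0)
    "problem_solving": 0.4,
    "collaboration": 0.3,
    "domain_knowledge": 0.3,
}

def score_candidate(ratings: dict[str, int]) -> float:
    """Weighted average of per-criterion ratings on a 1-5 scale.

    Requires a rating for every pre-defined criterion, so a vague
    answer can't be quietly skipped for a candidate you happen to like.
    """
    missing = CRITERIA.keys() - ratings.keys()
    if missing:
        raise ValueError(f"unscored criteria: {sorted(missing)}")
    return sum(CRITERIA[c] * ratings[c] for c in CRITERIA)

# Every candidate is rated on the same criteria, in the same order.
# Totals are only compared once all interviews are done.
alice = score_candidate({"problem_solving": 4, "collaboration": 3,
                         "domain_knowledge": 5})
bob = score_candidate({"problem_solving": 3, "collaboration": 5,
                       "domain_knowledge": 3})
```

The point of the sketch is the constraint, not the arithmetic: because every criterion must be scored for every candidate, the interviewer can't let an early positive impression decide which questions get asked and which answers get weighed.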

The strategy nobody dares to challenge

I’ve seen this dozens of times in our Behavioural Design Sprints. A government agency wants to increase citizen participation. The assumption: people don’t engage because they’re not well informed. The solution: a communication campaign. Brochures, drop-in sessions, a new website.

And then something fascinating happens. Once the team commits to this hypothesis, it unconsciously filters all feedback through the lens of “information deficit.” When citizens still don’t participate, the team concludes more information is needed. Clearer language. Better channels. More explanation.

The possibility that the real barrier is something entirely different - distrust, complexity, a sense that it won’t make a difference anyway - is never seriously investigated. Not because the team is incompetent. But because confirmation bias does exactly this: it makes invisible whatever doesn’t fit with what you already believe.

This is also exactly why at SUE we always start with the Influence Framework before a team chooses a strategy. It forces you to look beyond the obvious drivers (Pains and Gains) to the hidden blockers: the Comforts keeping people in their current behaviour and the Anxieties holding them back from new behaviour. When you systematically look for reasons your strategy might fail, confirmation bias simply has less room to operate.[3]

The team that nodded its way to a bad decision

A product team at a tech company has been working on a new feature for six months. In a meeting, the team lead presents the latest user data. The data is mixed: some metrics are encouraging, others suggest the feature isn’t solving the problem.

Watch what happens in that meeting. Team members who’ve invested months of work unconsciously emphasise the positive data. They rationalise the negative signals. Nobody wants to be the person who says the project might need rethinking. The meeting ends with apparent consensus to continue.

Here confirmation bias and the sunk cost fallacy amplify each other. The team has invested too much to stop and filters information to justify pressing on. Three months later the feature launches. Disappointing adoption. Quietly discontinued.

The most powerful intervention I know for this problem is the pre-mortem. Ask your team before a major decision: “Imagine it’s six months from now and this project has failed. What went wrong?” This reframes the question from “should we continue?” to “what could go wrong?” And it gives people social permission to voice doubts without being seen as disloyal.

Google’s Project Aristotle found that psychological safety - the feeling that you can take risks without being penalised - is the single most important characteristic of high-performing teams. A pre-mortem creates exactly that: a safe structure for putting counter-evidence on the table.

Why “awareness” is never the answer

You hear it all the time: “We need to make people aware of their biases.” It sounds logical. But it doesn’t work. And the Influence Framework shows exactly why.

When you analyse confirmation bias at work using the Influence Framework, you notice something striking: the blocking forces are far stronger than the driving ones.

Pains (what pushes you away from current behaviour): bad decisions get repeated, strategies fail without anyone understanding why. These are real costs. But they stay invisible - precisely because the bias prevents people from seeing them.

Gains (what pulls you toward new behaviour): better decisions, faster error correction, stronger teams. Measurable and valuable. But abstract and future-tense.

Comforts (what keeps you in current behaviour): and this is where it gets interesting. Confirmation bias feels good. It gives you certainty. Seeking information that confirms what you already believe is cognitively efficient - it literally costs less mental energy than seriously considering alternatives. This comfort is exactly why awareness alone doesn’t solve the problem.

Anxieties (what stops you from changing): challenging your own assumptions feels threatening. What if you’re wrong? What if the strategy you’ve been defending doesn’t hold up? The fear of being wrong keeps people locked in confirmation-seeking behaviour. Especially when their professional reputation is tied to the outcome.

The comfort of certainty and the fear of being wrong are together powerful enough to override any rational argument about “making better decisions.” This is exactly the same mechanism as in any other behaviour change challenge: you can’t think your way out of it. The bias operates below conscious thought.[4]

[Image: The SUE Influence Framework applied to confirmation bias at work]
The SUE Influence Framework™ makes visible the four forces - Pains, Gains, Comforts and Anxieties - that determine why people stay in their current behaviour, and what stops them from changing.

The answer is therefore always the same: change the environment, not the person. Design decision-making processes that structurally reduce the space in which confirmation bias can operate. Structured interviews. Pre-defined evaluation criteria. Pre-mortems. Devil’s advocates. Blind reviews.

These are environmental interventions, not mindset interventions. Just as changing the default option on a form is more powerful than informing people about their choices.

How confirmation bias interacts with other biases

Confirmation bias rarely operates alone. In the workplace it combines and amplifies other biases in ways that make it especially difficult to diagnose.

The sunk cost fallacy is the most common partner in crime. When you've invested heavily in a decision, confirmation bias filters information to justify continuing that investment. Your entire team is convinced the project is going well - not because the data shows it, but because everyone has invested too much to want to see otherwise.

The Dunning-Kruger effect makes it worse: overestimating your own judgement makes you less inclined to look for counter-evidence. Why would you? You’ve already figured it out.

Groupthink is the group-level effect. Individual confirmation bias creates an apparent consensus, which at team level triggers the suppression of dissenting views. Everyone thinks everyone agrees. In reality, nobody has bothered to look critically.

And loss aversion amplifies everything: the fear of losing what you have - your position, your being right, your investment - makes you cling to beliefs that offer confirmation.

Understanding this isn’t academic. It’s practical. Because interventions that only address confirmation bias often fail. A structured interview process combined with pre-defined scoring criteria and blind CV reviews creates a system of interlocking safeguards. That’s what works.

Frequently asked questions

What is a concrete example of confirmation bias at work?

A hiring manager who forms a positive impression of a candidate in the first 30 seconds unconsciously steers the rest of the conversation to confirm that impression. Questions become softer, ambiguous answers are interpreted favourably, and red flags are dismissed. Research shows most interviewers make their decision within four minutes. The remaining time is spent confirming, not evaluating.

How does confirmation bias affect teams?

Confirmation bias causes teams to confuse agreement with alignment. When a team leader presents a strategy, team members unconsciously filter information to support it. They emphasise data that confirms the plan and downplay risks. The team feels aligned, but has really just avoided the discomfort of disagreement.

Can you eliminate confirmation bias?

No. Confirmation bias is a System 1 shortcut you can’t train away. Awareness alone doesn’t prevent it - even people who know about the bias fall for it. The solution is to design decision-making environments that structurally reduce its influence: blind reviews, structured interviews, pre-defined criteria and appointed devil’s advocates.

What is the difference between confirmation bias and groupthink?

Confirmation bias operates at the individual level - one person filtering information to align with existing beliefs. Groupthink operates at the group level - a team suppressing dissenting opinions to preserve harmony. They reinforce each other: individual confirmation bias creates an illusion of consensus, which triggers groupthink dynamics that further suppress counter-evidence.

How do I recognise confirmation bias in my own decisions?

Ask yourself three questions: (1) Am I only looking at evidence that supports my preferred outcome? (2) Would I evaluate this evidence differently if it supported the opposite conclusion? (3) Have I actively looked for information that proves me wrong? If the answer to question 3 is no, confirmation bias is probably influencing your decision.

Conclusion

Want to learn how to structurally improve decision-making in your organisation? In the Behavioural Design Fundamentals Course you learn to apply the Influence Framework and the SWAC Tool to diagnose and overcome cognitive biases. Rated 9.7/10 by 5,000+ alumni from 45 countries.

PS

At SUE our mission is to use the superpower of behavioural psychology to help people make positive choices. Confirmation bias is perhaps the most treacherous enemy of good decisions - precisely because it makes you feel like you’re getting everything right. The first step is accepting that you’re not immune to it. The second step is to stop trying to become immune, and start designing environments that protect you from yourself.