Confirmation bias can kick in whenever you encounter information that touches on something you already believe. It isn’t limited to a single moment or setting. It operates across three distinct phases of thinking: when you search for information, when you interpret what you find, and when you decide what to remember. Any time you hold an existing opinion, preference, or expectation, you’re susceptible to filtering new information through that lens.
The Three Stages Where It Happens
Confirmation bias isn’t a single event. It’s a chain of filtering that happens across three sub-processes, each reinforcing the others.
The first is selective exposure: you gravitate toward information that aligns with what you already think and avoid sources that challenge it. This is the stage where you choose which news outlets to read, which people to follow, or which search results to click. The second is selective perception: when you do encounter contradictory information, you either don’t register it or you reinterpret it to fit your existing view. A sports fan watching a referee’s call will genuinely “see” a different play depending on which team they support. The third is selective retention: you simply forget information that conflicts with your beliefs more readily than information that supports them. Over time, your memory of events and evidence skews toward what already made sense to you.
These three stages work together to create a self-reinforcing loop. You seek confirming evidence, reinterpret what doesn’t fit, and forget the rest.
When Emotions and Identity Are Involved
Confirmation bias intensifies when a belief is tied to your identity or values. Political opinions, religious convictions, and views on polarizing social issues are especially resistant to correction because challenging them feels like an attack on who you are, not just what you think.
This is where the so-called backfire effect can occur. When someone encounters a well-sourced correction to a deeply held belief, they sometimes end up believing the original misconception even more strongly. The psychological explanation is that people generate counter-arguments on the spot to defend their worldview, and that mental effort actually reinforces the original position. This backfire effect is most likely to show up with “hot-button” issues where the belief is central to a person’s identity or ideology. Correcting a factual error about a neutral topic is far less likely to trigger it than correcting a politically charged claim.
What Happens in the Brain
Brain imaging studies show that encountering belief-threatening information triggers a measurable conflict response. When people see contradictory statements from a political figure they support, areas involved in conflict resolution and negative emotion light up in ways they don’t when the same contradiction comes from an opposing figure. Your brain literally works harder to process information that challenges your “team” than information that challenges someone else’s.
The prefrontal cortex, the part of the brain responsible for deliberate reasoning, can override initial emotional reactions, but only when people have enough time and motivation to engage it. Quick, gut-level responses tend to be more biased. Slower, reflective thinking offers a chance to catch the bias, though it doesn’t guarantee it.
During Medical Diagnoses
Confirmation bias is a well-documented source of diagnostic error in medicine. Once a clinician forms an initial impression of what’s wrong, they tend to seek and prioritize information that supports that impression rather than considering alternatives. This is closely related to a pattern called premature closure: arriving at a diagnosis too early without fully exploring other possibilities.
A clinical teaching example illustrates this clearly. A patient arrives, and the initial examination finds no immediately life-threatening injury. The patient later becomes visibly anxious and mentions that he’s prone to anxiety attacks. That single piece of information anchors the medical team’s thinking. As the patient’s condition deteriorates, each new sign gets interpreted through the “anxiety attack” framework rather than prompting investigation of more serious causes. The patient dies. The team wasn’t negligent in the traditional sense; their thinking was hijacked by a plausible early explanation that made all subsequent information look confirmatory.
During Financial Decisions
Investors are particularly vulnerable to confirmation bias because money and ego are both on the line. Research on investor memory shows two specific patterns. First, people distort the outcomes of past trades, remembering winners as bigger wins and losers as smaller losses than they actually were. Second, people selectively forget losing trades altogether. In one study, participants forgot 39.7% of their losing trades but only 29.9% of their winning trades.
This selective memory feeds overconfidence. Investors who forget their losses trade more frequently, which typically leads to worse returns over time. The bias isn’t happening during any single trade. It’s reshaping the investor’s entire self-narrative about their skill level, making them believe they’re better at picking investments than their actual track record supports.
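To see how much that memory asymmetry alone can skew self-assessment, here is a toy simulation in Python. It is not drawn from the study itself; the only figures borrowed from the research above are the two forgetting rates, while the trade history, seed, and helper function are invented for illustration.

```python
import random

# Toy illustration (not from the cited study): how asymmetric forgetting
# inflates a trader's remembered win rate. The only figures taken from the
# research above are the forgetting rates: 39.7% of losses and 29.9% of
# wins are forgotten.
P_FORGET_LOSS = 0.397
P_FORGET_WIN = 0.299

def remembered_win_rate(trades, seed=0):
    """Return (actual, remembered) win rates for a list of 'win'/'loss' outcomes."""
    rng = random.Random(seed)
    remembered = [
        t for t in trades
        if rng.random() >= (P_FORGET_WIN if t == "win" else P_FORGET_LOSS)
    ]
    actual = trades.count("win") / len(trades)
    recalled = remembered.count("win") / len(remembered) if remembered else float("nan")
    return actual, recalled

# A perfectly average record: 50 wins, 50 losses.
history = ["win"] * 50 + ["loss"] * 50
actual, recalled = remembered_win_rate(history)
print(f"actual win rate:     {actual:.0%}")    # 50%
print(f"remembered win rate: {recalled:.0%}")  # on average a few points higher
```

Even with a dead-even record, the remembered win rate drifts a few points above reality on average, and that inflated self-narrative is exactly what overconfidence feeds on.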
In Criminal Investigations
Forensic science has its own term for confirmation bias in action: tunnel vision. Once investigators develop a suspect, they tend to interpret ambiguous evidence as supporting that theory while downplaying evidence that points elsewhere. This bias can enter at every stage, from initial evidence collection to expert analysis to courtroom testimony.
One of the most well-known cases involved Brandon Mayfield, an American attorney who was falsely linked to the 2004 Madrid train bombings through fingerprint analysis. FBI examiners matched a partial print to Mayfield, and once that initial identification was made, subsequent reviewers confirmed it. The match was wrong. Independent analysis by Spanish authorities identified a different person entirely. Investigations into the error found that cognitive bias, not technical incompetence, was the primary cause. The examiners saw what they expected to see in an ambiguous print.
Online and on Social Media
Search engines and social media platforms automate confirmation bias through personalization algorithms. When you search for something on Google, the results aren’t sorted purely by objective relevance. They’re filtered through your search history, location, social network, and past clicking behavior. When you click on results that match your existing views (as most people do), you signal back to the algorithm that those results were relevant, which strengthens the filter for next time.
This creates what researcher Eli Pariser called the “filter bubble.” Your information environment gradually narrows around your existing beliefs without you realizing it. The effect compounds in social networks, where people naturally join groups of like-minded individuals. Health misinformation is a particular concern: once someone begins gravitating toward a cluster of inaccurate health claims, the algorithm feeds them more of the same, and each new piece of confirming content makes the false belief feel more validated. Breaking out of that cycle is difficult because the algorithmic reinforcement mimics the feeling of doing thorough research.
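The reinforcement dynamic is easier to see stripped down to a few lines of code. The sketch below is a deliberate oversimplification rather than a description of any real search or recommendation engine; the source names, preference level, and boost size are all assumptions made for illustration. A user with only a mild lean toward agreeable results ends up, via the click-feedback loop, with a personalization weight tilted heavily toward them.

```python
import random

# Deliberately simplified sketch of the click-feedback loop described above
# (all numbers are assumptions, not measurements): the user slightly prefers
# results that agree with them, each click is fed back as a relevance signal,
# and the personalization weight gradually crowds out challenging sources.
random.seed(1)

weights = {"agrees_with_me": 1.0, "challenges_me": 1.0}
USER_PREFERENCE = 0.6   # baseline chance of clicking the agreeable result
BOOST = 0.25            # how much each click raises a source's weight

def click_weight(source):
    """Combine the user's mild preference with the algorithm's current weight."""
    base = USER_PREFERENCE if source == "agrees_with_me" else 1 - USER_PREFERENCE
    return base * weights[source]

for _ in range(100):
    # Pick which source gets this round's click, weighted by preference x filter.
    total = click_weight("agrees_with_me") + click_weight("challenges_me")
    clicked = ("agrees_with_me"
               if random.random() * total < click_weight("agrees_with_me")
               else "challenges_me")
    # The click is read as a relevance signal and strengthens the filter.
    weights[clicked] += BOOST

share = weights["agrees_with_me"] / sum(weights.values())
print(f"weight on agreeable sources after 100 clicks: {share:.0%}")
# Typically lands well above the user's 60% baseline preference.
```

The point of the sketch is the loop itself: a mild lean produces clicks, clicks are read as relevance, and the strengthened weight makes the next agreeable result even harder to avoid.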
How to Counteract It
The most studied debiasing technique is straightforward: actively consider the opposite. Before settling on an interpretation or decision, deliberately search for information that would prove you wrong. This forces your brain into the kind of reflective processing that can override the automatic filtering.
In professional settings like medicine and forensics, a related strategy is to review evidence without knowing the expected outcome. When a forensic examiner analyzes a fingerprint without being told that investigators already have a suspect, their analysis is less likely to be skewed. The same principle applies to medical second opinions: a fresh set of eyes without the original framing can catch errors the first clinician’s confirmation bias locked in.
For everyday decisions, the practical version is simpler. When you feel certain about something, especially something tied to your identity or emotions, treat that certainty as a signal to slow down and look for disconfirming evidence. The bias is strongest precisely when you feel most confident you’re being objective.

