Bias in psychology refers to systematic patterns of deviation in judgment and decision-making that cause people to think, perceive, or act in predictably skewed ways. These aren’t random errors or occasional lapses in logic. They are built-in tendencies of the human brain, shaped by evolution and reinforced by experience, that distort how we process information. Psychologists have identified well over 100 distinct biases, and they influence everything from snap judgments about strangers to life-altering medical decisions.
Why the Brain Is Biased by Design
Your brain processes an enormous amount of information every second, and it simply cannot weigh all of it carefully and equally. To cope, it relies on mental shortcuts called heuristics: quick, efficient rules of thumb that usually get you a good-enough answer without requiring deep analysis. You selectively focus on information that resonates with what you already know, you build broad conclusions from a small number of past observations, and you lean on whatever comes to mind most easily when estimating how likely something is. These shortcuts are highly economical and usually effective, but they lead to systematic and predictable errors.
The foundational research on this came from psychologists Daniel Kahneman and Amos Tversky, who published their landmark paper in the journal Science in 1974. They identified three core heuristics: representativeness (judging probability by how closely something resembles a category), availability (estimating frequency based on how easily examples come to mind), and anchoring (relying too heavily on the first number or piece of information you encounter). That framework launched decades of research into how these mental shortcuts shape human behavior in courtrooms, hospitals, financial markets, and everyday life.
The Evolutionary Logic Behind Bias
It might seem like biased thinking is a flaw, but from an evolutionary standpoint, many biases exist because they helped our ancestors survive. A useful way to understand this is “error management theory,” which holds that when two types of mistakes have unequal consequences, the brain evolves to favor the less costly one. The classic example: mistaking a stick for a snake is harmless, but mistaking a snake for a stick can be deadly. So brains that erred on the side of seeing threats, even when none existed, survived to pass on their genes.
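The cost asymmetry at the heart of error management theory can be sketched as a tiny expected-cost calculation. The snake/stick costs below are illustrative numbers, not figures from the research:

```python
# Error management theory as a toy expected-cost model.
# The two mistakes have very unequal costs (illustrative numbers):
COST_FALSE_ALARM = 1      # treating a stick as a snake: a wasted startle
COST_MISS = 1000          # treating a snake as a stick: possibly fatal

def expected_cost(p_snake: float, assume_snake: bool) -> float:
    """Expected cost of a fixed policy, given the probability a snake is present."""
    if assume_snake:
        # We only pay the false-alarm cost when there is no snake.
        return (1 - p_snake) * COST_FALSE_ALARM
    # We only pay the miss cost when there is a snake.
    return p_snake * COST_MISS

# Even when a snake is very unlikely (1%), assuming "snake" is the cheaper policy:
p = 0.01
print(expected_cost(p, assume_snake=True))   # 0.99
print(expected_cost(p, assume_snake=False))  # 10.0
```

Under these assumed costs, the jumpy policy wins whenever the miss is more than about a hundred times costlier than the false alarm, which is the formal sense in which "seeing threats that aren't there" can be the rational design.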
This logic extends across many domains. Being overly cautious about unfamiliar food, overestimating the hostility of strangers, or assuming the worst about ambiguous situations were all biases that carried a survival advantage in unpredictable environments. The problem is that these same tendencies now operate in modern contexts where they can cause real harm, from stereotyping people to making poor financial choices.
Common Types of Cognitive Bias
Cognitive biases are the most widely studied category. These are the mental distortions that affect how you gather, interpret, and remember information. A few of the most influential ones:
- Confirmation bias: The tendency to seek out, favor, and remember information that supports what you already believe while ignoring or discounting evidence that contradicts it. This is one of the most powerful and pervasive biases in human thinking.
- Anchoring bias: The tendency to rely too heavily on the first piece of information you receive. If you see a jacket priced at $500 before seeing one at $200, the second jacket feels like a bargain, even if it’s overpriced on its own terms.
- Availability heuristic: Judging how common or likely something is based on how easily you can think of examples. After seeing news coverage of plane crashes, people tend to overestimate the danger of flying, even though driving is statistically far more dangerous.
- Bandwagon effect: The pull to adopt beliefs or behaviors as they become more popular, independent of the underlying evidence.
- Authority bias: The tendency to trust information from an authority figure without critically evaluating it, simply because of their status.
- Attentional bias: The tendency to focus disproportionately on certain stimuli while ignoring others, often driven by emotional state. Someone with anxiety, for example, tends to notice potential threats more than a non-anxious person would.
These biases don’t operate in isolation. In any real-world decision, several biases can stack on top of each other, compounding the distortion. A doctor who anchors on a patient’s initial complaint, then selectively seeks confirming evidence for that first impression, can end up with a confident but wrong diagnosis.
Implicit vs. Explicit Bias
An important distinction in psychology is between biases you’re aware of and biases you’re not. Explicit biases are attitudes and beliefs you consciously hold and can reflect on. If someone openly states they distrust a particular group of people, that’s an explicit bias. Implicit biases, by contrast, are automatically activated and operate below conscious awareness. You may genuinely believe you treat everyone equally while still harboring unconscious associations that influence your behavior.
Research using brain imaging has shown that these two types of bias involve different neural pathways. Implicit processing of social stimuli activates the amygdala, a brain region involved in rapid, automatic emotional evaluation. When white participants in studies were shown faces of Black individuals for extremely brief durations (too fast for conscious processing), their amygdala showed greater activation compared to when they viewed white faces. Explicit, deliberate evaluation, on the other hand, engages regions of the prefrontal cortex associated with executive control and conflict detection. This means your brain has the hardware to override automatic biases, but doing so requires conscious effort.
In some situations, implicit attitudes predict behavior better than explicitly held beliefs. Someone who scores low on a questionnaire measuring prejudice may still behave differently toward people of different races in unstructured social interactions, because their automatic associations leak through when they’re not actively monitoring themselves.
In-Group Bias and Social Identity
Social psychology has identified a separate family of biases tied to group identity. Humans rapidly categorize other people into “us” and “them” based on shared characteristics like race, language, religion, or even arbitrary group assignments. This social categorization is so automatic that simply being randomly assigned to a novel group is enough to trigger in-group favoritism in both children and adults.
The consequences of this are far-reaching. People tend to have more positive associations with their own group, remember more favorable details about in-group members, and extend more trust and cooperation to them. At the same time, they show greater vigilance toward out-group members. Young children, for example, generalize socially unacceptable behaviors to out-group members more readily and show better memory for socially threatening stimuli. This isn’t something people learn from a single bad experience. It appears to be a deeply rooted social mechanism that originally served a threat-monitoring function.
How Bias Affects Mental Health Care
Bias doesn’t stay contained in abstract thinking exercises. It has measurable consequences in clinical settings, particularly in psychiatric diagnosis and treatment. For more than two decades, research has documented that African Americans are diagnosed with schizophrenia at higher-than-expected rates and with mood disorders at lower-than-expected rates. This pattern persists even when symptom profiles are similar across racial groups, pointing to diagnostic bias rather than true differences in illness prevalence.
The effects cascade into treatment. Studies have found that African Americans and Latinos are less likely than white patients to receive guideline-based care for anxiety and depression, less likely to be prescribed newer medications with fewer side effects, and less likely to receive follow-up visits after psychiatric hospitalization. In some settings, African American patients in emergency and inpatient psychiatric care were prescribed higher doses of antipsychotic medications than other patients. These disparities reflect both individual clinician biases and systemic patterns that compound over time.
Reducing the Impact of Bias
The fact that biases are deeply rooted doesn’t mean they’re unchangeable. Research on “debiasing” has identified several strategies that genuinely reduce biased thinking, though none of them work effortlessly.
The most broadly supported approach is metacognition: deliberately stepping back from your initial gut reaction and engaging in slower, more analytical reasoning. This means learning to notice when you’re making a snap judgment and consciously choosing to evaluate the evidence more carefully before committing to it. It’s a skill that improves with practice, not a one-time fix.
A second technique is called “consider the opposite.” When you’ve formed an initial impression, you deliberately look for evidence that would support the opposite conclusion. Experimental studies have shown this strategy counteracts several types of bias, including biased judgments of personality traits. It works because it forces you to engage with information you would otherwise ignore.
Structured information gathering also helps. Instead of relying on whatever details happen to stand out, using checklists or standardized procedures ensures you consider less obvious but potentially important data. This is particularly valuable in medical and hiring contexts, where first impressions can dominate the entire evaluation. Training in basic statistical and inferential reasoning has also been shown to reduce specific errors, like ignoring base rates when estimating probability.
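The base-rate error mentioned above can be made concrete with Bayes' rule. The test figures here are hypothetical, chosen only to show how a rare condition drags down the meaning of a positive result:

```python
# Base-rate neglect, illustrated with Bayes' rule (hypothetical numbers):
# a test that is 90% sensitive and 90% specific for a condition affecting 1% of people.
def posterior(prior: float, sensitivity: float, specificity: float) -> float:
    """P(condition | positive test), via Bayes' rule."""
    true_pos = prior * sensitivity            # has condition AND tests positive
    false_pos = (1 - prior) * (1 - specificity)  # no condition but tests positive
    return true_pos / (true_pos + false_pos)

p = posterior(prior=0.01, sensitivity=0.90, specificity=0.90)
# Intuition says "about 90%", but the low base rate means most positives are false:
print(round(p, 3))  # 0.083
```

A positive result here implies only about an 8% chance of actually having the condition, which is exactly the kind of answer that feels wrong until the base rate is made explicit.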
None of these strategies eliminate bias entirely. But they shift your thinking from a purely automatic mode into a more deliberate one, and that shift is where most of the improvement happens. The first step is simply knowing that these biases exist and recognizing the situations where they’re most likely to distort your judgment.