Critical thinking in psychology is the deliberate process of evaluating evidence, questioning assumptions, and reasoning through problems rather than relying on gut reactions or surface-level explanations. It sits at the core of psychology as both a scientific discipline and a clinical practice, shaping how researchers design studies, how clinicians form diagnoses, and how students learn to separate credible findings from pseudoscience. The American Psychological Association lists scientific inquiry and critical thinking as one of its core learning goals for every psychology major.
Two Systems of Thinking
Psychology frames critical thinking partly through dual process theory, which distinguishes between two modes of thought. The first, often called System 1, is fast, automatic, and intuitive. It generates snap judgments, pattern recognition, and gut feelings without conscious effort. When your brain predicts what will happen next based on past experience, that’s System 1 at work, and when those predictions are accurate, you barely need to think at all.
The second system, often called System 2, is slow, deliberate, and effortful. It kicks in when the fast system’s predictions fail or when a problem demands careful analysis. This is the system that evaluates evidence, follows logical steps, and overrides first impressions. Critical thinking lives here. It requires working memory, conscious attention, and the willingness to pause before accepting an easy answer. The relationship between these two systems matters because much of critical thinking involves recognizing when your fast, automatic judgments might be wrong and deliberately engaging the slower, more analytical process to check them.
The Skills It Involves
One of the oldest and most widely used measures of critical thinking, the Watson-Glaser Critical Thinking Appraisal, breaks the concept into five specific skills: inference, recognition of assumptions, deduction, interpretation, and evaluation of arguments. These map closely to what psychologists actually do when they think critically about a claim or a piece of evidence.
Inference means drawing reasonable conclusions from available information without overstepping what the data supports. Recognition of assumptions means identifying unstated beliefs that underlie an argument. Deduction means applying general principles to specific cases and checking whether conclusions logically follow. Interpretation means weighing evidence and deciding what it actually means, rather than what you want it to mean. Evaluation of arguments means judging whether a line of reasoning is strong or weak, relevant or irrelevant.
These aren’t abstract intellectual exercises. A clinical psychologist uses all five when deciding whether a patient’s symptoms fit one diagnosis over another. A researcher uses them when designing a study that can actually test a hypothesis rather than just confirm what they already believe.
Why Falsifiability Matters
One principle separates scientific psychology from pop psychology and self-help speculation: falsifiability. A claim counts as scientific only if it could, in theory, be proven wrong. “People who eat chocolate are happier” is falsifiable because you can design a study to test it. “Everything happens for a reason” is not, because no possible evidence could disprove it.
Psychology trains students to shift from a confirmation mindset to a falsification mindset. Instead of looking for evidence that supports a belief, critical thinkers look for evidence that could contradict it. Research on falsification heuristics shows that when students are taught to actively seek contradictory evidence, their critical thinking measurably improves. They become better at updating their views when the data doesn’t match their expectations, rather than dismissing inconvenient findings.
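The falsification mindset can be made concrete with a small simulation. The sketch below uses invented happiness scores (not real data) and a permutation test: it asks how often shuffling the group labels alone would produce a chocolate-versus-no-chocolate difference as large as the one observed, which is a direct attempt to let chance falsify the claim.

```python
import random
import statistics

# Invented happiness scores (0-10 scale), for illustration only.
chocolate = [7, 8, 6, 9, 7, 8, 7, 9]
no_chocolate = [5, 6, 7, 5, 6, 4, 6, 5]

observed = statistics.mean(chocolate) - statistics.mean(no_chocolate)

# Permutation test: if the group labels were meaningless, how often
# would a random relabeling produce a difference at least this large?
pooled = chocolate + no_chocolate
n = len(chocolate)
rng = random.Random(0)  # fixed seed so the sketch is reproducible

trials = 10_000
extreme = 0
for _ in range(trials):
    rng.shuffle(pooled)
    diff = statistics.mean(pooled[:n]) - statistics.mean(pooled[n:])
    if diff >= observed:
        extreme += 1

p_value = extreme / trials
# A tiny p_value means the claim survived a serious attempt to
# falsify it; a large one means chance alone explains the gap.
print(observed, p_value)
```

With these made-up numbers the difference survives the test; swap in data where the two groups overlap heavily and the p-value climbs, which is exactly the disconfirming outcome a falsification mindset goes looking for.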
Cognitive Biases That Get in the Way
Psychology has catalogued dozens of cognitive biases that interfere with clear reasoning. Understanding these biases is itself a core part of critical thinking, because you can’t correct errors you don’t recognize.
- Confirmation bias: the tendency to seek out, interpret, and remember information that confirms what you already believe while ignoring information that doesn’t.
- Availability bias: judging how likely or important something is based on how easily examples come to mind. Plane crashes feel more dangerous than car accidents because they’re more memorable, not more common.
- Anchoring bias: letting the first piece of information you encounter disproportionately influence your judgment, even when that information is irrelevant.
- Hindsight bias: the feeling, after learning an outcome, that you “knew it all along.”
- Authority bias: giving extra weight to someone’s opinion because of their status rather than the strength of their reasoning.
- Sunk cost fallacy: continuing down a path because of the time or effort already invested, even when the evidence says you should change course.
- Conformity bias: adjusting your thinking to match a group standard, even when the group is wrong.
These biases aren’t signs of stupidity. They’re built into normal human cognition, shortcuts the brain uses to process information quickly. The goal of critical thinking isn’t to eliminate them (you can’t) but to recognize when they’re operating and compensate for them.
The Role of Metacognition
Critical thinking depends on metacognition: the ability to think about your own thinking. Metacognition has two components. The first is metacognitive knowledge, which means understanding how your own mind works, what strategies are available to you, and when different approaches are appropriate. The second is metacognitive control, which involves actively planning, monitoring, and evaluating your own reasoning as it happens.
In practice, this looks like pausing mid-argument to ask yourself whether you’re actually following the evidence or just defending a position you’re emotionally attached to. It means planning how you’ll approach a problem before diving in, checking your progress along the way, and honestly evaluating whether your conclusion holds up. Research in higher education has found that students who are explicitly trained in metacognitive strategies show significant improvements in both their self-awareness about their own knowledge and their ability to plan and monitor their thinking.
One effective technique is thinking aloud, either alone or with a partner. Verbalizing your reasoning forces you to make your assumptions explicit, which makes them easier to examine and challenge. It turns the invisible process of thought into something you can inspect.
How It’s Used in Clinical Practice
In clinical settings, critical thinking drives the diagnostic process. A clinician’s first impression of a patient often comes from intuitive pattern recognition, noticing similarities between the current case and past cases. But that initial impression is just a starting point. The clinician then evaluates the evidence, considers competing explanations, and looks for information that might rule out their initial hypothesis.
This process has been described as a form of detective work. Clinicians track how an illness unfolds over time, what treatments have been tried, what worked and what didn’t, and how the patient’s responses fit into a broader picture. They adjust their understanding as new information arrives, a process called reasoning-in-transition. The goal is to avoid locking into a diagnosis too early (anchoring bias) or seeing only the evidence that supports it (confirmation bias).
Clinical training programs often assign students “sleuthing” tasks: looking for undetected drug interactions, questioning dosages, and catching signs that others may have missed. These exercises build the habit of questioning rather than accepting, which is the foundation of critical thinking in any context.
Personality Traits That Support It
Critical thinking isn’t just a set of skills. It also involves dispositions, personality traits that make someone more likely to think critically even when they’re not required to. Research on critical thinking dispositions has identified five key dimensions: systematic analyticity (approaching problems in an organized, thorough way), open-mindedness (willingness to consider perspectives different from your own), confidence in reasoning (trusting your ability to think through complex problems), reflective skepticism (a habit of questioning claims rather than accepting them at face value), and truth-seeking (genuine motivation to find the best answer, even if it’s uncomfortable).
Someone can have strong analytical skills but low truth-seeking, meaning they’re capable of rigorous reasoning but only apply it when the conclusion suits them. Psychology recognizes that building critical thinking means developing both the ability and the willingness to use it consistently.
What Psychology Students Are Expected to Learn
The APA’s 2023 guidelines for the undergraduate psychology major spell out four specific competencies under the critical thinking goal. Students should be able to exercise scientific reasoning to investigate psychological phenomena, interpret and evaluate psychological research, incorporate sociocultural factors into scientific practices, and use statistics to evaluate quantitative findings.
This means a psychology graduate should be able to read a study and identify its strengths and weaknesses, spot when a claim isn’t supported by the evidence presented, recognize how cultural context shapes both research questions and findings, and understand enough statistics to know whether a result is meaningful or likely due to chance. These aren’t specialist skills reserved for researchers. They’re the baseline the field considers necessary for anyone with a psychology degree to navigate a world full of competing claims about human behavior.
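The last of those competencies, judging whether a result is meaningful or likely due to chance, can be sketched with a short simulation. The numbers are hypothetical: suppose a study reports that 60 of 100 participants preferred one option over another, and we want to know how often blind 50/50 guessing would produce a result at least that lopsided.

```python
import random

# Hypothetical study result, for illustration only:
# 60 of 100 participants preferred option A. Is that above chance?
observed_successes = 60
n = 100

rng = random.Random(1)  # fixed seed so the sketch is reproducible
trials = 10_000
at_least_as_extreme = 0
for _ in range(trials):
    # Simulate chance-level responding: each choice is a fair coin flip.
    successes = sum(rng.random() < 0.5 for _ in range(n))
    if successes >= observed_successes:
        at_least_as_extreme += 1

p_value = at_least_as_extreme / trials
print(p_value)
```

If the simulated p-value is small, chance-level guessing rarely reaches the observed result, which supports reading it as meaningful; a large value would mean the headline finding is well within what guessing alone can produce.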