What Is a Threshold in Psychology? Types & Examples

A threshold in psychology is the point at which a stimulus becomes strong enough to register in your awareness. It’s the dividing line between what you can detect and what you can’t. The concept shows up most often in the study of sensation and perception, where researchers measure exactly how much of a signal (light, sound, pressure, taste) your nervous system needs before it “counts” as something you noticed. But thresholds also operate at the biological level, in individual nerve cells, and they shift depending on your environment, attention, and experience.

Absolute Threshold

The absolute threshold is the minimum intensity of a stimulus you can detect 50% of the time. That 50% benchmark matters because detection isn’t binary. A very faint sound doesn’t go from completely inaudible to perfectly clear at one precise volume. Instead, there’s a fuzzy zone where sometimes you hear it and sometimes you don’t. Psychologists define the threshold as the midpoint of that zone.
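The 50% definition can be made concrete with a small sketch. The intensities and detection rates below are made-up numbers for illustration (not real psychophysical data); the threshold is simply the point where an interpolated detection curve crosses 50%:

```python
def threshold_50(intensities, detection_rates):
    """Linearly interpolate the intensity detected 50% of the time."""
    points = list(zip(intensities, detection_rates))
    for (x0, p0), (x1, p1) in zip(points, points[1:]):
        if p0 <= 0.5 <= p1:
            # Interpolate between the two intensities bracketing 50%
            return x0 + (0.5 - p0) * (x1 - x0) / (p1 - p0)
    raise ValueError("50% point not bracketed by the data")

# Detection rises gradually through a fuzzy zone rather than jumping
# from 0% to 100% at one precise intensity:
print(threshold_50([1, 2, 3, 4, 5], [0.05, 0.20, 0.45, 0.75, 0.95]))
```

The threshold lands between the intensities where detection was 45% and 75%, illustrating that it is a statistical midpoint, not a hard physical boundary.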

Some classic demonstrations give a sense of how sensitive human perception can be. On a clear, dark night, the light-sensitive cells at the back of your eye can detect a candle flame from 30 miles away. In a quiet room, your ear can pick up the tick of a clock from 20 feet away. These are idealized examples, but they illustrate that absolute thresholds can be remarkably low. Every sense has its own absolute threshold: the faintest smell you can detect, the lightest touch you can feel, the weakest taste you can identify.

Difference Threshold

While the absolute threshold is about detecting whether something is there at all, the difference threshold is about detecting a change. Also called the “just noticeable difference,” it’s the smallest change in a stimulus you can notice 50% of the time. If you’re holding a 10-pound weight and someone adds a tiny amount, the difference threshold is the minimum added weight you’d reliably feel.

A key principle here is that the just noticeable difference isn’t a fixed amount. It scales with the intensity of the original stimulus. This pattern is known as Weber’s Law: the change you need to notice is a constant fraction of the starting intensity. For weight, that fraction is roughly 1/50, meaning you’d need about a 2% increase to feel the difference. For brightness, the fraction is closer to 1/60. For pitch, it’s much finer, around 1/333. The practical takeaway is that it’s harder to notice a small change when the overall stimulus is already intense. Adding one candle to a dark room is obvious; adding one candle to a room with 100 candles burning is invisible.
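Weber's Law is simple enough to express directly. This sketch uses the classic textbook fractions quoted above; real fractions vary somewhat across studies and intensity ranges:

```python
# Classic Weber fractions from the text: the just noticeable
# difference (JND) is this constant fraction of the baseline.
WEBER_FRACTIONS = {
    "weight": 1 / 50,     # ~2% change needed
    "brightness": 1 / 60,
    "pitch": 1 / 333,
}

def jnd(baseline, modality):
    """Smallest change in a stimulus noticeable 50% of the time."""
    return baseline * WEBER_FRACTIONS[modality]

# The heavier the starting weight, the larger the change required:
print(jnd(10, "weight"))   # 0.2 lb added to a 10-lb weight
print(jnd(100, "weight"))  # 2.0 lb added to a 100-lb weight
```

The same 0.2-pound addition that is reliably noticeable on a 10-pound weight falls well below threshold on a 100-pound weight, which is the candle-in-a-bright-room effect in numeric form.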

Terminal Threshold

At the other end of the scale is the terminal threshold, which is the point at which increasing a stimulus no longer produces any increase in perceived intensity. Your sensory system essentially maxes out. A sound at this level doesn’t get “louder” even if you crank up the physical energy. In practice, stimuli near the terminal threshold are often painful or damaging, which is why this concept comes up less frequently in everyday psychology than absolute or difference thresholds.

How Thresholds Work in Nerve Cells

Thresholds aren’t just a perceptual concept. They have a physical basis in your neurons. Every nerve cell sits at a resting electrical charge of about negative 70 millivolts, measured across the cell membrane. When a stimulus arrives, it pushes that charge upward, toward zero. If the charge reaches roughly negative 55 millivolts, the neuron fires an electrical impulse called an action potential. Below that voltage, nothing happens. Firing is an all-or-nothing event: a neuron either fires at full strength or it doesn’t, and the threshold is the tipping point.

This neural threshold helps explain why perception has a threshold at all. A stimulus that’s too weak to push enough neurons past their firing point simply doesn’t generate a signal strong enough for your brain to register. The absolute threshold you experience at a conscious level is the cumulative result of millions of individual neurons either reaching or failing to reach their own electrical thresholds.
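The all-or-nothing rule can be sketched in a few lines. The negative 55 millivolt threshold comes from the text; the negative 70 millivolt resting potential is an assumed typical value:

```python
RESTING_MV = -70.0     # typical resting potential (assumed value)
THRESHOLD_MV = -55.0   # approximate firing threshold from the text

def neuron_fires(depolarization_mv):
    """A stimulus pushes the membrane potential upward; the neuron
    fires only if the potential reaches the threshold."""
    return RESTING_MV + depolarization_mv >= THRESHOLD_MV

print(neuron_fires(10.0))  # False: -60 mV, still below threshold
print(neuron_fires(20.0))  # True: -50 mV, past the tipping point
```

Note there is no "partial fire" in between: a 14-millivolt push produces nothing, while a 15-millivolt push produces a full action potential.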

Subliminal Perception

Stimuli that fall below your absolute threshold are called subliminal, meaning they’re too faint for you to consciously detect. The question of whether these stimuli can still influence your behavior has been debated for decades. Some studies have found evidence of subliminal priming, where a word or image flashed too quickly to see consciously still affects how fast you respond to a related word moments later. In controlled experiments, people responded faster on tasks when subliminal cues matched what they were looking for, even though they couldn’t report seeing anything.

The findings are genuinely mixed, though. Some results have been difficult to replicate, and methodological criticisms are common. The current picture is that subliminal stimuli can produce small, measurable effects on simple tasks like reaction time, but claims about subliminal messages controlling complex decisions or preferences go well beyond what the evidence supports.

Why Thresholds Shift

Your thresholds aren’t fixed numbers. They move around based on several factors, and understanding why helps explain everyday experiences that might otherwise seem strange.

Sensory adaptation is the most common reason. When you’re exposed to a constant stimulus, your nervous system recalibrates. Walk into a room with a strong smell and you’ll stop noticing it within minutes. Your threshold for that odor has effectively risen. Research on visual adaptation shows this involves real changes in how your sensory neurons encode information: their sensitivity shifts toward whatever you’ve been exposed to, so that the “new normal” becomes your baseline. This recalibration happens at both the sensory level (neurons literally become less responsive) and the decision level (your brain adjusts its internal criteria for what counts as a meaningful signal).

Attention and expectation also play a role. If you’re actively listening for a sound, your effective threshold drops. You’ll detect fainter signals when you’re primed to look for them. This is partly why a parent can hear their baby stir from another room while sleeping through louder, less relevant noises.

Age raises most sensory thresholds over time. Hearing thresholds climb, particularly for high-frequency sounds. Visual thresholds increase as the lens of the eye yellows and clouds, letting less light reach the retina, and as the light-sensitive cells become less responsive. Touch sensitivity in the fingertips declines. These changes are gradual enough that most people don’t notice them until the shift becomes significant.

Fatigue, medication, and emotional state can all nudge thresholds in either direction. Pain thresholds, for instance, are notably sensitive to mood, stress, and sleep quality.

Signal Detection Theory

Classical threshold theory treats detection as a clean line: either a stimulus is above threshold or it isn’t. Signal detection theory, developed in the mid-20th century, offers a more realistic model. It recognizes that your sensory system always operates against a background of neural noise, random electrical activity that’s present even when no stimulus exists. Detecting a real signal means distinguishing it from that noise, which is fundamentally a statistical problem.

In this framework, detection depends on two independent factors. The first is your sensitivity, which is how well your sensory system can separate signal from noise. This is measured by a value called d-prime: the larger the gap between the “noise only” pattern and the “signal plus noise” pattern in your nervous system, the easier the detection. The second factor is your criterion, a personal decision rule about how much evidence you require before you’ll say “yes, I detected something.” A cautious person sets a high criterion and misses more real signals but rarely reports false ones. A liberal responder catches more real signals but also reports things that aren’t there.
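The standard way to compute these two quantities, assuming the usual equal-variance Gaussian model of noise and signal-plus-noise, is to z-transform the hit rate and false-alarm rate. The 84%/16% numbers below are an illustrative example, not data from a specific study:

```python
from statistics import NormalDist

def sdt_measures(hit_rate, false_alarm_rate):
    """Return (d_prime, criterion) under equal-variance Gaussian SDT."""
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    d_prime = z(hit_rate) - z(false_alarm_rate)       # sensitivity
    criterion = -0.5 * (z(hit_rate) + z(false_alarm_rate))  # bias
    return d_prime, criterion

# An observer with 84% hits and 16% false alarms: good sensitivity
# (d' near 2) and no response bias (criterion near 0).
d, c = sdt_measures(0.84, 0.16)
print(round(d, 2), round(c, 2))
```

A cautious responder would shift both rates downward together, which changes the criterion while leaving d-prime roughly constant; that is exactly the separation a raw threshold measurement cannot make.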

This distinction matters because what looks like a change in sensitivity might actually be a change in decision-making. A radiologist who just missed a tumor on a scan may temporarily lower their criterion, flagging more ambiguous spots on the next set of images. Their eyes haven’t gotten sharper; their willingness to say “that could be something” has changed. Signal detection theory gives researchers a way to separate these two influences, which a simple threshold measurement cannot do.

Fechner’s Law and Perceived Intensity

Once a stimulus is above threshold, how does your perception of its intensity relate to its actual physical intensity? The relationship isn’t linear. Gustav Fechner proposed in 1860 that perceived intensity follows a logarithmic pattern: doubling the physical energy of a sound doesn’t make it sound twice as loud. Instead, you need to multiply the stimulus by a constant factor each time to get the same perceived increase. This is why decibel scales for sound and magnitude scales for earthquakes use logarithmic units. They track how perception actually works more closely than a simple linear scale would.
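The decibel scale is a concrete instance of this logarithmic compression. Sound intensity level is defined as 10 times the base-10 logarithm of the ratio between a sound's intensity and a reference intensity (the standard reference, 10⁻¹² watts per square meter, sits near the absolute threshold of hearing):

```python
import math

def sound_level_db(intensity_w_per_m2, reference=1e-12):
    """Sound intensity level in decibels relative to the standard
    reference intensity near the threshold of hearing."""
    return 10 * math.log10(intensity_w_per_m2 / reference)

# Doubling the physical energy adds a fixed step of about 3 dB,
# rather than doubling the level:
print(sound_level_db(1e-12))                          # 0.0 dB
print(sound_level_db(2e-12) - sound_level_db(1e-12))  # ~3.01 dB
print(sound_level_db(4e-12) - sound_level_db(2e-12))  # ~3.01 dB again
```

Each doubling of physical energy adds the same fixed step on the perceptual scale, which is the multiplicative pattern Fechner described.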

This logarithmic relationship is a direct mathematical consequence of Weber’s Law. If the just noticeable difference is always a constant proportion of the current stimulus, then stacking those tiny perceptual steps on top of each other produces a logarithmic curve. The practical result is that your senses are compressed: extremely sensitive to small changes at low intensities, but increasingly blunt as intensity climbs. This compression lets you function across an enormous range of physical inputs, from a whisper to a rock concert, without your sensory system being overwhelmed.