How the Brain Processes Information, From Signal to Memory

Your brain processes information by converting physical sensations, like light, sound, and pressure, into electrical and chemical signals that travel through networks of neurons at speeds ranging from 1 mile per hour to nearly 270 miles per hour. This process begins the instant a stimulus reaches your sensory organs and unfolds across multiple brain regions simultaneously. Despite weighing only about three pounds, the brain consumes 20 to 25 percent of your body’s total glucose supply to power this constant activity.

From Sensation to Electrical Signal

Every piece of information your brain processes starts as a physical event in the outside world: photons of light hitting your retina, sound waves vibrating tiny bones in your ear, or pressure triggering receptors in your skin. Specialized sensory cells convert these physical events into electrical impulses, a process called transduction. Once converted, those impulses travel along nerve fibers toward the brain.

Not all signals travel at the same speed. Large, insulated nerve fibers carrying touch and body-position information are the fastest, transmitting at 80 to 120 meters per second, roughly 180 to 270 miles per hour. Pain signals, by contrast, travel along smaller, uninsulated fibers at just 0.5 to 2 meters per second, which is why you feel the impact of stubbing your toe before the sharp pain registers a moment later.
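As a sanity check on those figures, the mile-per-hour numbers follow directly from the meter-per-second ones. A small conversion sketch (the helper function is just for illustration):

```python
# Convert nerve conduction velocities from meters per second to miles per hour.
# 1 mile = 1609.344 meters; 1 hour = 3600 seconds.
def ms_to_mph(meters_per_second: float) -> float:
    return meters_per_second * 3600 / 1609.344

# Fast, myelinated touch and body-position fibers: 80-120 m/s
print(round(ms_to_mph(80)))      # ~179 mph
print(round(ms_to_mph(120)))     # ~268 mph

# Slow, unmyelinated pain fibers: 0.5-2 m/s
print(round(ms_to_mph(0.5), 1))  # ~1.1 mph
```

The slowest pain fibers really do move at walking pace, which is why the delay between impact and sharp pain is long enough to notice.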

The Thalamus: The Brain’s Relay Hub

Nearly all sensory signals pass through the thalamus, a small structure deep in the center of the brain, before reaching the outer cortex where conscious processing happens. The thalamus acts as a switchboard, routing visual information to the visual cortex at the back of the head, auditory information to the auditory cortex on the sides, and so on. But it does more than just sort and forward. Certain thalamic regions, particularly one called the medial pulvinar nucleus, receive input from multiple senses at once, blending visual, auditory, and motor information before it even reaches the cortex.

This early blending helps explain why your senses feel unified rather than separate. You don’t experience a conversation as disconnected lip movements plus disconnected sounds; the thalamus begins stitching those streams together before your cortex completes the job.

How Neurons Talk to Each Other

Information moves between neurons through a precise sequence of chemical events at junctions called synapses. When an electrical signal reaches the end of a neuron, it triggers an influx of calcium ions into the nerve terminal. That calcium surge causes tiny packets of chemical messengers, called neurotransmitters, to be released into the narrow gap between the two neurons. The neighboring neuron has receptors that detect these molecules and, depending on the type of neurotransmitter and receptor, either fires off its own electrical signal or stays quiet.

After the message is delivered, the neurotransmitter is either broken down or pulled back into the original neuron so the synapse resets. This entire cycle, from electrical impulse to chemical release to electrical response, takes just a few milliseconds and repeats billions of times per second across the brain. About 70 percent of the brain’s energy goes specifically toward powering these signaling functions.
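The receiving neuron's "fire or stay quiet" decision can be caricatured as a threshold sum of excitatory and inhibitory inputs. This is a conceptual toy, not a physiological model; all numbers and names are illustrative:

```python
# Toy model of the postsynaptic decision: excitatory inputs push the membrane
# toward threshold, inhibitory inputs push it away. The weights and the
# threshold value here are arbitrary, chosen only to illustrate the idea.
def postsynaptic_fires(inputs, threshold=1.0):
    """inputs: list of (weight, sign) pairs; sign is +1 excitatory, -1 inhibitory.
    Returns True if the neuron fires its own signal, False if it stays quiet."""
    membrane_potential = sum(weight * sign for weight, sign in inputs)
    return membrane_potential >= threshold

# Two excitatory messages arriving together can trigger a spike...
print(postsynaptic_fires([(0.6, +1), (0.6, +1)]))             # True
# ...while an inhibitory input arriving at the same time keeps the neuron quiet.
print(postsynaptic_fires([(0.6, +1), (0.6, +1), (0.5, -1)]))  # False
```

The key point the sketch captures is that no single input decides the outcome; the neuron integrates everything arriving within a short window.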

Parallel and Serial Processing

Your brain doesn’t handle information in a single, orderly queue. It uses two complementary strategies. Parallel processing handles many streams of data simultaneously: when you look at a face, separate groups of neurons process its color, shape, motion, and identity all at the same time. Serial processing, by contrast, focuses on one element at a time, which is what happens when you carefully read a sentence word by word or search for a specific face in a crowd.

Research using recordings from neurons in the prefrontal cortex of primates has shown that both strategies operate in the same brain regions, sometimes within the same task. Neurons tend to process information in parallel immediately after a stimulus appears, then shift toward serial processing as the brain narrows its focus. This combination lets you quickly absorb a scene and then zero in on what matters most.
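The contrast between the two strategies can be sketched in a few lines. In this toy, "parallel" means every feature channel evaluates the same stimulus with no ordering between them, while "serial" means inspecting items one at a time until a target turns up; the feature names and crowd are made up for illustration:

```python
# Parallel: all feature channels process one stimulus at once (simulated here
# as independent lookups with no dependence on order).
stimulus = {"color": "red", "shape": "round", "motion": "still", "identity": "apple"}
parallel_results = {channel: value for channel, value in stimulus.items()}

# Serial: scan a crowd of faces one at a time, stopping when the target is found.
crowd = ["stranger", "stranger", "friend", "stranger"]
steps = 0
for face in crowd:
    steps += 1
    if face == "friend":
        break

print(len(parallel_results))  # 4 channels answered "simultaneously"
print(steps)                  # 3 sequential comparisons to find the target
```

The practical difference shows up in timing: the parallel pass costs the same no matter how many channels there are, while the serial search grows with the number of items examined, which matches how visual search slows down in larger crowds.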

Bottom-Up Versus Top-Down Processing

Information flows through the brain in two directions, and both matter. Bottom-up processing is driven entirely by incoming sensory data. When you touch a hot stove, pain receptors in your hand send signals to the brain, and you perceive heat and pain based purely on what’s happening right now. No prior knowledge or expectation is required.

Top-down processing works in the opposite direction. Your brain uses memories, expectations, and context to shape what you perceive. After burning your hand once, you instinctively pull back when you see a glowing burner, even before you feel any heat. You can read a sentence with jumbled letters because your brain predicts what the words should be based on experience. Top-down processing is also why you can hear your name in a noisy room: your brain is primed to detect it.

In practice, these two systems work together constantly. Raw sensory data flows up while expectations and predictions flow down, and your conscious experience is the product of both streams meeting in the middle.

The Brain as a Prediction Machine

One of the most influential ideas in modern neuroscience is that the brain doesn’t passively wait for information to arrive. Instead, it actively predicts what’s coming next. Under this framework, known as predictive coding, higher brain areas generate a best guess about incoming sensory input and send that prediction down to lower sensory areas. If the prediction matches reality, there’s little neural activity. If the prediction is wrong, the lower areas fire off an error signal that travels back up, updating the brain’s model of the world.

This has a striking consequence: when sensory input is predictable, early sensory areas actually become less active, not more. The neurons in those regions aren’t representing the full stimulus anymore. They’re only representing the mismatch between what was expected and what arrived. Brain imaging studies have confirmed this across different experimental contexts and at multiple levels of the visual system. Predictability genuinely reduces neural firing in sensory cortex.

This makes the brain remarkably efficient. Rather than building a complete picture of reality from scratch every moment, it maintains a running model and only updates the parts that change. It also explains why unexpected events, like a loud bang or a flash of movement, grab your attention so powerfully. They generate large error signals that demand the brain's resources.
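The core loop of predictive coding is simple enough to write down. In this minimal sketch, a higher area keeps a running prediction, lower areas transmit only the error (input minus prediction), and the prediction is nudged toward the input by that error; the learning rate is an arbitrary illustrative choice:

```python
# Minimal predictive-coding sketch. Only the mismatch (error) is sent upward;
# the internal model is updated by a fraction of that error each step.
def predictive_coding(inputs, learning_rate=0.5):
    prediction = 0.0
    errors = []
    for x in inputs:
        error = x - prediction               # mismatch signal sent back up
        errors.append(error)
        prediction += learning_rate * error  # update the brain's running model
    return errors

# A steady, predictable input: error signals shrink toward zero,
# so sensory areas have less and less to report.
print(predictive_coding([10, 10, 10, 10, 10]))
# [10.0, 5.0, 2.5, 1.25, 0.625]

# A sudden change at the end produces a large error spike, the kind of
# signal that grabs attention.
print(predictive_coding([10, 10, 10, 10, 30])[-1])
# 20.625
```

Note how the model never re-transmits the full stimulus once it is predictable; it only reports what changed, which is exactly the efficiency argument above.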

How Information Becomes Memory

Processing information in the moment is only part of the picture. For information to be stored long-term, the connections between neurons must physically change. This happens through a process called long-term potentiation, or LTP, which has been studied extensively in the hippocampus, a brain region central to forming new memories.

When two connected neurons fire together repeatedly in a short window of time (within about 100 milliseconds of each other), the synapse between them strengthens. Future signals cross that synapse more easily, making the connection more responsive. Crucially, this strengthening is specific to the synapses that were active. Other synapses on the same neuron, ones that weren’t involved, remain unchanged. This selectivity means the brain can encode precise associations: the smell of coffee linked to a particular kitchen, a melody tied to a specific memory.

This selective strengthening of connections is often described as the cellular basis of learning. Each time you practice a skill or revisit a memory, you’re reinforcing specific synaptic pathways, making them faster and more reliable.
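The rule described above, strengthen only the synapses whose inputs fired close in time to the neuron's own spike, can be sketched directly. The weights, the 0.1 increment, and the synapse names are illustrative, not measured values; only the roughly 100-millisecond coincidence window comes from the text:

```python
# Sketch of long-term potentiation as a coincidence rule: a synapse
# strengthens only if its presynaptic spike falls within ~100 ms of the
# postsynaptic spike. Other synapses on the same neuron are untouched.
def apply_ltp(weights, pre_spike_times, post_spike_time, window_ms=100, boost=0.1):
    """weights: dict mapping synapse name -> strength.
    pre_spike_times: dict mapping synapse name -> time (ms) of its last spike."""
    for synapse, t_pre in pre_spike_times.items():
        if abs(post_spike_time - t_pre) <= window_ms:
            weights[synapse] += boost  # co-active synapse strengthens
        # out-of-window synapses are left unchanged: synapse specificity
    return weights

weights = {"coffee_smell": 1.0, "doorbell": 1.0}
# Only the coffee-smell input fired within 100 ms of the postsynaptic spike.
weights = apply_ltp(weights, {"coffee_smell": 950, "doorbell": 400},
                    post_spike_time=1000)
print(weights)  # {'coffee_smell': 1.1, 'doorbell': 1.0}
```

The specificity is the point: repeated co-activation strengthens one precise association (coffee smell, a particular kitchen) without smearing the change across every connection the neuron has.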

How Fast It All Happens

The speed of information processing in the brain is difficult to overstate. Research at MIT found that people can correctly identify the content of an image shown for just 13 milliseconds, far faster than the 100 milliseconds scientists previously assumed was the minimum. For context, a blink of an eye takes roughly 100 to 150 milliseconds, nearly ten times longer.

Earlier research had suggested that visual information needs at least 50 milliseconds to travel from the retina through the full chain of visual processing areas and back again for deeper analysis. But even when image exposure times dropped to 27 or 13 milliseconds, participants still performed above chance at identifying what they saw. This suggests the brain extracts meaningful information from even the briefest sensory input, likely relying on fast feedforward processing before the slower top-down feedback loops kick in.

Support Systems Behind the Scenes

Neurons get most of the credit, but they rely heavily on supporting cells called glia. Astrocytes maintain the chemical environment around neurons, regulate the blood-brain barrier, and help control the supply of nutrients. Oligodendrocytes wrap fatty insulation called myelin around nerve fibers, which is what allows those fast 270-mile-per-hour signals in touch and movement pathways. Without myelin, signals leak and slow dramatically, which is exactly what happens in conditions like multiple sclerosis.

The brain also demands enormous resources. In adults, it uses 20 to 25 percent of the body’s glucose, despite making up only about 2 percent of body weight. In infants, whose brains are rapidly forming new connections, that figure can exceed 40 percent of the body’s total energy budget. This metabolic cost reflects the sheer scale of computation happening at every moment: billions of synapses firing, strengthening, and resetting in the service of turning raw sensation into thought, memory, and action.