A memory bank is a section of computer memory that operates independently from other sections, allowing the system to read or write data faster by accessing multiple banks at the same time. The term also gets borrowed as a metaphor for how the human brain stores memories. In computing, memory banks are a fundamental building block of how your computer’s RAM is physically organized.
Memory Banks in Computer Hardware
Inside every RAM chip, memory is divided into smaller units called banks. Each bank can handle a data request on its own, which means the processor doesn’t have to wait for one read or write operation to finish before starting another in a different bank. This parallel access is one of the main reasons modern computers can move data so quickly.
The number of banks has grown with each generation of memory technology. DDR4 RAM chips contain 16 banks, organized into four groups of four. DDR5 doubles that to 32 banks arranged in eight groups. More banks mean more pages of data can be open simultaneously, which improves efficiency and reduces the time spent pulling information out of memory.
When two requests hit the same bank at the same time, the second one has to wait. This is called a memory bank conflict, and it introduces a small delay. Chip designers minimize these conflicts by spreading data across as many banks as possible, so requests are less likely to collide.
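A toy sketch makes the conflict idea concrete. The bank count and the low-order-bit interleaving scheme below are invented for illustration; real DRAM controllers use more elaborate address mappings:

```python
# Hypothetical sketch: interleave addresses across banks so that
# consecutive accesses land in different banks. The 16-bank layout
# is illustrative, not tied to any real DRAM part.

NUM_BANKS = 16

def bank_of(address: int) -> int:
    """Map an address to a bank using its low-order bits (interleaving)."""
    return address % NUM_BANKS

def count_conflicts(addresses: list[int]) -> int:
    """Count back-to-back requests that hit the same bank."""
    conflicts = 0
    for prev, curr in zip(addresses, addresses[1:]):
        if bank_of(prev) == bank_of(curr):
            conflicts += 1
    return conflicts

# Sequential addresses interleave cleanly: no conflicts.
sequential = list(range(32))
# A stride equal to the bank count hits the same bank every time.
strided = list(range(0, 32 * NUM_BANKS, NUM_BANKS))

print(count_conflicts(sequential))  # 0
print(count_conflicts(strided))     # 31
```

The strided pattern shows why designers spread data across banks: an access pattern that happens to align with the bank count serializes every request.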
Banks, Ranks, and Channels
Memory banks exist inside individual memory chips. But your computer’s memory system has layers of organization above that. A memory rank is a set of chips on a memory module (the physical stick you plug into your motherboard) that work together to provide a 64-bit-wide data path. A single-rank module has one such set of chips; a dual-rank module has two, effectively doubling its capacity. Only one rank can be active at a time, selected by a chip-select signal on the module.
A memory channel is the communication path between the processor and the memory module. Modern server processors support eight or more channels, and each channel can have one or more modules plugged in. So the full hierarchy runs: channels connect to modules, modules contain ranks, ranks are made of chips, and each chip is divided into banks. Every level of this structure exists to keep data flowing to the processor with as little waiting as possible.
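As a rough illustration of that hierarchy, here is a hypothetical address decoder that peels channel, rank, and bank fields off a physical address. The bit widths are invented for this sketch; real memory controllers use mappings tuned per platform:

```python
# Illustrative sketch: decompose a physical address into the
# channel / rank / bank hierarchy described above. Field widths
# are assumptions made for this example only.

CHANNEL_BITS = 2   # 4 channels
RANK_BITS    = 1   # 2 ranks per channel
BANK_BITS    = 4   # 16 banks per rank

def decode(address: int) -> dict:
    """Split an address into channel, rank, bank, and remaining row bits."""
    channel = address & ((1 << CHANNEL_BITS) - 1)
    address >>= CHANNEL_BITS
    rank = address & ((1 << RANK_BITS) - 1)
    address >>= RANK_BITS
    bank = address & ((1 << BANK_BITS) - 1)
    address >>= BANK_BITS
    return {"channel": channel, "rank": rank, "bank": bank, "row": address}

print(decode(0x2C5))  # {'channel': 1, 'rank': 1, 'bank': 8, 'row': 5}
```

Putting the channel bits lowest means consecutive addresses fan out across channels first, which matches the goal of keeping every level of the hierarchy busy.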
Bank Switching in Older Systems
In earlier computers and many embedded systems today, processors have a limited address bus that restricts how much memory they can “see” at once. A processor with a 16-bit address bus, for example, can only directly access 65,536 memory locations. Bank switching is a technique that works around this limitation by using an external latch or register to swap between different sets of memory. The processor writes a new value to that latch, and a completely different block of memory becomes visible in the same address range.
Unlike virtual memory in modern operating systems, bank switching isn’t automatic. The running program has to know which bank holds the data it needs and explicitly request the swap. Each additional bank-select bit doubles the addressable memory. This approach was common in 8-bit and early 16-bit systems and still appears in microcontrollers with tight address constraints.
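A toy model of this scheme might look like the following. The bank size and names are invented for illustration; the key point is that the program itself sets the latch before every access:

```python
# Toy model of bank switching: a latch register selects which bank
# is mapped into a fixed visible address window. The 16 KB bank
# size is an assumption for this sketch.

BANK_SIZE = 16 * 1024

class BankedMemory:
    def __init__(self, num_banks: int):
        self.banks = [bytearray(BANK_SIZE) for _ in range(num_banks)]
        self.latch = 0  # currently selected bank

    def select_bank(self, n: int) -> None:
        """The explicit swap: the program writes the latch itself."""
        self.latch = n

    def read(self, offset: int) -> int:
        """Reads always go through the same visible address window."""
        return self.banks[self.latch][offset]

    def write(self, offset: int, value: int) -> None:
        self.banks[self.latch][offset] = value

mem = BankedMemory(num_banks=4)
mem.select_bank(0)
mem.write(0x100, 0xAA)
mem.select_bank(3)
mem.write(0x100, 0xBB)  # same offset, different physical bank
mem.select_bank(0)
print(hex(mem.read(0x100)))  # 0xaa
```

Note that the same offset, 0x100, holds two different values depending on the latch: that is exactly the “same address range, different memory” behavior described above.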
The Brain as a Memory Bank
People often call the brain a “memory bank,” and while the metaphor is loose, the underlying biology is real. Memories are physically stored as patterns of strengthened connections between neurons. These patterns are called engrams, a concept first proposed over a century ago by the German zoologist Richard Semon. Modern neuroscience has confirmed that learning activates small groups of brain cells, triggering persistent physical and chemical changes in those cells and their connections.
The basic unit of long-term memory storage appears to be the dendritic spine, a tiny protrusion on a neuron where it receives signals from other neurons. During learning, new spines form and existing ones are modified. These changes can persist for a lifetime. A single memory isn’t stored in one location. Instead, an engram consists of connected cell groups distributed across multiple brain regions, with the specific pattern of connectivity holding the information.
The hippocampus, a small curved structure deep in the brain, plays a central role in forming new memories. Rather than acting as a passive filing cabinet, it actively guides how you take in information. It provides short-term memory signals that direct your attention toward what you still need to learn, helping you build coherent memories over time. Once memories are consolidated, they rely more on connections across the outer layers of the brain.
How Much Can the Brain Store?
Research from the Salk Institute estimated the brain’s memory capacity at roughly one petabyte, about a million gigabytes. That’s in the same range as the entire World Wide Web. The estimate came from measuring the precision of synaptic connections. Scientists found that synapses come in about 26 distinguishable sizes, which corresponds to around 4.7 bits of information per synapse. Previous estimates had assumed only one to two bits per synapse, so the revised figure represents a tenfold increase over earlier calculations.
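The bits-per-synapse figure falls directly out of the size count: a storage element with n distinguishable states carries log2(n) bits of information.

```python
import math

# 26 distinguishable synapse sizes correspond to log2(26) bits
# of information per synapse, as described above.
bits_per_synapse = math.log2(26)
print(round(bits_per_synapse, 1))  # 4.7
```

By the same formula, the earlier one-to-two-bit assumption corresponds to only two to four distinguishable synapse sizes.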
For comparison, a high-end consumer hard drive today holds about 20 terabytes. You would need roughly 50 of those drives (1,000 TB ÷ 20 TB) to match the brain’s estimated storage capacity.
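The drive count is simple unit arithmetic:

```python
# One petabyte is 1,000 terabytes; divide by the capacity of a
# 20 TB consumer hard drive to get the number of drives needed.
PETABYTE_IN_TB = 1000
DRIVE_TB = 20
drives_needed = PETABYTE_IN_TB // DRIVE_TB
print(drives_needed)  # 50
```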
DNA as a Future Memory Bank
Researchers are exploring DNA as a physical medium for long-term data storage. Because each unit of information is encoded using just a handful of atoms, DNA can pack hundreds of terabytes into a volume smaller than a grain of sand. The Georgia Tech Research Institute is working with biotech companies to build commercially viable DNA storage systems that could eventually scale into the exabyte range (one exabyte equals 1,000 petabytes).
The tradeoff is speed. Writing data into DNA and reading it back are slow processes compared to electronic storage. But DNA is extraordinarily durable. Kept at low temperatures, it can preserve data for thousands of years with essentially no maintenance cost. That makes it a strong candidate for archival storage, where you write data once and rarely need to access it, rather than the fast, constantly cycling memory banks inside your computer.

