An epoch is a distinct unit of time, but its exact meaning depends on the field. In geology, it’s a formal division of Earth’s history lasting millions of years. In machine learning, it’s one complete pass through a training dataset. In astronomy, it’s a fixed reference point for measuring where objects are in the sky. The word comes from the Greek “epokhē,” meaning a fixed point in time, and that core idea connects all its modern uses.
Epochs in Geological Time
Geologists divide Earth’s 4.5-billion-year history into a strict hierarchy of time units. From largest to smallest: eons, eras, periods, epochs, and ages. An epoch is a subdivision of a period, typically spanning several million to tens of millions of years. Each epoch boundary marks a significant shift in climate, life forms, or Earth’s geology that left a detectable record in rock layers.
The most familiar epochs belong to the Cenozoic era, the stretch of time from the extinction of the dinosaurs to today. The Cenozoic contains seven epochs:
- Paleocene (66.0–56.0 million years ago)
- Eocene (56.0–33.9 million years ago)
- Oligocene (33.9–23.0 million years ago)
- Miocene (23.0–5.3 million years ago)
- Pliocene (5.3–2.6 million years ago)
- Pleistocene (2.6 million–roughly 11,700 years ago)
- Holocene (11,700 years ago to present)
The Holocene is the epoch we currently live in. It began at the end of the last ice age, when glaciers retreated and human civilizations started developing agriculture. Some scientists have pushed to declare a new epoch, the “Anthropocene,” to mark the period when human activity began reshaping the planet’s geology and climate. However, in March 2024, the International Union of Geological Sciences voted to reject that proposal. The Holocene remains the official current epoch.
Epochs in Machine Learning
If you’re learning about AI or neural networks, you’ll encounter “epoch” constantly. In this context, one epoch means the model has seen every single example in its training dataset exactly once. It’s a measure of how many times the algorithm has worked through the full set of data it’s learning from.
Training a neural network involves three related terms that often get confused:
- Batch size: the number of training examples the model processes at one time before updating itself
- Iteration: one update of the model’s internal settings, based on a single batch
- Epoch: a full cycle through the entire dataset, made up of many iterations
The math is straightforward. If you have 10,000 training examples and a batch size of 100, the model needs 100 iterations to complete one epoch. In practice, training a model requires many epochs, sometimes dozens or hundreds, because a single pass through the data isn’t enough for the model to learn the patterns reliably. Think of it like studying a textbook: reading it once cover to cover is one epoch, but you’ll probably need to read it several times before the material sticks. The number of epochs is one of the key settings engineers tune to balance learning quality against training time.
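The arithmetic above can be sketched in a few lines of plain Python. This is a minimal illustration of the epoch/batch/iteration relationship, not a real training loop; the numbers are the ones from the text, and the inner loop is a stand-in for wherever a framework would actually update the model.

```python
# Illustrating epochs, batches, and iterations with the numbers from the text.
# No ML framework is used; the "update" step is just a counter.

dataset_size = 10_000   # total training examples
batch_size = 100        # examples processed per model update
num_epochs = 3          # full passes through the dataset

# One epoch = enough iterations to cover the whole dataset
iterations_per_epoch = dataset_size // batch_size  # 10,000 / 100 = 100

total_iterations = 0
for epoch in range(num_epochs):
    for start in range(0, dataset_size, batch_size):
        batch = range(start, start + batch_size)  # stand-in for real data
        # a real framework would run: loss = model(batch); model.update()
        total_iterations += 1

print(iterations_per_epoch)  # 100 iterations per epoch
print(total_iterations)      # 300 total (100 iterations x 3 epochs)
```

Running this confirms the example in the text: 100 iterations make up one epoch, and training for 3 epochs means 300 model updates in total.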
Epochs in Astronomy
Stars and planets are constantly moving, and Earth itself wobbles on its axis over long timescales. That means the coordinates used to locate objects in the sky slowly drift. To solve this, astronomers pick a single moment in time as a fixed reference point and call it an epoch.
The current standard is J2000.0, which corresponds to noon (Terrestrial Time) on January 1, 2000, essentially midday at Greenwich. When a star catalog lists the position of a star, those coordinates are based on where Earth’s axis and equator were oriented at that specific moment. Without an agreed-upon epoch, two astronomers could give different coordinates for the same star simply because they measured at different times. Before J2000.0, the standard was B1950.0. As Earth’s wobble accumulates over decades, the astronomical community periodically adopts a new reference epoch.
Epochs in Everyday and Historical Use
Outside of technical fields, “epoch” simply means a period of time marked by distinctive events or characteristics. You might hear someone describe the invention of the printing press as the start of a new epoch in communication, or the industrial revolution as an epoch-defining shift. The related adjective “epoch-making” describes an event significant enough to define such a period. In this casual sense, epoch and era are nearly interchangeable, though epoch tends to imply a more sharply defined beginning, a specific moment that changed everything.
Epochs in Sleep and Brain Research
In neuroscience and sleep medicine, researchers break continuous brain-wave recordings into short, uniform chunks called epochs. The standard length is 30 seconds. When a sleep specialist scores your overnight sleep study, they’re classifying each 30-second epoch as a specific sleep stage: light sleep, deep sleep, REM, or wakefulness. This standardized duration, established by the American Academy of Sleep Medicine, lets clinicians and researchers compare results consistently across patients and studies.
The same principle applies in broader brain-wave research. Scientists segment long recordings into fixed-length epochs so they can average repeated signals together, filter out noise, and compare brain responses across trials. The epoch length varies by experiment, but the concept is the same: take a continuous stream of data and divide it into manageable, standardized windows.
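The windowing idea described above can be shown with a short sketch. This assumes an invented sample rate and a fake flat signal purely for illustration; real EEG pipelines use dedicated tools, but the segmentation logic is the same.

```python
# Hedged sketch: dividing a continuous recording into fixed-length epochs.
# The sample rate and signal here are invented for illustration.

sample_rate = 256        # samples per second (assumed value)
epoch_seconds = 30       # the standard sleep-scoring epoch length
samples_per_epoch = sample_rate * epoch_seconds  # samples in one epoch

# A fake 2-minute recording: 120 s * 256 samples/s = 30,720 samples
signal = [0.0] * (120 * sample_rate)

# Drop any trailing partial epoch, then slice into uniform windows
n_epochs = len(signal) // samples_per_epoch
epochs = [
    signal[i * samples_per_epoch : (i + 1) * samples_per_epoch]
    for i in range(n_epochs)
]

print(n_epochs)          # 4 epochs in a 2-minute recording
print(len(epochs[0]))    # 7680 samples in each 30-second epoch
```

Once the recording is cut into uniform windows like this, each epoch can be scored, averaged with others, or compared across trials, which is exactly why the fixed length matters.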

