The Anthropocene is a proposed geological epoch defined by the measurable impact of human activity on Earth’s rocks, atmosphere, oceans, and living systems. The idea is straightforward: humans have changed the planet so profoundly that future geologists, millions of years from now, would be able to see a clear line in the rock record where our influence began. Although the term is widely used in science, policy, and popular culture, it has not been formally accepted as an official unit of geologic time.
Where the Term Came From
Atmospheric chemist Paul Crutzen introduced the concept in 2000 at a scientific meeting in Cuernavaca, Mexico. Crutzen, who had won a Nobel Prize for his work on ozone depletion, argued that the current geological epoch, the Holocene, no longer described the state of the planet. Human activity had become, in his framing, “a fully coupled, interacting component of the Earth System itself,” not just a species living on its surface but a force reshaping it.
The idea caught on fast. By the mid-2000s, researchers began publishing what became known as the “Great Acceleration” graphs, which plotted the explosive growth of human activity since roughly 1950: population, energy use, water consumption, fertilizer application, and dozens of other indicators, all curving sharply upward in the second half of the twentieth century. Those graphs became the visual shorthand for the Anthropocene concept.
When It Supposedly Began
One of the longest-running debates around the Anthropocene is when to draw the starting line. Four major proposals have been put forward, each with a different logic:
- The spread of agriculture (thousands of years ago): Early farming and deforestation began changing land surfaces and atmospheric gases long before industrialization.
- The Columbian Exchange (~1492): The mixing of Old World and New World species after European colonization reshaped ecosystems on every continent.
- The Industrial Revolution (~1800): Fossil fuel combustion kicked off a rapid rise in atmospheric carbon dioxide.
- The mid-twentieth century Great Acceleration (~1950): Population growth, industrialization, and nuclear weapons testing left a sudden, globally synchronous mark in sediments worldwide.
Most geologists who studied the question favored the mid-twentieth century start date, largely because it leaves the clearest physical evidence. A useful geological boundary needs to show up in rock and sediment layers all over the world at roughly the same time. The mid-1900s deliver that in several ways at once.
What the Evidence Looks Like in Rock
If you core into lake sediments, ocean floors, ice sheets, or even coral reefs, the layers deposited after about 1950 look strikingly different from everything beneath them. The signals include:
Radioactive fallout from nuclear weapons testing. The first atomic bomb detonated at Alamogordo, New Mexico, in July 1945, but it was the thermonuclear weapons tests of the 1950s and early 1960s that scattered artificial radioactive isotopes across the entire globe. This “bomb spike” of plutonium and other radionuclides peaks around 1964 and shows up in sediment layers on every continent. The Anthropocene Working Group, the scientific body that spent over a decade studying the question, chose the initial rise of plutonium in 1952 as its preferred marker for the epoch’s starting point. They identified Crawford Lake in Ontario, Canada, as the ideal reference site: its undisturbed, finely layered sediments record a sharp jump in plutonium activity between the layers deposited in 1951 and 1952, coinciding precisely with the onset of thermonuclear testing.
Fossil fuel combustion products are another unmistakable signal. Black carbon particles, inorganic ash spheres, and tiny carbonaceous particles from coal and oil burning show a near-synchronous global increase around 1950. These particles settle into sediments and ice, forming a persistent layer.
Then there are entirely new materials that never existed before humans made them. Elemental aluminum, concrete fragments, and plastics now appear in sedimentary deposits worldwide. Geologists have started calling these “technofossils,” materials that evolve rapidly in form and composition and will persist in the rock record for millions of years.
Changes to the Atmosphere
The atmospheric record is perhaps the most familiar line of evidence. Before the Industrial Revolution, carbon dioxide levels held at about 280 parts per million (ppm) or less. For the past million years, across multiple ice ages and warm periods, CO2 never exceeded 300 ppm. By 2024, the global average reached 422.7 ppm, more than 50 percent above pre-industrial levels.
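The percentage claim above follows directly from the two concentrations quoted. A quick back-of-the-envelope check:

```python
# Check of the CO2 figures quoted above: pre-industrial baseline
# ~280 ppm, 2024 global average 422.7 ppm.
PREINDUSTRIAL_PPM = 280.0
PPM_2024 = 422.7

increase_ppm = PPM_2024 - PREINDUSTRIAL_PPM
increase_pct = 100.0 * increase_ppm / PREINDUSTRIAL_PPM

print(f"absolute rise: {increase_ppm:.1f} ppm")
print(f"relative rise: {increase_pct:.0f}% above pre-industrial")
# ~51%, consistent with "more than 50 percent" in the text
```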
Methane concentrations tell a similar story, departing from patterns that held steady through the entire Holocene. Both gases began rising around 1850 with industrialization, but the steepest increases came after 1950. Tree rings and fossil shells capture the chemical fingerprint of this shift: a drop in the ratio of carbon-13 to carbon-12, which is characteristic of carbon released from burning ancient organic material like coal and oil rather than from natural sources.
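The carbon-isotope fingerprint described above is conventionally reported in "delta" notation: a sample's carbon-13 to carbon-12 ratio relative to an international reference standard (VPDB), expressed in parts per thousand. Because fossil carbon is depleted in carbon-13, burning it pushes the atmospheric value downward. The sketch below uses the standard delta formula; the sample ratios are chosen to land near commonly cited atmospheric values and are illustrative, not measurements.

```python
# Delta-13C in per mil (parts per thousand) relative to the VPDB standard.
# Fossil-fuel carbon is 13C-depleted, so its release drives the
# atmospheric delta value more negative over time (the Suess effect).
VPDB_RATIO = 0.0111802  # 13C/12C of the VPDB reference standard

def delta13c(sample_ratio: float) -> float:
    """Delta-13C in per mil relative to VPDB."""
    return (sample_ratio / VPDB_RATIO - 1.0) * 1000.0

# Illustrative ratios, chosen to approximate commonly cited atmospheric
# values (~-6.5 per mil pre-industrial, ~-8.5 per mil today).
preindustrial = delta13c(0.0111075)
modern = delta13c(0.0110852)
print(f"pre-industrial: {preindustrial:+.1f} per mil")
print(f"modern:         {modern:+.1f} per mil")
# The modern value is more negative: the fingerprint of fossil carbon.
```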
Reshaping the Planet’s Chemistry
The Anthropocene isn’t only about carbon. Humans have fundamentally altered the cycling of other essential elements, particularly nitrogen and phosphorus. Mining phosphorus for fertilizer and animal feed, then spreading it across agricultural land, has increased the amount of phosphorus accumulating in soils and freshwater ecosystems by at least 75 percent compared to pre-industrial levels. Before humans intervened, roughly 1 to 6 teragrams of phosphorus accumulated in soils each year from natural weathering. Today that figure is 10.5 to 15.5 teragrams per year, with the surplus washing into rivers and lakes, fueling algal blooms and oxygen-depleted dead zones.
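The figures in the paragraph above can be checked against each other. A small sketch, using only the ranges quoted in the text (the midpoint comparison is an assumption for illustration):

```python
# Phosphorus accumulating in soils and freshwater, in teragrams per year,
# using the ranges quoted above.
NATURAL_TG = (1.0, 6.0)     # pre-industrial, from natural weathering
CURRENT_TG = (10.5, 15.5)   # today

# Midpoints as a crude central estimate (an illustrative assumption).
natural_mid = sum(NATURAL_TG) / 2   # 3.5 Tg/yr
current_mid = sum(CURRENT_TG) / 2   # 13.0 Tg/yr
surplus = current_mid - natural_mid
print(f"human-era surplus: ~{surplus:.1f} Tg P/yr at the midpoints")

# The conservative "at least 75 percent" claim holds even when the
# lowest current bound is compared against the highest natural bound:
floor_increase = 100.0 * (CURRENT_TG[0] / NATURAL_TG[1] - 1.0)
print(f"worst-case comparison: +{floor_increase:.0f}%")
```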
Nitrogen has been similarly transformed. Industrial processes now fix more nitrogen from the atmosphere than all natural processes on land combined, flooding ecosystems with a nutrient they evolved to treat as scarce. The chemical residues of pesticides, industrial solvents, and leaded gasoline also appear as distinct geochemical signatures in sediment layers deposited after the mid-twentieth century.
The Extinction Signal
A geological epoch defined by human impact would be incomplete without a biological record, and the current biological disruption is severe. Since 1500, 17 out of 1,258 known mammal genera have gone extinct, a rate of about 27 extinctions per million species-years. That may sound abstract, but the background rate, the normal pace of extinction derived from hundreds of millions of years of fossil data, is roughly 0.1 extinctions per million species-years. Current rates are hundreds of times higher.
If all mammal species currently classified as threatened were to disappear, the rate would climb to approximately 500 extinctions per million species-years, approaching the scale of the five great mass extinctions in Earth’s history. Some researchers caution that comparing modern extinctions directly to fossil-record extinctions is tricky because the timescales of measurement are so different. A crisis unfolding over centuries looks enormously fast when measured in human time but gets smeared out when viewed through the million-year lens of geology. Even accounting for that, the biological disruption is exceptional.
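The rate arithmetic above is simple enough to reproduce. A sketch, assuming a round 500-year span since 1500:

```python
# Extinction-rate arithmetic from the passage above, expressed per
# million genus-years (the unit the text calls "species-years").
EXTINCT = 17      # mammal genera lost since 1500
GENERA = 1258     # known mammal genera
YEARS = 500       # approximate span since 1500 (an assumption)

genus_years = GENERA * YEARS                 # ~629,000 genus-years observed
rate = EXTINCT / genus_years * 1_000_000     # per million genus-years
BACKGROUND = 0.1                             # fossil-record baseline

print(f"modern rate: ~{rate:.0f} per million genus-years")
print(f"vs background: ~{rate / BACKGROUND:.0f}x higher")
# ~27 per million genus-years, hundreds of times the background rate,
# matching the figures in the text.
```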
Why It Was Voted Down
Despite all this evidence, the Anthropocene is not an official geological epoch. In 2024, the responsible subcommission of the International Commission on Stratigraphy voted overwhelmingly to reject the proposal to formalize it, a decision subsequently upheld by the International Union of Geological Sciences. The outcome surprised even some scientists within the voting group.
The rejection wasn’t a denial that humans have transformed the planet. Virtually no geologist disputes that. The objections were procedural and conceptual. Some felt that compressing the Anthropocene into an epoch starting in the 1950s was too narrow, ignoring thousands of years of earlier human impact. Others argued that formalizing a geological unit for events still actively unfolding set a problematic precedent. Still others believed the evidence, while dramatic, didn’t fit neatly into the conventions of stratigraphy, the branch of geology that classifies rock layers and time periods.
The practical result is that, officially, we still live in the Holocene epoch, which began about 11,700 years ago as the last ice age ended. But the term “Anthropocene” continues to be used widely across climate science, ecology, policy discussions, and the humanities. Its power was never purely geological. It names something most people can sense intuitively: that the world humans inherited is not the world we now inhabit, and that the difference is written into the planet itself.