When Did Climate Change Start? The Scientific Timeline

Climate change didn’t start at a single moment. The answer depends on whether you mean when humans first began altering the atmosphere, when greenhouse gas levels started climbing measurably, or when the planet’s temperature clearly shifted. Each of those has a different date, and together they tell a story that stretches from ancient farming to the coal-powered factories of the 1700s to the explosive growth of the mid-20th century.

The Earliest Human Fingerprint: 8,000 Years Ago

Long before smokestacks or engines, humans were nudging the climate. Paleoclimate scientist William Ruddiman proposed that carbon dioxide levels began an unusual rise around 8,000 years ago, right when early farmers in Europe started clearing forests at scale. Methane, another heat-trapping gas, followed a similar anomalous uptick about 5,000 years ago, coinciding with the spread of rice irrigation in Asia. Ice core records covering the previous 350,000 years suggest that both gases should have been declining during this period, following natural orbital cycles. Instead, they rose.

This “Early Anthropocene” hypothesis remains debated, but the archaeological evidence lines up well. Farming communities advanced across the Hungarian Plain and into the forests of south-central Europe between 8,000 and 7,000 years ago, matching the initial CO2 upturn almost exactly. Rice cultivation began around 7,500 years ago and shifted to irrigated paddies (which produce methane) roughly 5,000 years ago. These were small nudges, not the dramatic spike we see today, but they suggest humans have been influencing atmospheric chemistry for millennia.

The Industrial Revolution: 1750 Onward

The clearest starting line for modern climate change is the mid-1700s. Before the Industrial Revolution, atmospheric CO2 held steady at about 280 parts per million (ppm) or less. Once coal-burning factories spread through Britain and then the world, that number began a sustained climb that has never reversed. Scientists and international bodies like the IPCC use 1850 to 1900 as the official “pre-industrial baseline” for measuring how much the planet has warmed.

That baseline isn’t arbitrary. It marks the period just before fossil fuel emissions became large enough to register as a clear warming signal. Recent research published in the Proceedings of the National Academy of Sciences found that a detectable human fingerprint on atmospheric temperature could have been identified as early as 1885, if monitoring systems had existed at the time. The signal showed up first as cooling in the upper atmosphere, a direct consequence of rising CO2, and it emerged around the same time the very first gasoline-powered cars were being built.

The Great Acceleration After 1950

If the Industrial Revolution lit the fuse, the mid-20th century was the explosion. Researchers at the Stockholm Resilience Centre have documented what they call the “Great Acceleration,” a sharp post-1950 surge in nearly every indicator of human impact on the Earth system: energy use, population, transportation, fertilizer consumption, and greenhouse gas emissions all rocketed upward simultaneously.

This wasn’t just more of the same. The post-1950 period produced changes in the planet’s functioning that fall outside the range of natural variability seen over the entire 11,000-year Holocene epoch. From an Earth system science perspective, the Great Acceleration is the most convincing candidate for when human activity fundamentally shifted how the planet operates. The warming that followed has pushed global temperatures above the average of any century in the past 11,000 years.

Where CO2 Stands Today

Atmospheric CO2 has risen from that pre-industrial 280 ppm to roughly 427 ppm as of early 2025, an increase of more than 50%. That jump has driven about 1.1°C (2°F) of global surface warming above the 1850 to 1900 baseline. The likely range of human-caused warming through the 2010s sits between 0.8°C and 1.3°C, with a best estimate of 1.07°C.
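As a quick check on those figures: (427 − 280) ÷ 280 ≈ 0.53, a rise of roughly 52 percent, which is why the increase is described as more than 50%. The Fahrenheit figure follows the same way: 1.1°C × 1.8 ≈ 2°F.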

That warming isn’t distributed evenly. The Arctic warmed two to three times faster than the global average through the late 20th century, and during the early 2000s that ratio jumped to four or five times faster. Winter Arctic warming has increased almost continuously since 1990, reshaping ice coverage and ecosystems at a pace that climate models have struggled to replicate.

When Scientists First Understood the Risk

The science predates the worst of the problem. In 1856, American scientist Eunice Foote ran a simple experiment: she filled glass cylinders with different gases, placed thermometers inside, and set them in sunlight. She found that carbon dioxide and water vapor absorbed more heat than other gases, and she directly connected this finding to the possibility that changes in atmospheric CO2 could alter the climate. Her work was presented at the American Association for the Advancement of Science that same year.

Three years later, in 1859, John Tyndall conducted more detailed laboratory measurements of heat absorption by various gases, work that has traditionally received more credit. But Foote’s earlier experiments established the core insight: the gases already in our atmosphere act as a blanket, and adding more of them changes how much heat the planet retains. Scientists have understood the basic mechanism behind climate change for nearly 170 years.

Putting the Timeline Together

The short answer is that human-driven climate change began gradually with early agriculture thousands of years ago, accelerated meaningfully with the Industrial Revolution around 1750, and intensified dramatically after 1950. The pre-industrial CO2 level of 280 ppm served as a rough ceiling for hundreds of thousands of years. Today’s concentration of 427 ppm has no precedent in human history, and the warming it produces has already exceeded anything the planet experienced during the Holocene.