Air pollution started long before factories or cars existed. The earliest evidence dates back roughly 49,000 years, when Neanderthals inhaled smoke from campfires inside caves. But the story of air pollution as a widespread, deadly problem unfolds across thousands of years, from prehistoric cooking fires to Roman lead smelting to the coal-choked cities of industrial England.
Campfire Smoke 49,000 Years Ago
Chemical analysis of Neanderthal teeth from the El Sidrón cave site in Spain revealed signals of smoke inhalation from campfires dating to around 49,000 years ago. That makes it the oldest direct evidence of any species breathing polluted air. These weren’t isolated exposures. Once early humans began living in enclosed shelters and burning fuel for warmth and cooking, poor air quality became a constant feature of daily life.
At Çatalhöyük, a 9,000-year-old settlement in modern Turkey, researchers reconstructed a typical dwelling and burned the same fuels its inhabitants would have used. Dried animal dung, a common fuel, produced fine particulate concentrations above 150,000 micrograms per cubic meter inside the structure. For context, the World Health Organization recommends that annual average outdoor exposure to fine particulates (PM2.5) stay below 5 micrograms per cubic meter, a threshold those indoor readings exceeded by a factor of more than 30,000. Skeletal remains from the site tell a matching story: black carbon deposits were found on the interior rib surfaces of at least three individuals, a sign of lung disease caused by chronically inhaling soot.
Roman Smelting Left a Global Footprint
Pollution became continental in scale with the rise of ancient Rome. Arctic ice cores preserve a detailed record of atmospheric lead stretching back more than 2,500 years. Lead deposits in those cores began climbing during the Iron Age, peaking in the late second century BCE at the height of the Roman Republic. The source was massive lead and silver mining and smelting operations across Europe.
When the Roman Republic collapsed into civil war during the first century BCE, lead levels in the ice dropped nearly to natural background. They surged again around 15 BCE as the Roman Empire stabilized and mining resumed. The high pollution of the Pax Romana persisted until the Antonine Plague swept through the empire in the 160s and 180s CE, devastating populations and disrupting industry. Lead levels that high would not be seen again until the High Middle Ages, around the early 1100s. Even so, Arctic lead pollution during the industrial peak of the early 1970s was roughly 40 times higher than anything the Romans produced.
Medieval London and the First Pollution Laws
The first known attempt to regulate air pollution came in 1272, when King Edward I banned the burning of sea coal in London. The penalty was severe: anyone caught burning or selling it could be tortured or executed. The ban came at the urging of noblemen and clergy who had grown tired of the city’s smoky air.
The underlying problem was straightforward. Until the 12th century, Londoners burned wood. As the city grew and nearby forests shrank, wood became expensive. Cheap, soft bituminous coal from deposits along the northeast coast, shipped south by sea, filled the gap. But sea coal burned inefficiently, converting much of its energy into thick smoke rather than heat. That smoke, pouring from thousands of chimneys, mixed with London’s natural fog to create something new and choking. The writer John Evelyn described it in the 1600s as a “Hellish and dismall Cloud of SEACOALE” mixed with “otherwise wholesome and excellent Aer,” leaving residents breathing nothing but “an impure and thick Mist accompanied with a fuliginous and filthy vapour.”
The Industrial Revolution Made It Deadly
Coal burning scaled up dramatically in the 18th and 19th centuries. Before the Industrial Revolution began in the mid-1700s, atmospheric carbon dioxide sat at around 280 parts per million; sustained coal burning would push concentrations above that baseline for the first time. Within 150 years, industrial cities like Manchester and London had become dangerously polluted on a scale no previous era had experienced.
A correspondent for the Manchester Guardian wrote in 1888 that the city’s atmosphere was “poisonous, mortality at its highest rate,” producing “the most depressing effect upon the spirits of those unfortunate inhabitants who are compelled to live within its boundaries.” The data backs up the rhetoric. Researchers studying registration districts from 1851 to 1860 found that coal use had a strong positive correlation with death rates, particularly from respiratory diseases. Pollution accounted for roughly one third of the gap in death rates between cities and the countryside, with children under five hit hardest. In the decade from 1891 to 1900, respiratory diseases, mainly bronchitis and pneumonia, caused 21 percent of all deaths of young children in England and Wales.
One striking finding: the mortality gap between heavily polluted and cleaner areas in late 19th-century Britain was steeper than the equivalent gap in modern China, despite China’s well-documented air quality problems.
Two Kinds of Smog
For centuries, air pollution meant one thing: sulfur-laden smoke from burning coal and heavy oil. London’s recurring “pea-souper” fogs were the defining example, culminating in the Great Smog of December 1952. A cold snap combined with windless conditions trapped coal smoke over the city for days. Initial government estimates placed the death toll at around 4,000, but later reassessment put the figure at approximately 12,000 excess deaths between December 1952 and February 1953 from both immediate and lingering effects.
Meanwhile, a different kind of pollution was emerging in Los Angeles. Starting in the 1940s, the city experienced stinging, hazy smog that looked and behaved nothing like London’s coal smoke. Chemist Arie Haagen-Smit identified the culprit: sunlight was driving photochemical reactions between hydrocarbons (from oil refineries and partially unburned automobile exhaust) and nitrogen oxides, a combustion byproduct, to form ground-level ozone. This was an entirely new pollution chemistry, born from the automobile age rather than the coal age, and it required entirely new regulations to address.
Nature Pollutes Too
Not all air pollution comes from human activity. Dust storms, sea salt, volcanic eruptions, wildfire smoke, and organic compounds released by vegetation all contribute particulate matter to the atmosphere. MIT researchers ran a simulation that removed every human emission source from the air and found that over 50 percent of the world’s population would still be exposed to fine particulate levels exceeding current WHO guidelines, purely from natural sources. In the Amazon, for instance, elevated particulate concentrations come largely from carbon-containing aerosols produced by fires.
This natural baseline matters for understanding the full timeline. Air pollution did not begin with humanity. Volcanic eruptions have been injecting sulfur dioxide and ash into the atmosphere for billions of years. What humans added, starting with those Neanderthal campfires, was sustained, concentrated exposure in enclosed spaces, and eventually emissions at a scale that altered atmospheric chemistry worldwide.
A Timeline of Escalation
- ~49,000 years ago: Earliest evidence of smoke inhalation, from Neanderthal teeth in Spain
- ~7000 BCE: Indoor burning at Çatalhöyük produces extreme particulate levels and signs of lung disease
- ~500 BCE to 200 CE: Roman lead smelting creates the first continent-scale atmospheric pollution, detectable in Arctic ice
- 1272: King Edward I bans sea coal burning in London, the first known air pollution regulation
- Mid-1700s onward: Industrial coal burning drives atmospheric CO₂ above its pre-industrial baseline of 280 ppm
- 1850s–1900s: Coal pollution accounts for roughly a third of excess urban deaths in Britain
- 1943: Los Angeles suffers its first severe smog episode, the debut of photochemical pollution driven by cars and sunlight; its chemistry is identified in the years that follow
- 1952: London’s Great Smog kills an estimated 12,000 people, spurring modern clean air legislation
- 2021: The WHO tightens its recommended annual PM2.5 limit to 5 micrograms per cubic meter