Wildfires have always been part of the natural landscape, but they became a dramatically worse problem starting in the early 2000s. Between 2005 and 2018, the annual number of fires in the western United States nearly doubled compared to the period from 1984 to 1999, and fires in the Great Plains quadrupled. The story of how that happened stretches back more than a century, to a well-intentioned policy decision that quietly set the stage for the crisis we see today.
A Century of Putting Every Fire Out
After a series of catastrophic wildfires in the early 1900s, including the Great Fire of 1910 that burned roughly three million acres across the Northern Rockies, the U.S. Forest Service adopted an aggressive suppression strategy. The goal, later formalized as the "10 a.m. policy," was to bring every wildfire under control by 10 a.m. the morning after it was reported, keeping each blaze as small as possible. This approach held for decades and, on the surface, it worked. Fires stayed small and communities stayed safe.
But fire plays a critical role in forest health. It clears out fallen trees, dead branches, leaves, and pine needles that accumulate on the forest floor. Without regular fire cycling through, that debris piled up year after year, creating enormous reserves of fuel. By the time land managers began recognizing the problem in the mid-20th century, many western forests were carrying fuel loads far beyond anything that would have existed naturally.
The 1988 Yellowstone Wake-Up Call
The 1988 Yellowstone fire season forced the country to confront what decades of suppression had created. Before that year, the National Park Service had allowed naturally caused fires to burn themselves out, while suppressing only human-caused fires. But the fires that summer grew far beyond expectations, and media coverage left the public believing the entire park was being destroyed. The Department of the Interior changed its policy, requiring suppression of any naturally occurring fire that reached a certain size or threatened people and structures.
Yellowstone also marked a shift in public understanding. For the first time, mainstream audiences began hearing that fire wasn’t purely destructive, that forests actually needed it. The ecological recovery of Yellowstone in the years that followed helped build that case. But the broader lesson, that a century of fire suppression had loaded forests with dangerous amounts of fuel, took much longer to sink in.
The Sharp Escalation After 2000
The numbers tell a clear story. In the late 1980s and early 1990s, annual wildfire acreage in the U.S. hovered between one and three million acres most years. The year 2000 marked a visible turning point: 7.4 million acres burned. From that point forward, years exceeding seven million acres became routine rather than exceptional. In 2015, 2017, and 2020, more than ten million acres burned each year.
Fire frequency shifted dramatically across every region. In the West, the median number of fires jumped from about 174 per year before the shift to 270 per year afterward. The Great Plains saw an even sharper increase, going from 34 fires per year to 113. Even the eastern United States, not typically associated with wildfire risk, saw annual fire counts climb from 57 to 94. Researchers studying this period have described it as a regime shift, not a gradual trend but a fundamental change in fire behavior across the country.
Why Fires Got Worse, Not Just More Frequent
Several forces converged in the late 20th and early 21st centuries to supercharge wildfire risk. Rising temperatures increased what scientists call vapor pressure deficit: the gap between how much moisture the air can hold and how much it actually holds, which determines how aggressively the atmosphere pulls water out of vegetation. Drier plants ignite more easily and burn more intensely. Research has tied this drying trend directly to human-caused warming, and the effect has been especially pronounced in western forests during warm months.
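To make that concrete, here is a minimal sketch of how vapor pressure deficit can be estimated from temperature and relative humidity. It uses the common Tetens approximation for saturation vapor pressure; the function name and the example weather values are illustrative assumptions, not figures from this article or from any specific fire-weather model.

```python
import math

def vapor_pressure_deficit_kpa(temp_c: float, relative_humidity_pct: float) -> float:
    """Estimate vapor pressure deficit (kPa) from air temperature (deg C) and
    relative humidity (%), using the Tetens approximation for saturation
    vapor pressure. A rough illustration, not a calibrated fire-weather model."""
    # Saturation vapor pressure: the most moisture the air could hold at this temperature.
    e_sat = 0.6108 * math.exp(17.27 * temp_c / (temp_c + 237.3))
    # Actual vapor pressure: the moisture the air currently holds.
    e_act = e_sat * relative_humidity_pct / 100.0
    # The gap between the two is the atmosphere's "drying power" on vegetation.
    return e_sat - e_act

# Hypothetical comparison: the same dry afternoon, three degrees warmer.
print(round(vapor_pressure_deficit_kpa(30.0, 20.0), 2))  # ~3.39 kPa
print(round(vapor_pressure_deficit_kpa(33.0, 20.0), 2))  # ~4.02 kPa
```

The point of the comparison is that the same dry air, only a few degrees warmer, pulls noticeably harder on vegetation, which is the mechanism behind the drying trend described above.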
What’s particularly striking is where fires are now showing up. Wildfires are expanding into higher-latitude and higher-altitude regions that historically had enough moisture to resist burning. As background aridity increases, even relatively mild weather patterns can now trigger fires in places that were previously low-risk. The geography of wildfire danger is literally expanding.
Drought cycles compound the problem. The western U.S. experienced severe drought conditions in the early 2000s and again in the late 2010s and early 2020s, each time producing record-breaking fire seasons. These droughts aren’t new in isolation, but they’re landing on a landscape that’s already warmer and drier on average than it was 40 years ago.
More People Living in Fire-Prone Areas
The wildfire problem isn’t just about fire. It’s about where people have chosen to live. The wildland-urban interface, the zone where houses meet or intermingle with undeveloped wildland, has been the fastest-growing land use type in the United States. Between 1990 and 2010, the number of houses in these zones grew from 30.8 million to 43.4 million, a 41 percent increase. The land area covered by this interface expanded by a third, from 581,000 to 770,000 square kilometers. Roughly one in three houses in the country now sits in this zone.
Nearly all of that growth, 97 percent, came from new housing construction rather than expanding vegetation. People built homes in beautiful, wooded areas without fully accounting for the fire risk. Within the perimeters of wildfires that burned between 1990 and 2015, there were 286,000 houses by 2010, up from 177,000 in 1990. More homes in fire’s path means more destruction, higher suppression costs, and greater pressure on firefighting resources even when the fires themselves aren’t historically unusual.
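For readers who want to see how the percentages above fall out of the raw counts, here is a quick back-of-the-envelope check, a toy calculation using only the figures quoted in this section:

```python
# Recomputing the growth figures quoted above for the wildland-urban interface (WUI).
houses_1990, houses_2010 = 30.8e6, 43.4e6        # houses in the WUI
area_1990_km2, area_2010_km2 = 581_000, 770_000  # WUI land area

house_growth_pct = (houses_2010 - houses_1990) / houses_1990 * 100
area_growth_pct = (area_2010_km2 - area_1990_km2) / area_1990_km2 * 100

print(f"Housing growth: {house_growth_pct:.0f}%")   # ~41%
print(f"Land area growth: {area_growth_pct:.0f}%")  # ~33%, roughly a third
```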
The Cost of the New Normal
Federal wildfire suppression spending reflects the scale of the change. In the five years ending in 1989, federal agencies spent an average of $728 million per year on fire suppression (adjusted to 2020 dollars). By the five years ending in 2020, that figure had more than tripled to $2.5 billion annually. That’s just the cost of fighting fires on federal land, not the rebuilding, insurance losses, or economic disruption that follow.
Wildfire smoke has introduced another cost that’s harder to measure. Decades of progress on air quality under the Clean Air Act steadily reduced fine particulate matter across the country. In recent years, increasing wildfire activity has begun flattening or reversing those improvements in parts of the U.S. Smoke events now regularly push air quality into unhealthy ranges for millions of people, sometimes hundreds of miles from the nearest fire.
A Problem With No Single Start Date
There’s no single year when wildfires “became” a problem. The roots trace to early 1900s suppression policies that allowed fuel to accumulate for generations. The consequences started becoming visible around 2000, when acreage burned jumped sharply and never returned to previous levels. Climate change, development patterns, and the legacy of suppression have reinforced each other, creating a compounding crisis rather than a single event. What’s different now is that all three factors are intensifying simultaneously, and the fires burning today are the result of decisions and trends stretching back more than a hundred years.

