Why Allergies Are Getting Worse, From Pollen to Pollution

Allergies are genuinely getting worse, not just in perception but in measurable ways. Pollen seasons are now 20 days longer than they were in 1990, with 21 percent more pollen in the air. Roughly one in four American adults reports having seasonal allergies, and the forces driving that number upward are accelerating.

The reasons aren’t simple. Climate change, air pollution, shifts in childhood microbial exposure, and the way modern cities trap heat are all compounding to make allergic disease more common and more severe than it was a generation ago.

Climate Change Is Supercharging Pollen

Rising temperatures and higher carbon dioxide levels are, together, the biggest driver of worsening seasonal allergies. Warmer springs mean plants start producing pollen earlier, and warmer autumns mean they keep going longer. That extra 20 days of pollen season isn’t spread evenly across the calendar. In many regions, the season starts a week or more before it used to and lingers well into what was previously a low-pollen window in fall.

But it’s not just timing. Higher CO2 in the atmosphere acts like fertilizer for pollen-producing plants. In controlled experiments, doubling atmospheric CO2 levels caused ragweed to produce 61 percent more pollen. Ragweed is already the leading trigger of fall allergies across North America, and each plant is now releasing significantly more allergenic material than it did decades ago. The pollen grains themselves may also carry more of the proteins that trigger immune reactions, though that effect is harder to quantify.

This means allergy sufferers face a double hit: more days of exposure and a heavier pollen load on each of those days.
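As a back-of-the-envelope illustration of that double hit, the two figures cited above can be multiplied together. The 180-day baseline season length below is a hypothetical value chosen purely for illustration; only the 20 extra days and the 21 percent load increase come from the measurements mentioned earlier.

```python
# Rough estimate of the combined change in cumulative pollen exposure.
# From the text: seasons are ~20 days longer, with ~21% more pollen in
# the air. The 180-day baseline is an assumed value for illustration.

baseline_days = 180          # hypothetical pre-1990 season length (assumption)
extra_days = 20              # longer season (from the text)
load_increase = 0.21         # heavier daily pollen load (from the text)

season_factor = (baseline_days + extra_days) / baseline_days
combined = season_factor * (1 + load_increase)

print(f"Season length factor: {season_factor:.3f}")
print(f"Combined exposure factor: {combined:.3f}")  # ≈ 1.34, i.e. ~34% more
```

Under that assumed baseline, the longer season and the heavier daily load multiply to roughly a third more cumulative exposure, noticeably larger than either factor alone.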

Thunderstorms Can Make It Suddenly Dangerous

Extreme weather events add another layer. During thunderstorms, pollen grains get swept into the upper atmosphere, where moisture causes them to swell and rupture into tiny fragments. These fragments are small enough (under 3 micrometers) to penetrate deep into the lungs, far deeper than intact pollen grains normally reach. Cold downdrafts then carry these micro-particles back to ground level in concentrated bursts.

The result can be dramatic. During a thunderstorm event in England in June 2021, emergency department visits for asthma spiked 560 percent compared to the previous four weeks. A study of over 63,000 asthma-related ER visits in Louisiana found that on thunderstorm days, each unit increase in precipitation rate raised the risk of an ER visit by 14.5 percent. As severe storms become more frequent with climate change, these mass exposure events are expected to become more common.
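To see how the Louisiana study’s 14.5 percent figure scales with heavier rainfall, here is a quick sketch assuming the effect compounds multiplicatively per unit of precipitation rate (roughly what a log-linear risk model implies; the study’s exact model specification is not given here).

```python
# Sketch of how a per-unit risk increase compounds across units,
# assuming multiplicative (log-linear) scaling. The 14.5% per-unit
# figure is from the text; the compounding model is an assumption.

per_unit_increase = 0.145  # 14.5% higher ER-visit risk per unit (from the text)

def relative_risk(units: float) -> float:
    """Relative risk of an asthma ER visit at a given precipitation rate."""
    return (1 + per_unit_increase) ** units

for units in (1, 2, 5):
    print(f"{units} unit(s): relative risk {relative_risk(units):.2f}")
```

Under that assumption, risk at five units is nearly double the baseline, which helps explain why intense storms produce such sharp spikes in ER visits.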

Air Pollution Primes Your Immune System

Pollen would be less of a problem if it landed in a clean respiratory tract. But most people, especially in cities, are breathing air laced with fine particulate pollution from vehicle exhaust, industrial emissions, and wildfire smoke. These particles don’t just irritate airways on their own. They act as amplifiers for allergic reactions.

Diesel exhaust particles are particularly well studied. Animal research has shown that when the lungs encounter an allergen like dust mites alongside diesel particles, the immune system mounts a much stronger allergic response than it would to the allergen alone. The exhaust particles promote the accumulation of specific immune cells in the lungs that “remember” the allergen. Weeks later, even a single re-exposure to that allergen triggers a heightened reaction, including airway constriction and inflammation. In practical terms, pollution exposure can turn a mild sensitivity into full-blown allergic asthma.

Indoor air quality plays a role too. Modern homes are more tightly sealed for energy efficiency, which traps volatile organic compounds from cleaning products, paints, new furniture, and building materials. Formaldehyde, one of the most common indoor air pollutants, causes eye, nose, and throat irritation and can worsen respiratory sensitivity over time. These chemicals don’t cause allergies directly, but chronic low-level irritation of the airways can lower the threshold for allergic reactions to kick in.

Modern Childhoods May Train the Immune System Poorly

The immune system needs early exposure to a wide range of microbes to learn which substances are genuinely dangerous and which are harmless. Without that training, it’s more likely to overreact to things like pollen, pet dander, or food proteins. This concept, originally called the hygiene hypothesis, has evolved into a broader understanding of how microbial diversity shapes immune development.

At birth, an infant’s immune system is skewed toward a type of response associated with allergic tendencies. Normally, exposure to bacteria, fungi, and other microorganisms in the first months of life triggers a shift toward a more balanced immune profile. Regulatory T cells, a type of immune cell that acts as a brake on overreactions, develop in response to these early microbial encounters. When that exposure is limited (by overly sanitized environments, fewer siblings, less outdoor play, reduced contact with animals, or frequent antibiotic use), the brake doesn’t fully engage.

The “old friends” theory takes this further, arguing that humans co-evolved with certain environmental microbes and gut bacteria that are essential for proper immune regulation. Urban living has disrupted that relationship. Children raised on farms or in rural areas with more microbial diversity consistently show lower rates of allergies and asthma than children raised in cities, even when other factors are controlled for.

Cities Create Their Own Allergy Hotspots

Urban environments combine nearly all of these risk factors in one place. Concrete and asphalt absorb and radiate heat, creating urban heat islands where temperatures run several degrees higher than surrounding rural areas. That extra warmth extends local pollen seasons and increases pollen concentrations. It also promotes the dispersal of air pollutants, which then interact with that pollen to amplify allergic responses.

At the same time, urban children grow up with less microbial diversity, more exposure to traffic pollution, and more time indoors breathing recirculated air. The convergence of these factors helps explain why allergy rates are highest in cities and why rapidly urbanizing countries are seeing sharp increases in allergic disease.

Who’s Most Affected

Allergies don’t hit every group equally. Among American children, Black children (21.3 percent) and white children (20.4 percent) are nearly twice as likely to have seasonal allergies as Asian children (11 percent), with Hispanic children falling in between at 15.3 percent. Boys are somewhat more likely to develop seasonal allergies than girls (20 percent versus 17.7 percent), though that gap tends to narrow or reverse after puberty. Globally, allergic rhinitis affects between 10 and 30 percent of the population depending on the region, with the highest rates in industrialized countries.

These disparities likely reflect a mix of genetic susceptibility, environmental exposures, and access to green space. Communities near highways or industrial zones face higher pollution loads, and lower-income housing is more likely to have mold, pest allergens, and poor ventilation.

Early Allergen Exposure Can Help

One area where the understanding of allergies has shifted dramatically is infant feeding. For decades, parents were told to delay introducing common allergens like peanuts, eggs, and shellfish. Guidelines from the FDA and major allergy organizations now recommend the opposite: introducing peanut-containing foods as early as 4 to 6 months, especially for infants at high risk due to severe eczema or egg allergy, significantly reduces the chance of developing a peanut allergy.

This reversal reflects the same principle behind the microbial diversity research: early exposure teaches the immune system tolerance. When introduction is delayed, the immune system often meets these proteins first through the skin (particularly skin damaged by eczema) rather than through the gut, a route that appears to promote sensitization rather than tolerance. For high-risk infants, a blood test or skin prick test can help determine the safest approach to introduction.

The broader takeaway applies beyond peanuts. The immune system is most flexible in the first year of life, and diverse early exposures, both dietary and microbial, appear to set the stage for lower allergy risk throughout childhood and beyond.