Yes, food allergies are increasing, and the rise is real, not just a matter of greater awareness. Around 33 million people in the United States now have at least one food allergy. Emergency department visits for food-related anaphylaxis in New York City roughly doubled over a decade, climbing from about 36 per million people in 2005–2010 to nearly 74 per million in 2011–2014. The trend is especially pronounced in children, though adults are far from immune.
How Common Food Allergies Are Now
Current CDC data puts food allergy rates at about 5.3% of U.S. children and nearly 7% of adults. Among children, boys are slightly more affected than girls (5.9% vs. 4.7%). Food Allergy Research & Education (FARE) estimates the numbers at roughly 5.6 million children and over 27 million adults, totaling about 33 million Americans.
These figures represent a meaningful jump from previous decades, and the increase shows up not just in survey data but in harder-to-fake measures like hospitalizations and epinephrine prescriptions. The pattern holds in other wealthy, urbanized countries too. Australia, the UK, and parts of Western Europe have documented similar rises, and food allergy rates tend to track closely with economic development and urbanization across regions.
Why Allergies Are Rising: The Gut Connection
The leading explanation centers on changes to the microbial environment children grow up in. In 1989, epidemiologist David Strachan proposed the “hygiene hypothesis,” suggesting that reduced childhood exposure to infections leads the immune system to overreact to harmless substances like food proteins. Since then, the theory has evolved. Researchers now believe the key issue isn’t fewer infections per se, but disrupted colonization of the gut by beneficial bacteria during infancy.
The gut’s microbial community plays a central role in training the immune system to tolerate food. Animal studies illustrate this vividly: animals raised in completely germ-free environments have underdeveloped immune tissue in their intestines, low antibody levels, and a much harder time learning to tolerate food proteins. When researchers gave newborn rats doses of bacterial components alongside food proteins, those animals developed significantly weaker allergic responses than untreated animals.
Several modern lifestyle factors are thought to disturb this early microbial training. Antibiotic use in infancy, smaller family sizes (fewer siblings sharing germs), cesarean births (which bypass the bacterial transfer that happens in the birth canal), and highly processed diets all alter the bacterial communities that colonize an infant’s gut. Research tracking birth mode and sibling effects points to a causal chain rather than a simple correlation: these factors change the gut microbiome, and those microbial changes in turn increase allergy risk.
Vitamin D and Geography
Where you live matters. Populations farther from the equator, where sunlight exposure is lower and vitamin D production drops, consistently show higher rates of food allergy. In the U.S., prescriptions for epinephrine auto-injectors are two to four times higher in New England states (8 to 12 per 1,000 people) than in southern states (about 3 per 1,000). Australia shows the same gradient, with more anaphylaxis hospitalizations in its southern regions than in tropical northern areas.
The connection appears to start early. Studies have found associations between vitamin D status at birth, which varies with the season a baby is born in, and the risk of developing food allergies later. Populations with lower vitamin D levels, including people living at northern latitudes and young infants, are more likely to develop food allergies. This doesn’t mean vitamin D supplements will prevent allergies, but it does help explain why food allergy rates are highest in countries and regions with less sun exposure.
Early Introduction Changes the Odds
One of the most important shifts in understanding food allergy prevention came from a landmark trial that tested whether feeding peanut products to infants could prevent peanut allergy. It worked. Children who ate peanut starting in infancy and continuing to age 5 had dramatically lower allergy rates than those who avoided peanut entirely. A follow-up tracked these children to age 12 and found the protection held: peanut allergy affected just 4.4% of the early-introduction group versus 15.4% of the avoidance group, even years after the structured feeding ended.
This finding flipped decades of medical advice. Pediatric guidelines previously told parents to delay introducing allergenic foods, which may have inadvertently contributed to rising allergy rates. Current guidelines now recommend introducing peanut and other common allergens in infancy, particularly for high-risk children.
Not All Reported Allergies Are Confirmed
Part of the story is also better awareness and, sometimes, overestimation. Self-reported food allergy rates are significantly higher than what clinical testing confirms. In one large national survey, the estimated prevalence of peanut, tree nut, fish, or shellfish allergy among adults dropped from 9.7% based on self-report to 3.5% when researchers applied stricter criteria more consistent with confirmed diagnoses.
Some people who believe they have a food allergy may actually have a food intolerance (like lactose intolerance) or sensitivity that doesn’t involve the immune system in the same way. This shows up in the data: about 34% of people who reported a milk allergy had drunk cow’s milk in the previous month, and roughly a quarter of those reporting shellfish allergy had recently eaten shellfish. These patterns suggest many self-reported allergies are either mild, misidentified, or outgrown.
That said, the increase in clinically severe reactions, measured through emergency visits and hospitalizations, confirms that the rise isn’t purely a reporting artifact. More people are genuinely developing immune-mediated food allergies than in previous generations.
Adults Are Developing Allergies Too
Food allergies aren’t just a childhood problem. About one-third of all adult food allergy diagnoses are adult-onset, meaning the person had no history of reacting to that food as a child. Shellfish is by far the most common trigger, implicated in anywhere from 28% to 59% of adult-onset cases across multiple studies. Tree nuts are the second most common, followed by fish. Shellfish allergy prevalence also increases with age, which is unusual for allergic conditions.
This adult-onset pattern suggests the immune system can lose tolerance to foods at any point in life, not just during the developmental windows of infancy and childhood. The mechanisms behind this are less well understood than childhood food allergy, but the burden is substantial and likely underestimated.
New Allergens, New Labels
The list of officially recognized major allergens is also expanding. In January 2023, sesame became the ninth major food allergen under U.S. federal law through the FASTER Act. Packaged foods containing sesame must now declare it on the label, joining milk, eggs, fish, shellfish, tree nuts, peanuts, wheat, and soybeans. The addition reflects growing recognition that sesame allergy is common enough and severe enough to warrant the same protections as the original eight.
The patterns of which foods cause allergies also vary significantly around the world. While peanut and tree nut allergies dominate in Western countries, other regions see higher rates of allergy to foods like buckwheat, chickpeas, or rice, shaped by local diets and genetics. The global picture is one of increasing prevalence nearly everywhere, but with different specific allergens rising in different populations.

