Why Do Some People Have Food Allergies?

Food allergies happen because the immune system misidentifies a harmless food protein as a threat and mounts a full defensive response against it. This isn’t a simple malfunction with a single cause. Genetics, gut bacteria, skin health, environmental cleanliness, vitamin D levels, and the timing of food exposure in infancy all play a role in whether someone’s immune system learns to tolerate a food or attack it. Roughly 8% of children and up to 10% of adults in the United States are affected, though when diagnosis is confirmed with objective testing rather than self-reported questionnaires, the global average drops closer to 3%.

What Happens Inside the Body

A food allergy develops in two stages. The first is sensitization, which can take days to weeks and produces no symptoms at all. During this phase, immune cells encounter a food protein and incorrectly flag it as dangerous. This triggers a chain of events: specialized immune cells called B cells begin producing a type of antibody called IgE, which is designed specifically for that one protein. These IgE antibodies then attach to the surface of mast cells, which are packed with inflammatory chemicals and stationed throughout your skin, gut lining, and airways. At this point, nothing has happened yet. Your body is simply primed.

The second stage is the reaction itself. When you eat the same food again, the protein links up with those waiting IgE antibodies on the mast cells, and the mast cells release their contents, primarily histamine and other inflammatory compounds, into surrounding tissue. This is what causes the familiar symptoms: hives, swelling, throat tightening, stomach cramps, vomiting, or in severe cases, anaphylaxis. The whole process, from bite to reaction, can take minutes.

Genetics Set the Stage

Food allergies run in families, but not as strongly as you might expect. Having one family member with any allergic disease (asthma, eczema, hay fever, or food allergy) raises a child's risk by about 1.4 times. Having two or more allergic family members raises it to 1.8 times. That's a meaningful bump, but it's far from a guarantee.
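To see what those multipliers mean in absolute terms, here is a minimal sketch. The ~8% baseline comes from the U.S. childhood prevalence cited earlier in the article; treating it as the baseline for any particular family is an assumption made purely for illustration.

```python
# Convert the article's relative-risk figures into approximate
# absolute risks, assuming a ~8% baseline childhood prevalence.

def absolute_risk(baseline: float, relative_risk: float) -> float:
    """Approximate absolute risk from a baseline prevalence and a relative risk."""
    return baseline * relative_risk

baseline = 0.08  # ~8% of U.S. children (assumed baseline for illustration)

one_member = absolute_risk(baseline, 1.4)    # one allergic family member
two_or_more = absolute_risk(baseline, 1.8)   # two or more allergic family members

print(f"No family history:   {baseline:.1%}")     # 8.0%
print(f"One allergic member: {one_member:.1%}")   # 11.2%
print(f"Two or more:         {two_or_more:.1%}")  # 14.4%
```

Even the higher figure leaves the large majority of such children allergy-free, which is the sense in which family history is "far from a guarantee."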

Researchers have identified 16 specific regions of DNA linked to food allergy at a statistically significant level, and 10 of those overlap with other allergic conditions like asthma and eczema. The genes involved fall into two broad categories. Some affect how the immune system recognizes and responds to proteins, particularly genes in the HLA region on chromosome 6, which controls how the body presents foreign molecules to immune cells. Others affect the physical barriers that keep food proteins out of places they shouldn't be. The best-studied of these is filaggrin (FLG), a gene encoding a protein that helps build and maintain the skin's outer layer. Mutations that disable filaggrin weaken the skin barrier, and this turns out to be a surprisingly important part of the story.

The Skin as a Gateway

One of the more counterintuitive discoveries in allergy research is that food allergies often begin not in the gut, but through the skin. This idea, known as the dual allergen exposure hypothesis, proposes that eating a food protein tends to teach the immune system to tolerate it, while encountering that same protein through damaged or inflamed skin tends to trigger sensitization.

The evidence is striking. In one study, peanut allergy in children was not linked to whether their mothers ate peanuts during pregnancy or breastfeeding. Instead, it was significantly associated with the use of skin creams containing peanut oil on infants with eczema. The inflamed, broken skin of eczema essentially let peanut protein slip past the barrier and reach immune cells in exactly the wrong context, triggering an allergic response rather than tolerance.

Research on filaggrin mutations reinforces this. A large 18-year study found that filaggrin mutations were strongly linked to food allergy, but only in people who also developed eczema. The gene doesn't appear to act directly on the immune system; instead, it weakens the skin barrier, which then allows sensitization to happen through the skin. This is why eczema in early infancy is one of the strongest predictors of developing food allergies later.

Too Clean for Our Own Good

Developed countries have seen a steady rise in allergic diseases since the 1980s, and that rise mirrors a decline in childhood infections over the same period. Developing countries show the opposite pattern: more infections, fewer allergies. This observation is the foundation of the hygiene hypothesis, first proposed by epidemiologist David Strachan, which suggests that reduced exposure to microbes in early life leaves the immune system poorly calibrated.

The basic mechanism involves a balance between two modes of immune response. Infections with viruses and bacteria generally activate one branch of the immune system (a Th1 response), which suppresses the branch responsible for allergic reactions (a Th2 response). When children encounter fewer infections, the allergy-promoting branch faces less opposition and is more likely to dominate. The result is an immune system that overreacts to harmless proteins like those in food.

This concept has expanded over the years. The “old friends” hypothesis points to organisms humans co-evolved with, particularly intestinal parasites, as important immune regulators. The microflora hypothesis focuses on bacteria in the gut. Both argue that modern life, through antibiotics, sanitized environments, processed diets, and cesarean births, disrupts the microbial exposures that historically kept the immune system in check.

Gut Bacteria and Immune Tolerance

The community of bacteria living in your intestines plays a direct role in training the immune system to tolerate food. Children with food allergies consistently show less bacterial diversity in their guts compared to children without allergies. A less complex microbial ecosystem is associated with a higher risk of sensitization.

Specific patterns appear repeatedly in research. Children with food allergies tend to have lower levels of beneficial bacteria, particularly Bifidobacterium, Lactobacillus, and Faecalibacterium, which help maintain the gut lining and produce compounds that promote immune tolerance. At the same time, they show higher levels of pro-inflammatory bacteria like Escherichia-Shigella and Ruminococcus gnavus. In infants, early increases in Escherichia-Shigella combined with drops in Bifidobacterium were associated with allergic symptoms that persisted beyond age two.

Children with peanut allergy specifically show reduced gut microbial diversity with increased levels of Bacteroides and Klebsiella and lower levels of protective species. Children with cow’s milk allergy placed on milk-free formulas show declining Bifidobacterium and rising levels of Clostridioides and Escherichia-Shigella. The pattern is consistent across different food allergies: less diversity, fewer protective bacteria, more inflammatory ones.

Studies in mice raised in completely sterile environments show just how fundamental this relationship is. Without any gut bacteria, these animals develop smaller immune structures in their intestines, produce fewer antibodies, and are far more vulnerable to immune dysfunction. The gut microbiome isn’t just correlated with allergy risk. It actively shapes how the immune system develops.

Vitamin D and Where You Live

Geography is a surprisingly strong predictor of food allergy risk. Populations living farthest from the equator, where sunlight is weaker and vitamin D production is lower, consistently show higher rates of food allergy. In the United States, prescriptions for epinephrine auto-injectors are two to four times higher in New England (8 to 12 per 1,000 people) than in southern states (about 3 per 1,000). Australian data shows the same gradient: more anaphylaxis hospitalizations in the southern state of Tasmania than in tropical North Queensland.
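The "two to four times" figure follows directly from the prescription rates quoted above. A quick check, using only the article's own numbers:

```python
# Rate ratios implied by the article's epinephrine auto-injector
# prescription figures: 8-12 per 1,000 people in New England versus
# about 3 per 1,000 in southern states.

new_england_low, new_england_high = 8, 12  # prescriptions per 1,000 people
southern = 3                               # prescriptions per 1,000 people

low_ratio = new_england_low / southern
high_ratio = new_england_high / southern

print(f"Rate ratio range: {low_ratio:.1f}x to {high_ratio:.1f}x")  # 2.7x to 4.0x
```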

Vitamin D appears to be the link. In one study, infants with low vitamin D levels (at or below 50 nanomoles per liter) were 11 times more likely to develop peanut allergy, nearly 4 times more likely to develop egg allergy, and more than 10 times more likely to have multiple food allergies compared to infants with adequate levels. Birth season also matters, since babies born in months with less sunlight start life with lower vitamin D stores, and this has been associated with higher allergy risk later on. Vitamin D influences immune function by helping regulate inflammatory signaling, and lower levels appear to tip the balance toward the kind of immune responses that produce allergies.

The Nine Most Common Triggers

While over 160 foods can cause allergic reactions, nine are responsible for the vast majority. In the United States, these are milk, eggs, peanuts, tree nuts (such as almonds, walnuts, and pecans), wheat, soybeans, fish, shellfish (such as shrimp, crab, and lobster), and sesame. The first eight account for about 90% of all food allergic reactions. Sesame was added to the list in 2021 under the FASTER Act and must now be declared on packaged food labels.

Which allergens are most common varies by age. Milk and egg allergies are most prevalent in young children and are often outgrown. Peanut, tree nut, fish, and shellfish allergies tend to persist into adulthood. Sesame allergy is increasingly recognized and is more common in populations where sesame is a dietary staple.

Why Early Feeding Matters

For decades, parents were told to delay introducing allergenic foods to infants. That advice has been completely reversed. Current guidelines from the American Academy of Pediatrics recommend introducing peanut, egg, and other major food allergens at 4 to 6 months of age, regardless of whether the child has a family history of allergies or has been tested for food sensitivity.

This shift was driven largely by a landmark trial called LEAP (Learning Early About Peanut allergy), which showed that feeding peanut products to high-risk infants (those with severe eczema or egg allergy) dramatically reduced their likelihood of developing peanut allergy compared to avoiding peanut entirely. The logic aligns with the dual allergen exposure hypothesis: if the gut is the pathway to tolerance and the skin is the pathway to allergy, then getting food proteins into the gut early, before sensitization can occur through the skin, gives the immune system the right first impression. Waiting too long leaves a window where skin exposure, especially through eczema, can prime the immune system in the wrong direction.