Fortification is the process of adding vitamins or minerals to everyday foods during manufacturing to prevent nutritional deficiencies across a population. Salt, flour, cooking oil, sugar, and milk are among the most commonly fortified foods worldwide. If you’ve ever noticed “enriched” or “fortified” on a food label, you’ve already encountered it. The practice is one of the most cost-effective public health strategies in existence, often costing less than two cents per person per year to operate at a national scale.
How Fortification Works
At its simplest, fortification means mixing a precise amount of a micronutrient into a food product at the factory level before it reaches store shelves. The nutrients are added in forms the body can absorb readily. For example, the synthetic form of folate (a B vitamin) added to flour is absorbed more efficiently than the folate found naturally in leafy greens. Estimates suggest natural food folate may be only about 50% as bioavailable as its synthetic counterpart, though some researchers believe the gap is smaller than previously thought.
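To make that comparison concrete, here is a minimal sketch that puts intakes of both forms on a common "absorbed-equivalent" scale, assuming the roughly 50% relative bioavailability figure above. The intake amounts and the scaling factors are illustrative assumptions, not measured values.

```python
# Toy comparison of folate intakes on a common "absorbed-equivalent" scale.
# Assumes the ~50% relative bioavailability of natural food folate mentioned
# above; the daily intake amounts are purely illustrative.

RELATIVE_BIOAVAILABILITY = {
    "natural_food_folate": 0.5,   # assumed: ~50% as bioavailable as folic acid
    "synthetic_folic_acid": 1.0,  # reference form added to fortified flour
}

def absorbed_equivalent(intake_ug: float, form: str) -> float:
    """Scale a folate intake (micrograms) by its assumed relative bioavailability."""
    return intake_ug * RELATIVE_BIOAVAILABILITY[form]

# Hypothetical day: 200 µg from leafy greens, 140 µg from fortified bread and pasta.
natural = absorbed_equivalent(200, "natural_food_folate")     # -> 100 µg equivalent
fortified = absorbed_equivalent(140, "synthetic_folic_acid")  # -> 140 µg equivalent

print(f"Absorbed-equivalent from natural sources: {natural:.0f} µg")
print(f"Absorbed-equivalent from fortified flour: {fortified:.0f} µg")
```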
The foods chosen for fortification are ones that nearly everyone in a given population already eats regularly, in fairly consistent amounts. Salt is a classic vehicle because consumption is widespread and relatively uniform. Flour, cooking oil, and sugar serve the same purpose in different regions depending on local diets. The goal is to reach the greatest number of people without requiring them to change their eating habits.
Types of Fortification Programs
Not all fortification programs work the same way. They fall into a few broad categories based on who they’re designed to reach and who initiates them.
- Mass (large-scale) fortification targets an entire population by adding nutrients to staple foods that most people consume daily. These programs are typically mandated by governments, and production is concentrated among major food manufacturers, generally facilities that process more than 50 metric tons of food per day.
- Targeted fortification focuses on specific groups with higher nutritional needs, such as children, pregnant women, or beneficiaries of social protection programs. Fortified school meals or specialized supplementary foods fall into this category.
- Voluntary (market-driven) fortification happens when food companies choose to add nutrients to their products. Breakfast cereals and plant-based milks are common examples. Governments still set regulatory limits on what can be added and how much, but the decision to fortify is made by the manufacturer.
- Biofortification takes a fundamentally different approach. Instead of adding nutrients during processing, scientists breed crops to be naturally richer in specific vitamins or minerals. Orange-fleshed sweet potatoes, for instance, contain far more vitamin A than conventional white varieties. Iron-rich beans and zinc-enriched wheat are other examples. This method is especially valuable for rural communities that rely on homegrown food and have limited access to industrially processed products.
The Nutrients Most Often Added
The specific vitamins and minerals added to foods depend on what a population is lacking. Globally, the most common fortificants include iron (added to flour and cereals), iodine (added to salt), vitamin A (added to cooking oil, sugar, and flour), folic acid (added to grain products), vitamin D (added to milk and oils), and zinc. The WHO’s most recent fortification guideline, published in 2025, focuses specifically on adding vitamins A and D to edible oils and fats as a public health measure.
In the United States, the FDA’s fortification policy limits additions to essential nutrients that have an established recommended daily intake. Manufacturers can’t simply add any compound they like. Before a nutrient qualifies, regulators evaluate whether a genuine dietary gap exists in the population. If there’s no documented deficiency, fortification isn’t considered scientifically justified.
Public Health Successes
The most dramatic early example of fortification’s power came with iodized salt. In the early 1900s, goiter (a swelling of the thyroid gland caused by iodine deficiency) was widespread across the American Midwest and Great Lakes region, known as the “goiter belt.” On May 1, 1924, the first boxes of iodized salt went on sale in Michigan. By 1935, goiter rates in the state had dropped by 74% to 90%, with the steepest declines among children who had used iodized salt consistently for at least six months. Similar results appeared across other affected regions.
Folic acid fortification of flour tells a similar story. After the United States mandated adding folic acid to enriched grain products in the late 1990s, the prevalence of neural tube defects (serious birth defects of the brain and spine) fell by about 28% nationally. Programs that tracked pregnancies more comprehensively found reductions closer to 35%. These declines happened without asking women to take supplements or change their diets. The nutrient simply showed up in the bread, pasta, and cereal they were already eating.
Biofortified crops are building their own evidence base. Studies have shown that vitamin A-biofortified orange-fleshed sweet potato raises circulating vitamin A levels in people who eat it. Provitamin A maize improved vitamin A stores in young children in Zambia and even improved visual function in children who were deficient. Iron-biofortified beans and pearl millet improved hemoglobin and iron stores in studies conducted in Rwanda and India.
Risks of Getting It Wrong
Fortification is not without risks. Adding too much of a nutrient, or adding it to too many different products, can push people’s intake above safe thresholds. The FDA warns that random or uncoordinated fortification “could result in over- or underfortification in consumer diets and create nutrient imbalances in the food supply.”
To prevent this, regulators use upper intake levels set by nutrition science bodies. These represent the maximum daily amount of a nutrient considered safe for the general population. Before mandating fortification, the FDA models current dietary intakes to find a level that’s effective for the people who need it (for folic acid, that’s women of childbearing age) while remaining safe for everyone else, including children and older adults who may consume fortified foods from multiple sources throughout the day.
This balancing act is why fortification decisions are made at the government level rather than left entirely to market forces. A single fortified cereal poses little risk. But when flour, milk, juice, snack bars, and supplements all contain the same added nutrient, cumulative intake can climb quickly.
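The sketch below illustrates the kind of cumulative-intake check this implies: sum the added nutrient a person gets across every fortified source in a day and compare it with an upper intake level. The foods, serving counts, per-serving amounts, and the upper limit are all hypothetical placeholders, not official values.

```python
# A minimal sketch of a cumulative-intake check across multiple fortified foods.
# All foods, serving counts, and per-serving amounts are hypothetical, and the
# upper intake level (UL) is a placeholder, not an official value.

UPPER_INTAKE_LEVEL_UG = 1000  # placeholder UL for the nutrient, in µg/day

# Hypothetical daily pattern: (food, servings per day, added nutrient per serving in µg)
daily_pattern = [
    ("fortified flour products", 4, 140),
    ("fortified breakfast cereal", 1, 400),
    ("fortified juice", 1, 100),
    ("dietary supplement", 1, 400),
]

total = sum(servings * per_serving for _, servings, per_serving in daily_pattern)

print(f"Estimated added intake: {total} µg/day")
if total > UPPER_INTAKE_LEVEL_UG:
    print("Exceeds the placeholder upper intake level -> flag for review")
else:
    print("Within the placeholder upper intake level")
```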
Cost and Global Reach
One reason fortification is so widely adopted is its remarkably low cost. Modeling studies in Burkina Faso estimated the cost of large-scale fortification programs at roughly one to one and a half cents per person per year, or about two to three cents per person actually reached by the fortified food. For comparison, that’s a fraction of what individual supplementation programs cost, and it requires no behavior change from the people it benefits.
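The two cost figures differ only by coverage: dividing a program’s annual cost by the whole population gives the lower number, while dividing by just the people who actually eat the fortified food gives the higher one. A quick sketch with hypothetical round numbers (not the actual Burkina Faso figures) shows the relationship.

```python
# Back-of-the-envelope arithmetic behind "cost per person" vs "cost per person reached".
# Population, program cost, and coverage below are hypothetical round numbers,
# not figures from the Burkina Faso modeling studies cited above.

population = 20_000_000        # hypothetical national population
annual_program_cost = 250_000  # hypothetical annual program cost, in dollars
coverage = 0.5                 # hypothetical share of people eating the fortified food

cost_per_person = annual_program_cost / population
cost_per_person_reached = annual_program_cost / (population * coverage)

print(f"Cost per person per year:         {cost_per_person * 100:.2f} cents")
print(f"Cost per person reached per year: {cost_per_person_reached * 100:.2f} cents")
```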
The WHO frames fortification as one piece of a broader food-based strategy, not a standalone solution. It works best alongside efforts to improve diet diversity and address the root causes of malnutrition. But for filling specific nutrient gaps at a population level, particularly in places where diets are limited or monotonous, few interventions deliver as much impact per dollar spent.

