The Recommended Dietary Allowance (RDA) and Adequate Intake (AI) are both daily nutrient targets designed for individuals, but they differ in how they’re calculated and how confident scientists are in the numbers. The RDA is built on precise data about human nutrient requirements, while the AI is an informed estimate used when that precise data doesn’t exist. Both fall under the broader Dietary Reference Intakes (DRI) framework maintained by the National Academies of Sciences, Engineering, and Medicine.
What the RDA Tells You
The RDA is the average daily intake of a nutrient sufficient to meet the needs of 97 to 98 percent of healthy people in a given age and sex group. That high coverage percentage is intentional: it’s set well above what the average person needs so that nearly everyone in the population is covered, including people whose requirements run higher than typical.
To arrive at the RDA, researchers first establish a value called the Estimated Average Requirement (EAR), which is the intake level that meets the needs of exactly half the population. They select a specific marker of adequacy for each nutrient, review the literature, and pin down that midpoint. The RDA is then calculated by adding a statistical buffer: two standard deviations above the EAR (RDA = EAR + 2 SD). When the data on individual variation isn’t strong enough to calculate a true standard deviation, a default coefficient of variation of 10 percent is assumed, making the formula RDA = 1.2 × EAR.
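The arithmetic above can be sketched in a few lines. This is purely illustrative (the function name and the example EAR values are made up for demonstration); it just restates the two formulas, with the standard 10 percent coefficient of variation as the fallback.

```python
def rda_from_ear(ear, sd=None, cv=0.10):
    """Illustrative RDA calculation from an EAR.

    If a measured standard deviation (sd) of requirements is available,
    RDA = EAR + 2 * SD. Otherwise a default coefficient of variation
    (10 percent of the EAR) is assumed, which reduces to RDA = 1.2 * EAR.
    """
    if sd is not None:
        return ear + 2 * sd
    return ear + 2 * (cv * ear)  # equals 1.2 * ear when cv = 0.10

# Hypothetical EAR of 100 units:
print(rda_from_ear(100))         # default CV: 120.0
print(rda_from_ear(100, sd=15))  # measured SD: 130
```

Note that the two branches agree when SD happens to equal 10 percent of the EAR; the CV assumption is only a stand-in for variation data that doesn't exist.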
This math matters because it means the RDA isn’t a rough guess. It’s grounded in measured biological requirements and pushed upward with a known safety margin. If your usual intake of a nutrient meets or exceeds the RDA, you can be quite confident your intake is adequate.
What the AI Tells You
The AI exists for nutrients where scientists can’t establish an EAR. Without that foundational midpoint, the two-standard-deviation formula can’t be applied, so no RDA can be calculated. Instead, experts look at the intake levels observed in healthy populations and set the AI based on what appears to sustain good nutritional status in practice.
The result is a number that serves the same practical purpose as the RDA (a daily target for individuals) but rests on weaker statistical footing. The AI is essentially an educated estimate rather than a value derived from controlled requirement studies. Healthy people whose average intake meets or exceeds the AI are still assumed to be at low risk of inadequacy, but the degree of certainty behind that assumption is lower than it is for the RDA.
Key Differences at a Glance
- Evidence base: The RDA requires dose-response studies that quantify how much of a nutrient people actually need. The AI relies on observational data about what healthy people typically consume.
- Statistical precision: The RDA is calculated to cover 97 to 98 percent of the population using a specific formula. The AI has no equivalent statistical guarantee.
- Hierarchy: An AI is only set when an RDA cannot be. If future research produces enough data to establish an EAR for a nutrient, its AI can be replaced with a formal RDA.
- Practical use: Both are used the same way as individual intake targets. The difference lies in how much confidence you can place in the number, not in how you apply it.
Which Nutrients Have an RDA and Which Have an AI
Most vitamins and minerals you’ll see on a nutrition label carry an RDA. These include vitamin A, the B vitamins (B1, B2, B3, B6, B12, and folate), vitamins D and E, calcium, iron, zinc, magnesium, selenium, phosphorus, copper, iodine, and molybdenum. For all of these, enough human requirement data exists to support the full EAR-to-RDA calculation.
A smaller group of nutrients has only an AI. Common examples include vitamin K, pantothenic acid (B5), biotin (B7), choline, potassium, chromium, manganese, and chloride. In each case, researchers haven’t been able to define the population’s requirement distribution precisely enough to set an EAR. That doesn’t mean these nutrients are less important. It simply means the science of measuring exact human needs for them hasn’t caught up yet.
Why the Distinction Matters in Practice
If you’re comparing your diet to nutrient recommendations, the RDA and AI function identically as goals: hitting the number means you’re likely getting enough. The real difference shows up in population-level assessments. Dietitians and public health researchers can use the EAR (the foundation of the RDA) to estimate how many people in a group fall short of their needs. That kind of analysis isn’t possible for AI nutrients because no EAR exists to serve as the statistical cutoff.
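That population-level use of the EAR is often called the cut-point method: the share of people whose usual intake falls below the EAR estimates the prevalence of inadequacy in the group. A minimal sketch, with entirely hypothetical intake values and EAR:

```python
# Hypothetical usual daily intakes (same units as the EAR) for a small group:
usual_intakes = [80, 95, 110, 130, 60, 150, 105, 90, 70, 125]
ear = 100  # hypothetical EAR for this nutrient

# EAR cut-point method: the fraction of usual intakes below the EAR
# estimates the prevalence of inadequacy in the population.
below = sum(1 for intake in usual_intakes if intake < ear)
prevalence = below / len(usual_intakes)
print(f"Estimated prevalence of inadequacy: {prevalence:.0%}")  # 50%
```

For an AI nutrient there is no EAR to use as the cut point, which is exactly why this kind of group-level estimate can't be made.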
For an individual tracking their own intake, the takeaway is straightforward. When you see a nutrient value on a reference chart, check whether it’s marked as an RDA or an AI. If it’s an RDA, the number carries strong statistical backing. If it’s an AI, the number is still the best available target, but there’s more uncertainty baked in. Either way, consistently falling below the listed value raises the likelihood that your intake is inadequate.
The Framework Is Still Evolving
The DRI system is not static. As of early 2026, the National Academies have recommended several updates to the process, including broadening the target population from “apparently healthy” individuals to the “general” population. These changes could shift how future RDAs and AIs are determined, and some nutrients may move from an AI to a formal RDA as enough new evidence accumulates.