Intelligence is shaped by both genetics and environment, and the two are so deeply intertwined that asking which one “wins” misses how they actually work together. In adults, genetic factors account for roughly 50 to 85% of the variation in IQ scores, depending on how the analysis handles certain statistical complexities. But that still leaves a substantial portion determined by life experiences, and the balance shifts dramatically depending on a person’s age, socioeconomic background, and even their IQ level itself.
How Much Comes From Genetics
Traditional estimates place the heritability of intelligence in adulthood at 75 to 85%. These numbers come largely from twin studies comparing identical twins (who share all their DNA) with fraternal twins (who share about half). When researchers account for additional factors like assortative mating (the tendency of similar people to have children together), the picture shifts: one careful reanalysis estimated additive genetic effects at 44%, with another 27% from more complex genetic interactions and 18% from individual-specific environmental influences.
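To make the twin-study logic concrete, here is a minimal sketch of Falconer's formula, which estimates heritability as twice the gap between identical-twin and fraternal-twin correlations. The correlations used are illustrative placeholders, not figures from any specific study:

```python
# Falconer's formula: a back-of-the-envelope heritability estimate from twin data.
# The correlations below are illustrative placeholders, not from a specific study.

def falconer_estimates(r_mz: float, r_dz: float) -> dict:
    """Split trait variance into genetic and environmental components.

    r_mz: correlation between identical (monozygotic) twins
    r_dz: correlation between fraternal (dizygotic) twins
    """
    h2 = 2 * (r_mz - r_dz)  # heritability: MZ twins share ~2x the genes DZ twins do
    c2 = r_mz - h2          # shared (family) environment
    e2 = 1 - r_mz           # non-shared environment plus measurement error
    return {"heritability": round(h2, 2),
            "shared_env": round(c2, 2),
            "nonshared_env": round(e2, 2)}

print(falconer_estimates(r_mz=0.85, r_dz=0.45))
# {'heritability': 0.8, 'shared_env': 0.05, 'nonshared_env': 0.15}
```

Because identical twins share roughly twice the segregating genetic material that fraternal twins do, doubling the correlation gap isolates the genetic contribution; whatever the identical-twin correlation leaves unexplained is attributed to non-shared environment and measurement error.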
The strongest evidence comes from identical twins raised in completely separate homes. On average, these twins still have IQ scores that correlate at about 0.75, meaning their scores track closely despite growing up in different families. But when researchers looked more carefully at pairs with different amounts of schooling, the picture changed significantly. Twins with similar educations differed by only about 6 IQ points and correlated at 0.87. Twins with large differences in schooling diverged by over 15 IQ points, and their correlation dropped to 0.56. Even with identical DNA, environment clearly bends the outcome.
Genetics Matter More as You Age
One of the more counterintuitive findings in intelligence research is that genes become more influential over time, not less. In early childhood, the family environment plays a large role in IQ scores. Shared environmental influences, meaning the things siblings in the same household experience together, can account for 20% or more of the variation in young children’s scores. But after adolescence, as people leave home and build their own lives, shared environmental effects essentially disappear from the data.
This happens because as people gain more freedom to choose their own environments, they tend to gravitate toward settings that match their genetic predispositions. A child with a genetic inclination toward curiosity might seek out books, challenging courses, and intellectually stimulating friends once they’re old enough to make those choices. The genes don’t change, but they increasingly drive which environments a person ends up in.
There’s an interesting twist, though. Research published in Psychological Science found that higher-IQ individuals maintain stronger environmental influence on their intelligence deeper into adolescence, as if their brains stay in a more flexible, developmentally sensitive state for longer. Lower-IQ individuals transition earlier to the adult-like pattern where heritability dominates. This suggests that intelligence itself may extend the window during which environment can shape the brain.
Socioeconomic Status Widens the Gap Over Time
Growing up in poverty doesn’t just set children back at the starting line. It causes them to fall further behind as they age. A large study tracking over 14,000 children from age 2 to 16 found that children from low socioeconomic backgrounds scored about 6 IQ points lower than their wealthier peers at age 2. By age 16, that gap had nearly tripled. Higher socioeconomic status was linked to both a higher starting point and faster gains over time.
This pattern reflects the accumulation of environmental advantages and disadvantages. Wealthier families typically provide better nutrition, more cognitive stimulation, less chronic stress, higher-quality schools, and greater access to books and enrichment activities. These aren’t one-time boosts. They compound year after year, steadily pulling scores apart even when children start out relatively close.
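The compounding described above can be sketched with simple arithmetic. The rate of roughly 0.8 points per year below is back-derived from the study's endpoints (a gap of about 6 points at age 2 that roughly triples by age 16); it illustrates the trend rather than reproducing a parameter the study reported:

```python
# Illustrative linear model of the SES-related IQ gap widening with age.
# The starting gap (~6 points at age 2) matches the study's figure; the
# annual rate is back-derived so the gap roughly triples by age 16.

def ses_iq_gap(age: float, gap_at_two: float = 6.0, points_per_year: float = 0.8) -> float:
    """Estimated IQ gap between low- and high-SES children at a given age."""
    return gap_at_two + points_per_year * (age - 2)

print(ses_iq_gap(2))            # 6.0  -- the starting gap at age 2
print(round(ses_iq_gap(16), 1)) # 17.2 -- roughly triple, fourteen years later
```

A linear rate is the simplest assumption consistent with the study's two endpoints; the underlying mechanisms (nutrition, stimulation, schooling quality) likely accumulate in messier ways.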
Nutrition Can Shift IQ by Double Digits
Some environmental effects are dramatic enough to rival genetics in scale. Iodine deficiency during pregnancy and early childhood is one of the clearest examples. A meta-analysis of studies conducted in China found that children exposed to severe iodine deficiency scored an average of 12.45 IQ points lower than children in iodine-sufficient areas. When mothers received iodine supplements before and during pregnancy, their children recovered about 8.7 of those points.
Iron deficiency produces similar, if somewhat smaller, cognitive effects. These findings matter because they show that in environments where basic nutritional needs aren’t met, the genetic potential for intelligence simply can’t express itself fully. The DNA for building a capable brain might be there, but without the raw materials, the construction falls short.
Education Raises IQ Consistently
Each additional year of formal schooling raises IQ by an estimated 1 to 5 points, with an overall average around 3.4 points per year. This finding comes from a meta-analysis spanning over 600,000 participants across 42 datasets, using multiple research designs to isolate the effect of education from the simple fact that higher-IQ people tend to stay in school longer.
The researchers used three approaches to tease this apart. Studies that tracked the same people over time and controlled for their earlier IQ found gains of about 1.2 points per year. Studies exploiting policy changes that forced some people to stay in school longer found gains of about 2 points. And studies using age cutoffs for school entry, where children born days apart start school a full year apart, found gains closer to 5 points. All three designs pointed in the same direction: schooling genuinely builds cognitive ability rather than just selecting for it.
Brain Training Doesn’t Work the Way You’d Hope
If environment matters so much, can you boost your intelligence through targeted cognitive training? The short answer is: not in any meaningful, lasting way. A consensus statement from scientists at Stanford and elsewhere concluded that while brain training games produce measurable improvement on the specific tasks you practice, there is little evidence these gains transfer to broader real-world cognitive abilities.
Some studies show small improvements on related lab tasks, but these tend to fade once training stops. The commercial brain training industry often markets these narrow, temporary gains as general and lasting improvements, which the scientific community has pushed back against. Cognitively challenging activities are good for you in the same way physical exercise is, but expecting a brain game to permanently raise your IQ is not supported by current evidence.
The Flynn Effect and Its Reversal
One of the most powerful pieces of evidence for environmental influence is the Flynn Effect: the observation that average IQ scores rose by about 2 to 3 points per decade throughout the 20th century. Compounded across the century, that works out to a rise of 20 to 30 points, meaning a person who scores average today would have scored well above average by the standards of a few generations ago. This increase was far too fast to be genetic, since human DNA doesn't change meaningfully in just a few generations. Better nutrition, expanded access to education, smaller family sizes, and greater exposure to abstract thinking through technology all likely contributed.
Recently, though, this trend has stalled or reversed in several developed nations, and some projections suggest Western Europe could see a drop of about 3 IQ points. The causes aren't fully understood, but researchers have pointed to changes in education systems, screen time displacing certain types of cognitive activity, and the possibility that the "low-hanging fruit" of environmental improvements (like eliminating widespread malnutrition and ensuring basic schooling) has already been picked in wealthy countries.
Why the Question Itself Is Misleading
Asking whether intelligence is “nature or nurture” implies the two operate independently, like separate ingredients you could measure in a recipe. In reality, genes and environment interact continuously. Your genes influence which environments you seek out, and your environment determines which genetic potentials get expressed. A child with strong genetic predispositions for intelligence who grows up with severe iodine deficiency and no schooling will score very differently than the same child raised with adequate nutrition and 12 years of education.
The best current framework treats heritability not as a fixed property of intelligence but as a description of a specific population in a specific environment. In a society where everyone receives excellent nutrition and schooling, genetic differences would explain most of the remaining variation in IQ. In a society with vast inequalities in nutrition and education, environmental factors would explain more. The heritability number doesn’t tell you how much genes “matter” in an absolute sense. It tells you how much of the variation in a particular group, at a particular time, traces back to genetic differences versus environmental ones.
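The framework above corresponds to the standard variance-ratio definition of heritability, h² = V_G / (V_G + V_E). A minimal sketch, using illustrative variance figures rather than empirical estimates, shows how the same genetic variance yields different heritability depending on how much environments vary:

```python
# Heritability as a population statistic: h^2 = V_G / (V_G + V_E).
# Genetic variance is held fixed; only environmental variance changes.
# All variance figures are illustrative, not empirical estimates.

def heritability(var_genetic: float, var_environment: float) -> float:
    """Fraction of trait variance attributable to genetic differences."""
    return var_genetic / (var_genetic + var_environment)

V_G = 120.0  # genetic variance (illustrative, in squared IQ points)

# Equalized society: everyone gets similar nutrition and schooling.
print(heritability(V_G, var_environment=30.0))   # 0.8

# Unequal society: environments vary widely between people.
print(heritability(V_G, var_environment=120.0))  # 0.5
```

Nothing about individual genes changes between the two calls; only the spread of environments does, which is exactly why a heritability figure is tied to a particular population at a particular time.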
So the honest answer is that intelligence is always both. Genes set a wide range of possible outcomes, and the life you live determines where in that range you land.

