Biocultural evolution is the process by which human biology and culture shape each other over time. Rather than treating genetics and culture as separate forces, biocultural evolution recognizes that they form a feedback loop: cultural practices change the environment humans live in, which shifts the biological pressures on their bodies, which in turn influences what cultural behaviors succeed. This two-way interaction has been a driving force in human development for hundreds of thousands of years and continues today.
How the Feedback Loop Works
The core idea is straightforward. Humans don’t just adapt to their environment the way other animals do. They actively reshape their environment through culture, including tools, food practices, clothing, shelter, and social organization. Those changes then create new biological pressures. Over generations, the human body evolves in response to the world that culture built, and that biological evolution feeds back into what cultural practices are possible or advantageous.
This concept is formalized in what researchers call Dual Inheritance Theory, which treats genetic inheritance and cultural transmission as two coupled systems. One acts on populations of human beings through natural selection, shaping learning mechanisms and physical traits. The other operates on cultural practices themselves, which spread, change, and are refined across a huge diversity of environments. The two systems constantly influence each other, which is why human evolution can’t be fully explained by genes alone or culture alone.
A related framework, niche construction theory, highlights how humans modify selection pressures in their own environments. Animal burrows buffer temperature and humidity. Human habitation and clothing do the same, but on a far greater scale. When a cultural practice like farming or building cities persists across enough generations, it becomes a stable selective environment that reshapes human biology. Offspring inherit not only genes from their parents but also a modified ecological world that determines which genes are advantageous.
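One way to make this feedback loop concrete is a deliberately minimal toy model. Everything in it is an illustrative assumption rather than an empirical claim: a cultural practice (say, dairying) spreads faster where a matching gene is common, and the gene is favored by selection only where the practice is widespread.

```python
# Toy gene-culture feedback loop (all numbers are illustrative assumptions).
# `culture` = share of the population following a practice (e.g. dairying);
# `gene`    = frequency of an allele that pays off only where the practice
#             is common.

culture, gene = 0.05, 0.01
for generation in range(300):
    # Cultural transmission: the practice spreads logistically, and
    # faster when more people carry the gene that makes it pay off.
    culture += 0.1 * culture * (1 - culture) * (0.5 + gene)
    # Natural selection: the allele's advantage scales with how
    # widespread the practice is (haploid selection recursion).
    s = 0.05 * culture
    gene = gene * (1 + s) / (1 + s * gene)

print(round(culture, 3), round(gene, 3))
```

Run for 300 generations, both variables climb toward fixation together. Start the gene at zero and the practice spreads more slowly; hold the practice at zero and the gene never spreads at all. That mutual dependence is the coupling the theory describes.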
Sickle Cell and Malaria: The Classic Example
The earliest and most famous demonstration of biocultural evolution came from anthropologist Frank Livingstone in 1958, linking population growth, farming practices, and the sickle cell gene in West Africa. The chain of events starts with a cultural decision: slash-and-burn agriculture. Clearing forests for farming created pools of standing water, which became ideal breeding grounds for mosquitoes carrying malaria parasites.
As malaria became widespread, it created intense biological pressure. People who carried one copy of the sickle cell gene had red blood cells that resisted heavy malaria infection. They survived at higher rates and had more children than people without the gene. People who inherited two copies, however, developed sickle cell anemia, which is why the gene rose to an intermediate frequency rather than spreading to everyone. Over generations, the frequency of the sickle cell gene rose dramatically in malaria-prone farming regions. A cultural practice (agriculture) changed the environment (mosquito habitat), which changed the biology of the population (higher frequency of a protective gene). The agricultural adaptation was, as Livingstone argued, the ultimate determinant of whether malaria parasites ended up inside human red blood cells.
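The trade-off at work here, protection in carriers of one copy versus sickle cell anemia in carriers of two, is a textbook case of balancing selection, and its logic can be sketched in a few lines. The fitness values below are illustrative assumptions chosen to show the mechanism, not measured survival rates.

```python
# Balancing selection sketch for the sickle cell allele S.
# Fitness values are assumed for illustration, not measured:
w_AA = 0.80  # two normal copies: vulnerable to severe malaria
w_AS = 1.00  # one sickle cell copy: protected carrier, fares best
w_SS = 0.20  # two sickle cell copies: sickle cell anemia

q = 0.01  # starting frequency of the S allele
for generation in range(200):
    p = 1.0 - q
    # Mean fitness under random mating (Hardy-Weinberg genotype frequencies)
    w_bar = p * p * w_AA + 2 * p * q * w_AS + q * q * w_SS
    # Standard one-generation selection recursion for the allele frequency
    q = (p * q * w_AS + q * q * w_SS) / w_bar

# The simulated frequency settles at the analytic balance point
# q* = s_AA / (s_AA + s_SS), where s = 1 - fitness.
s_AA, s_SS = 1 - w_AA, 1 - w_SS
print(round(q, 3), s_AA / (s_AA + s_SS))
```

With these numbers the allele rises from 1 percent and then stalls at 20 percent: malaria pushes it up, anemia pushes it down, and the population settles in between, which is exactly the kind of intermediate sickle cell frequency seen in malaria-prone regions.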
Milk, Starch, and the Genes Diet Built
Lactose tolerance is one of the clearest examples of culture rewriting human genetics. Most mammals lose the ability to digest milk sugar after weaning. But in populations that domesticated cattle and practiced dairying, the ability to digest lactose into adulthood became a massive survival advantage. Archaeological evidence from pottery residues shows milk use beginning roughly 8,500 years ago in western Turkey, spreading to Romania and Hungary around 7,900 years ago, and reaching Britain by about 6,100 years ago.
The genetic variant that allows adults to digest lactose sits not in the lactase gene itself but in a regulatory region inside the neighboring MCM6 gene, where it acts as an enhancer controlling lactase expression. The selection pressure behind this variant was enormous. Estimated selection strengths range from 1.4 to 19 percent, among the highest measured for any human gene in the last 30,000 years. To put that in perspective, most genetic advantages confer a fraction of a percent of improved survival. Dairying cultures essentially turbocharged evolution at this gene. The cultural practice came first, then the biology followed.
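To get a feel for what selection strengths in that range mean, here is a toy calculation with assumed fitness values and random mating. It treats the persistence allele as dominant (which it is: one copy is enough to keep digesting milk) and counts how many generations an advantageous allele needs to go from rare to common.

```python
# How fast does a dominant advantageous allele spread? Illustrative
# sketch; the 5% advantage sits inside the 1.4-19% range quoted above,
# and 0.5% stands in for a typical weak advantage.

def generations_to_reach(s, start=0.01, target=0.5):
    """Generations for a dominant allele with advantage s to rise
    from `start` to `target` frequency under random mating."""
    p, gens = start, 0
    while p < target:
        q = 1.0 - p
        w_bar = (p * p + 2 * p * q) * (1 + s) + q * q  # mean fitness
        p = p * (1 + s) / w_bar                        # selection recursion
        gens += 1
    return gens

print(generations_to_reach(0.05))   # strong selection
print(generations_to_reach(0.005))  # weak selection, roughly 10x slower
```

At a 5 percent advantage the allele climbs from 1 percent to a majority in on the order of a hundred generations, only a few thousand years, which is why dairying populations could become largely lactose tolerant within the archaeological window described above.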
A similar story plays out with starch digestion. Humans carry a gene called AMY1 that produces salivary amylase, the enzyme that begins breaking down starch in the mouth. Unlike most genes, AMY1 exists in multiple copies, and the number of copies varies widely between people. Populations with traditionally high-starch diets (farming societies that relied on grains and tubers) carry significantly more AMY1 copies than populations with low-starch diets. About 70 percent of people from high-starch populations have at least six copies of the gene, compared to just 37 percent in low-starch groups. Chimpanzees, for comparison, have only two copies.
What’s especially interesting is the timeline. Hunter-gatherers already had variable AMY1 copy numbers as far back as 45,000 years ago, but a significant increase in copy number among European populations occurred over just the past 4,000 years, coinciding with the deepening reliance on agriculture. The cultural shift toward farming created biological pressure favoring people who could extract more energy from starchy foods.
Skin Color as a Biocultural Compromise
Human skin pigmentation is often presented as a simple story of UV radiation and latitude. The full picture is far more biocultural. Clothing, diet, and shelter all played roles in shaping the skin tones seen across different populations.
In equatorial regions with intense UV exposure, some of the darkest skin pigmentation evolved in populations that traditionally wore very little clothing, spent long hours outdoors, and ate diets rich in fish and other sources of vitamin D. Because their diet supplied plenty of vitamin D, there was no biological cost to having very dark skin that blocked UV-driven vitamin D production in the skin. In fact, a vitamin D-rich diet may have released the biological constraint that would otherwise limit how dark skin could become, allowing protective pigmentation to reach its maximum.
At far-northern latitudes, the situation reversed. Survival in extreme cold required sewn clothing, a cultural innovation that covered most of the body and drastically reduced the skin area available to produce vitamin D from sunlight. With UV levels already low and most skin covered, there was intense selective pressure for lighter pigmentation to maximize whatever vitamin D production was possible. But populations in these regions that maintained traditional diets rich in oily fish, marine mammals, and reindeer consumed enough dietary vitamin D that the pressure to depigment was partially offset. In some northern groups, skin remained relatively darker than latitude alone would predict, precisely because their food culture supplied what their skin could not.
These are not purely genetic adaptations or purely cultural ones. They are biocultural compromises shaped by the interaction of UV environment, clothing traditions, food procurement practices, and genetic variation over thousands of years.
Humans as Builders of Their Own Evolution
What makes biocultural evolution distinct from standard natural selection is the active role humans play. Most organisms adapt passively to environmental changes. Humans change the environment first, then adapt to the changes they created. This is niche construction at its most powerful.
Building shelters buffered temperature extremes, reducing selection for certain cold-tolerance traits. Cooking food (a cultural technology) altered the nutritional landscape. Farming reshaped entire ecosystems, creating new disease pressures and new dietary possibilities simultaneously. Each of these cultural innovations didn’t just solve an immediate problem. It created a new selective environment that pushed human biology in new directions.
The process hasn’t stopped. Modern environments are introducing novel pressures at an unprecedented pace. Myopia rates among children and adolescents have risen from 24 percent to 36 percent between 1990 and 2023, a shift linked in part to increasing exposure to screens and reduced time outdoors. While this isn’t genetic evolution on a generational timescale, it illustrates the same core principle: cultural choices (screen time, indoor lifestyles) reshape the biological development of human bodies.
Biocultural evolution isn’t a single historical event or a theory confined to our distant past. It’s the ongoing reality of being a species whose culture and biology are permanently entangled, each one continuously reshaping the other.