The phrase “nature versus nurture” dates to the 19th century, coined by the English scientist Francis Galton. But the underlying question, whether human traits come from inborn qualities or lived experience, is far older. Philosophers were arguing about it in ancient Greece more than 2,000 years ago. The debate has never really stopped; it has just changed shape as science caught up.
Ancient Greek Origins
The earliest known version of this debate appears in Plato’s dialogue the Meno, written around 380 BCE. Plato argued that all learning is actually recollection, that everything we will ever know is already inside us before we are taught. He called this doctrine anamnesis. In his view, our senses simply remind us of knowledge we were born with. He extended the idea in the Phaedo, arguing that our concept of equality, for instance, couldn’t have come from looking at two sticks that happen to be the same length. The concept had to be innate.
Plato’s reasoning followed a pattern philosophers still use today: sensory experience is too messy and incomplete to explain certain kinds of knowledge, so that knowledge must come from somewhere else. He pointed especially to mathematical ideas as proof that the mind arrives pre-loaded.
Aristotle pushed back. He rejected the idea that reality exists as abstract forms separate from the physical world. For Aristotle, the nature of a thing is embedded in the thing itself, not floating in some higher realm. Our understanding of the world comes through perception, a process where the mind absorbs the form of what it encounters. Because experience is rich enough to explain knowledge, there’s no gap that inborn ideas need to fill. This more grounded, observation-based approach made Aristotle the original empiricist and set the template for centuries of thinkers who would follow.
John Locke and the Blank Slate
The debate resurfaced with new force in the 17th century. In 1689, the English philosopher John Locke published An Essay Concerning Human Understanding, in which he described the mind at birth as “white paper, void of all characters.” All the materials of reason and knowledge, he argued, come from experience. This idea, known as tabula rasa (Latin for “blank slate”), was revolutionary. It implied that people are shaped entirely by what happens to them, not by what they’re born with. Locke’s work became a cornerstone of the empiricist tradition and had enormous influence on education, politics, and eventually psychology.
Galton Names the Debate
The phrase “nature versus nurture” as we know it came from Francis Galton, Charles Darwin’s half-cousin, in the second half of the 19th century. Galton was deeply interested in heredity and intelligence, and he framed the question in terms that would stick: how much of a person’s character and ability is set by biology, and how much by upbringing and environment? By giving the debate a memorable label, Galton turned a sprawling philosophical question into a focused scientific one. He also, controversially, leaned heavily toward the “nature” side, laying groundwork for the eugenics movement.
Behaviorism Swings Toward Nurture
In the early 20th century, the pendulum swung dramatically toward environment. John B. Watson, the American psychologist whose influential book Behaviorism appeared in 1924, famously claimed he could take any healthy infant and shape that child into any type of adult, given complete control over the child’s environment. Watson saw heredity as nearly irrelevant. What mattered was conditioning: the rewards, punishments, and associations a person encountered growing up.
Watson likely overstated his case, and not everyone agreed with him even at the time. His contemporary Edward Thorndike, for example, believed that a newborn’s future was substantially shaped by heredity. But behaviorism dominated American psychology for decades, and with it came the assumption that environment was the primary driver of human behavior. This era pushed genetics to the margins of psychological research.
Twin Studies Bring Genetics Back
Starting in the 1970s, large-scale twin studies began producing results that were hard to ignore. A landmark 1976 study of 3,000 twin pairs found a striking pattern: identical twins were consistently more similar than fraternal twins across cognitive ability, personality, and interests. The differences between identical and fraternal twins were remarkably uniform across all these trait categories, suggesting a strong genetic component to nearly everything measured.
The Minnesota Study of Twins Reared Apart, launched in 1979, took this further by studying identical twins who had been separated at birth and raised in different families. In that sample, about 70% of the variation in IQ scores was associated with genetic differences. On measures of personality, temperament, occupational interests, and social attitudes, identical twins raised apart were roughly as similar as identical twins raised together. Growing up in the same household, it turned out, didn’t make siblings much more alike than their genes already made them.
These findings gave rise to what behavioral geneticists call the “first law”: all behavioral traits show significant genetic influence. But the flip side was equally important. No trait was 100% heritable. Environment consistently accounted for about half the variation. The surprise was which environmental influences mattered. It wasn’t the shared family environment (the home, the parenting style, the neighborhood) that made the biggest difference. It was what researchers called “nonshared environment,” the unique, individual experiences that make siblings in the same family different from one another.
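To make the arithmetic behind these estimates concrete, here is a minimal sketch of the classic Falconer approximation under the ACE model, which splits a trait’s variance into genetic (A), shared-environment (C), and nonshared-environment (E) components from twin correlations alone. The correlations used below are hypothetical placeholders, not figures from the studies cited above, and real behavioral-genetic analyses use more sophisticated model fitting rather than this back-of-envelope formula.

```python
# Sketch of Falconer's approximation under the ACE model.
# The correlations below are illustrative placeholders, not values
# from the twin studies discussed in the text.

def ace_decomposition(r_mz: float, r_dz: float) -> dict:
    """Estimate A, C, E from identical (MZ) and fraternal (DZ) twin correlations."""
    a2 = 2 * (r_mz - r_dz)  # heritability: MZ pairs share roughly twice the segregating genes of DZ pairs
    c2 = r_mz - a2          # shared environment: MZ similarity not explained by genes
    e2 = 1 - r_mz           # nonshared environment (plus measurement error)
    return {"A": round(a2, 2), "C": round(c2, 2), "E": round(e2, 2)}

# Hypothetical IQ correlations: 0.75 for identical twins, 0.40 for fraternal twins
print(ace_decomposition(r_mz=0.75, r_dz=0.40))
# {'A': 0.7, 'C': 0.05, 'E': 0.25}
```

The logic mirrors the reasoning in the studies above: if identical twins are much more alike than fraternal twins, the gap is attributed to genes; whatever similarity remains beyond that is credited to the shared household; and whatever makes identical twins differ from each other at all is, by definition, nonshared environment.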
The Human Genome Project Changes the Question
When the Human Genome Project completed its draft sequence of the human genome in 2001, many expected it to settle the nature side of the debate once and for all. Instead, it did the opposite. The human genome turned out to contain far fewer genes than anticipated. As Craig Venter, one of the project’s leaders, put it: “We simply do not have enough genes for this idea of biological determinism to be right.”
Large-scale studies scanning the entire genome for links to disease and behavior found real genetic associations, but these explained only a fraction of the risk for most conditions. Genetics alone couldn’t account for the diseases people developed, the personalities they had, or the abilities they showed. The project exposed a fundamental imbalance: scientists had invested enormously in understanding the “nature” side and far less in rigorously measuring the “nurture” side.
Why Scientists Stopped Picking Sides
The discovery that finally broke the either/or framing was epigenetics, the study of how environmental signals change the way genes work without altering the DNA sequence itself. Your cells have chemical tags, particularly small molecules attached to DNA and to the proteins that package it, that act like dimmer switches. They can turn genes up, turn them down, or shut them off entirely. These tags respond directly to environmental inputs like diet, stress, and chemical exposures.
This means the environment doesn’t just compete with genes. It talks to them. A nutritional change can alter which genes are active. A stressful experience can shift how certain brain-related genes are expressed. Because these chemical patterns are easily influenced by environmental factors, two people with identical DNA can end up with meaningfully different traits depending on what they’ve been exposed to.
Behavioral genetics added another twist: measures of environment themselves show genetic influence. Parenting styles, social support networks, and even the life events people experience are all partly shaped by genetics, with heritabilities averaging around 25%. In other words, your genes influence which environments you seek out and encounter, and those environments then influence how your genes are expressed. Nature and nurture don’t just interact. They are, in a real biological sense, inseparable.
The modern consensus is that virtually every human trait reflects both genetic and environmental contributions, woven together through mechanisms that make the old “versus” framing obsolete. The question is no longer which one matters more. It’s how they work together, moment by moment, across a lifetime.

