Life expectancy at birth in 1776 was roughly 35 to 40 years, depending on where you lived and which records you consult. That number sounds shockingly low, but it doesn’t mean most people dropped dead at 38. It’s a statistical average dragged down dramatically by the enormous number of children who died before their fifth birthday. If you made it to adulthood in the late 18th century, living into your 60s or even 70s was entirely possible.
Why the Number Looks So Low
The single biggest factor distorting that average is child mortality. Across historical populations, roughly 27% of newborns died in their first year of life. Data from Sweden, one of the few places keeping reliable records in this era, shows that 40% of children died before the age of 15 between 1750 and 1780. When two out of every five people born never reach adulthood, the average age at death plummets even if the survivors live long lives.
Think of it this way: if one child dies at age 1 and another person lives to 70, their average life expectancy is 35.5, even though no one actually died anywhere near that age. That mathematical quirk is exactly what happened in the 1776 era. The “average” didn’t reflect a typical lifespan so much as a world where babies and young children died at staggering rates from infections, malnutrition, and complications that are easily treatable today.
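The same arithmetic scales up to a whole cohort. Here is a minimal sketch in Python that builds a toy cohort of 100 people from the round numbers above (27 infant deaths, 40 gone by age 15), plus an invented survivor lifespan of 63; every figure is illustrative, not actual 18th-century data.

```python
# Toy cohort of 100 births, using illustrative round numbers:
# 27 die in infancy, 13 more in childhood (40 gone by 15),
# and the 60 survivors live into their 60s.
ages_at_death = [1] * 27 + [8] * 13 + [63] * 60

life_expectancy_at_birth = sum(ages_at_death) / len(ages_at_death)
print(round(life_expectancy_at_birth, 1))  # 39.1
```

The average lands near 39 even though not a single person in this cohort dies within two decades of that age.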
What Actually Killed People
Infectious disease dominated. The three most feared killers were smallpox, typhus, and dysentery, and they thrived in exactly the conditions that 18th-century life created: crowded housing, no sanitation systems, contaminated water, and zero understanding of how infections spread. Germ theory wouldn’t arrive for another century. Doctors didn’t know that the instruments they used on patients, or the straw bedding in hospitals, carried deadly pathogens from one person to the next.
Smallpox alone had a fatality rate of about 14% among those who caught it naturally. During the Revolutionary War, George Washington considered it a greater threat than the British army itself, writing that should the disease “rage with its usual Virulence, we should have more to dread from it, than from the sword of the enemy.” The only defense available was inoculation, a crude procedure where pus from an infected person’s sore was inserted into a small cut on a healthy person’s skin. It worked, dropping the fatality rate to around 2%, but it still meant deliberately giving someone smallpox and hoping for a mild case.
Hospitals were often more dangerous than the conditions they treated. Military records from the era describe soldiers entering hospitals with minor ailments and dying after contracting typhus or smallpox from other patients. Crowded wards, no ventilation, and contaminated surgical tools turned places of healing into breeding grounds for infection.
The Extra Risks Women Faced
For women, childbirth added a layer of danger that men simply didn’t encounter. Between 1% and 1.5% of all births ended in the mother’s death from exhaustion, dehydration, infection, hemorrhage, or seizures. That might sound like a small percentage for any single birth, but women in this era typically gave birth to five to eight children. Over a lifetime of pregnancies, a woman’s cumulative risk of dying in childbirth ran as high as 1 in 8.
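To see how a 1% to 1.5% per-birth risk compounds toward that 1-in-8 lifetime figure, treat each delivery as an independent event with the same risk (a simplifying assumption) and compute the chance of at least one fatal outcome. A sketch in Python:

```python
# Cumulative probability of at least one fatal delivery,
# assuming each birth is an independent event with the same risk.
def cumulative_risk(per_birth_risk: float, births: int) -> float:
    return 1 - (1 - per_birth_risk) ** births

for p in (0.01, 0.015):   # 1% and 1.5% risk per delivery
    for n in (5, 8):      # five to eight births was typical
        print(f"{p:.1%} over {n} births -> 1 in {1 / cumulative_risk(p, n):.0f}")
```

At 1.5% per delivery and eight births this works out to roughly 1 in 9; a ninth pregnancy, hardly rare at the time, pushes the lifetime odds to about 1 in 8.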
There were no antibiotics to fight post-birth infections, no blood transfusions for hemorrhaging, and no surgical options that didn’t carry enormous risk on their own. A complicated delivery that would be routine in a modern hospital could easily become fatal.
How We Know These Numbers
Official census records didn’t exist in most places before the 1800s, so demographers piece together 18th-century life expectancy from indirect sources. In England, the primary records came from parish clergy who logged baptisms, marriages, and burials. Larger towns published “Bills of Mortality,” essentially weekly death counts. Tax records from levies on land, hearths, and windows provided population estimates, and poor-law administrators tracked people moving between parishes.
Sweden’s statistical office, established in 1749, produced some of the most reliable early demographic data. Colonial America had less systematic record-keeping, so estimates for the colonies rely more heavily on church records, plantation logs, and town registers, all of which have gaps and biases. Enslaved people, Indigenous populations, and rural poor were dramatically underrepresented in these records, meaning the true mortality picture was almost certainly worse than surviving data suggests.
Life Expectancy If You Survived Childhood
Once past the dangerous early years, the picture brightened considerably. A person who reached age 20 in the late 1700s could reasonably expect to live into their mid-60s. Many of the Founding Fathers lived well past that: Benjamin Franklin died at 84, John Adams at 90, Thomas Jefferson at 83. These were privileged men with access to better nutrition and living conditions than most, but even among the general population, reaching 60 or 70 was not unusual for those who cleared the childhood gauntlet.
The gap between life expectancy at birth and life expectancy conditional on surviving childhood is one of the most misunderstood aspects of historical demography. People in 1776 were not biologically aging faster. They faced a brutally high chance of dying young from infections and complications that modern medicine has largely eliminated, and those early deaths pulled the average down for everyone.
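The toy cohort from earlier makes the gap concrete: conditioning on survival simply removes the early deaths from the average. A minimal sketch, again with invented illustrative numbers:

```python
# Life expectancy at birth vs. conditional on surviving to 20,
# using the same invented toy cohort as before (not real data).
ages_at_death = [1] * 27 + [8] * 13 + [63] * 60

def mean_age_at_death(ages: list[int], if_survived_to: int = 0) -> float:
    """Average age at death among those who live past `if_survived_to`."""
    survivors = [a for a in ages if a >= if_survived_to]
    return sum(survivors) / len(survivors)

print(mean_age_at_death(ages_at_death))      # ~39: expectancy at birth
print(mean_age_at_death(ages_at_death, 20))  # 63.0: conditional on reaching 20
```

Nobody in the cohort ages any faster in one calculation than in the other; the only difference is who is still in the denominator.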
How Far We’ve Come
Average life expectancy in the United States today is about 77 years, roughly double the 1776 figure. Almost all of that gain came from reducing deaths in infancy and childhood through clean water, sanitation, vaccines, and antibiotics. The share of American children dying before adulthood has dropped from roughly 40% to well under 1%. Death in childbirth, once a 1% to 1.5% risk with every delivery and as much as a 1-in-8 risk over a lifetime of pregnancies, now occurs in roughly 1 in 5,000 births. The world that produced a 35-year life expectancy wasn’t full of people aging rapidly. It was full of threats we’ve since learned to prevent.

