Food shortages across Europe, particularly during and after World War II, directly caused tens of thousands of starvation deaths, triggered explosive outbreaks of infectious disease, and reshaped the health of generations born during famine. These consequences rippled far beyond the immediate hunger crisis, permanently altering the bodies and genes of people who were merely in the womb when the food ran out.
The Scale of Agricultural Collapse
Europe’s food supply didn’t simply dip during the war years. It cratered. In 1940, wheat harvests fell by 50% compared to the previous year and over 60% compared to peak production in 1934. Oat production dropped 74%. Olive oil fell 54%. Rye, broad beans, and wine production all saw declines between 25% and 50%. The destruction of farmland, loss of labor to military service, disrupted trade routes, and deliberate wartime blockades combined to gut the continent’s ability to feed itself.
By the war’s final months, conditions in occupied territories were catastrophic. In the western Netherlands during the winter of 1944-1945, average adult caloric intake dropped to roughly 1,000 calories per day by November 1944, then plummeted to just 580 calories per day by February 1945. People ate grass, sugar beets, and tulip bulbs to survive. In occupied Germany after the war, civilian rations in the British zone fell to as low as 1,015 calories per day in March 1946 and stayed below 1,200 calories for months.
Mass Death From Starvation
The Dutch Hunger Winter alone caused an estimated 22,000 to 25,000 excess deaths, with some researchers placing the figure as high as 65,000 when indirect hunger-related deaths are included. At least 8,300 deaths were officially recorded as caused by starvation between December 1944 and July 1945, though this is widely considered an undercount since it excludes rural areas and deaths that occurred after liberation. Men died from hunger at three to four times the rate of women.
These numbers reflected just one country. Across the continent, the United Nations Relief and Rehabilitation Administration estimated that 10 to 12 million tons of food would be needed within a single year to prevent millions more from starving in liberated countries. By September 1945, UNRRA had shipped over 2.1 million tons of relief supplies, more than 75% of it grain and flour, to countries including Greece (roughly 100,000 tons per month), Poland (160,000 tons), and Yugoslavia (330,000 tons).
Tuberculosis and Infectious Disease Surges
Hunger doesn’t just kill through starvation. It wrecks the immune system, making people far more vulnerable to infections they might otherwise survive. Tuberculosis mortality in Amsterdam tripled in six years, rising from about 35 per 100,000 before the war to 105 per 100,000 at the peak of the famine in 1945. The spike reversed quickly once food supplies returned, dropping back to prewar levels by 1947, strong evidence that malnutrition was the driving force.
One of the starkest demonstrations came from prisoner-of-war camps in Germany. Russian and British prisoners lived in equally harsh conditions and received 1,600 calories a day with minimal protein. But British prisoners got an additional 1,300 calories daily from Red Cross packages, including extra animal protein. The Russian prisoners, on their restricted diet, developed tuberculosis at 13 times the rate of the British prisoners. Same camps, same exposure to the bacteria, vastly different outcomes driven largely by nutrition.
Lifelong Health Damage to Children Born During Famine
Some of the most consequential results of Europe’s food shortages didn’t appear for decades. Babies who were in the womb during the Dutch Hunger Winter grew up to have significantly higher rates of heart disease, obesity, type 2 diabetes, and glucose intolerance compared to people born just before or after the famine. Males exposed during their second trimester had 55% higher odds of developing diabetes. Females exposed during their first trimester showed elevated rates of obesity.
The explanation lies in what scientists call the “thrifty phenotype.” When a developing fetus receives inadequate nutrition, its body adapts by permanently recalibrating metabolism to hoard energy and resist insulin, essentially preparing for a lifetime of scarcity. These adaptations become harmful when food is actually available later in life, making the person far more prone to metabolic diseases like diabetes and obesity. The combination of these fetal changes with normal aging, weight gain, and sedentary habits creates a perfect storm for chronic disease.
Women born during the famine also faced roughly double the risk of developing schizophrenia, with relative risks reaching 2.17 to 2.56 depending on how broadly the condition was defined. The famine’s psychological toll extended beyond just caloric deprivation. Pregnant women under extreme stress release excess cortisol, which crosses the placenta and permanently alters how the baby’s stress-response system develops, raising lifelong risks of insulin resistance and metabolic problems.
Permanent Changes Written Into DNA
Perhaps the most striking discovery is that prenatal famine exposure physically changed how genes function. Researchers studying Dutch Hunger Winter survivors six decades later found that famine had altered the chemical tags on their DNA, the switches that control whether genes are turned up or turned down. Genes involved in growth, fat storage, immune function, and cholesterol transport all showed measurable differences compared to unexposed siblings.
These changes depended on timing and sex. People exposed around conception showed altered activity in genes controlling insulin signaling, fat-regulating hormones, and cholesterol metabolism. Those exposed late in pregnancy showed different patterns. In several cases, the effects appeared only in men or were far more pronounced in one sex than the other. A gene involved in appetite regulation, for instance, showed significantly higher activation in men exposed to famine but no change in women.
The takeaway is that a few months of severe food shortage during pregnancy left a chemical imprint on DNA that was still detectable more than 60 years later. These weren’t mutations passed down through generations but persistent changes to gene activity that shaped health outcomes across an entire lifetime. Europe’s wartime hunger didn’t just starve a generation. It reprogrammed the biology of the next one.
Political and Institutional Transformation
The sheer scale of postwar hunger forced a reorganization of international food policy. UNRRA, established in 1943, became the first large-scale international relief operation, shipping hundreds of thousands of tons of food to devastated countries and laying the groundwork for the broader humanitarian aid system that followed. The Marshall Plan, launched in 1948, was driven in large part by the recognition that Europe’s food crisis threatened not just lives but political stability, as hunger created conditions ripe for social unrest and the spread of communism.
Within individual countries, rationing systems that had been wartime necessities continued for years after the fighting stopped. Britain maintained food rationing until 1954, nearly a decade after the war ended. These systems, and the political pressure they created, accelerated the development of agricultural policy, food safety standards, and social welfare programs across Western Europe. The Common Agricultural Policy, one of the European Union’s founding initiatives, was a direct institutional response to the memory of wartime and postwar hunger, designed to ensure that European nations would never again face food dependency and mass starvation.