How Long Has Hunger Been a Problem in the World?

Hunger has been a problem for as long as humans have existed. Our earliest ancestors evolved in environments with sporadic food availability, routinely going days or longer without eating. The human body itself is evidence of this: our metabolism developed a built-in switching mechanism, shifting from burning stored sugar to burning fat and ketones during periods without food, specifically to keep the brain sharp and muscles functional while searching for the next meal. Hunger isn’t a modern failure. It’s the condition our species was built to survive.

Prehistoric Hunger Shaped Human Biology

Long before recorded history, natural selection favored people who could think clearly and move efficiently on an empty stomach. The bodies we walk around in today were shaped by millions of years of uncertain food access. When food runs out, the liver’s sugar reserves deplete within roughly a day, and the body switches to burning fat and producing ketones as fuel. This metabolic flexibility kept our ancestors alive, alert, and capable of finding food even after extended periods of deprivation.

This deep evolutionary history also left a less helpful legacy. As early human diets shifted toward more meat and fewer carbohydrate-rich plants, the body developed a tendency toward insulin resistance to prevent blood sugar from dropping dangerously low between meals. That same tendency, useful when food was scarce, now contributes to type 2 diabetes in a world where calories are abundant for some populations. The biology of ancient hunger is still operating inside modern bodies that no longer need it.

The Farming Revolution: A Trade-Off

Around 10,000 to 12,000 years ago, humans began domesticating crops and animals in the Near East. This should have solved hunger, and in some ways it did, by making food more predictable. But the shift to agriculture also introduced new vulnerabilities. Hunter-gatherers ate a varied diet of wild plants, animals, and sometimes fish or shellfish. Early farmers narrowed that diversity dramatically, relying on a handful of grain crops. Archaeological isotope analysis from sites across Europe shows that coastal communities shifted from mixed diets that included marine foods to diets dominated almost entirely by terrestrial crops once farming arrived.

That dependence on a few crops meant a single bad season could be catastrophic. Climate records from ancient Near Eastern farming sites reveal exactly this pattern. The wettest, most favorable growing conditions occurred in the early period of agriculture, roughly 11,000 to 9,000 years ago. But water availability for crops declined after that, with a notable dry spell around 8,200 years ago and an even more severe one around 4,200 years ago. That later event fell during the Bronze Age and is linked to the abandonment of large, well-established settlements. People who had built entire communities around farming literally walked away when the harvests failed.

The First Recorded Famines

Written records of mass hunger stretch back to the third millennium BC, roughly 4,000 to 5,000 years ago, in Egypt and Mesopotamia. These are the earliest documented famines, but they were certainly not the first. They simply mark the point when civilizations had developed writing systems sophisticated enough to record the crisis. From that point forward, the historical record is dense with famine: India, Korea, Western Europe, Ethiopia, Russia, the New World. Historian Cormac Ó Gráda has cataloged famines spanning from those ancient Mesopotamian records through the present day across nearly every inhabited continent.

What changed over time was not whether famines happened but what caused them. Ancient famines were overwhelmingly driven by weather, floods, and droughts. As societies became more complex, war, trade disruptions, and political decisions increasingly played a role. By the medieval and early modern periods, many of the worst famines were partly or fully human-made, caused by sieges, colonial extraction, or deliberate neglect.

Industrialization Created New Forms of Hunger

The industrial revolution of the 18th and 19th centuries reshaped hunger rather than eliminating it. Two centuries ago, only about 5 percent of the world’s population lived in cities. Today, more than half does. That shift fundamentally changed how people access food. Rural populations can grow at least some of what they eat. Urban residents buy almost everything, which means hunger in cities is primarily a problem of income, not harvest.

Research from southern African cities illustrates how this plays out in practice: at any given time, roughly four out of five poor urban households do not have enough to eat. Food accounts for an enormous share of low-income households’ total spending, and when prices rise or wages drop, families cut the quality and quantity of what they eat, reduce dietary variety, skip health care, and work longer hours. Urban hunger looks different from rural famine, but it is no less real and in many ways harder to see from the outside.

The Green Revolution’s Partial Fix

The mid-20th century brought the closest thing to a technological solution hunger has ever seen. The Green Revolution, a wave of agricultural advances including high-yield crop varieties, synthetic fertilizers, and expanded irrigation, tripled global cereal production with only a 30 percent increase in cultivated land. Without it, global caloric availability would have dropped by an estimated 11 to 13 percent, a gap that would have meant hundreds of millions more people going hungry.

But the Green Revolution’s gains were uneven. Countries with existing infrastructure, irrigation systems, and access to global markets benefited most. Regions without those foundations, particularly in sub-Saharan Africa, saw far less improvement. The technology also carried costs: soil degradation, water depletion, and a continued narrowing of crop diversity that echoes the same vulnerability early farmers faced thousands of years ago.

Measuring Hunger as a Global Problem

The idea that governments should define and track minimum nutritional requirements is surprisingly recent. The first recommended dietary allowances were announced in 1941, created not out of humanitarian concern but as wartime preparation. The League of Nations, the British Medical Association, and the U.S. government each separately commissioned scientists to establish minimum caloric and nutrient requirements so their populations would be physically capable of fighting a war. Those standards evolved into the frameworks international organizations still use today to measure who is and isn’t getting enough to eat.

Where Global Hunger Stands Now

As of 2024, an estimated 673 million people worldwide experience hunger, roughly 8.2 percent of the global population. That number is moving in the right direction: down from 688 million in 2023 and 695 million in 2022. But the improvements are not happening everywhere. Africa’s trajectory is heading the wrong way, with the prevalence of undernourishment projected to rise from 19.1 percent in 2019 to 25.7 percent by 2030. Sub-Saharan Africa faces the steepest climb, with undernourishment projected to reach 29.4 percent by 2030. In Middle Africa specifically, the figure could hit 38 percent.

The United Nations’ Sustainable Development Goal 2 aims to end all forms of hunger and malnutrition by 2030. Africa is not on track to meet that target, and current efforts fall well short of what meeting it would require. Sub-Saharan Africa is the only region in the world where the prevalence of childhood stunting from malnutrition is still increasing.

So to answer the original question plainly: hunger has been a problem for the entire span of human existence, at minimum 200,000 years and arguably far longer if you count our pre-human ancestors. What has changed is the nature of the problem. Prehistoric hunger was a universal condition of life. Ancient and medieval hunger was driven by climate and war. Modern hunger is increasingly a problem of economics, politics, and distribution. The biology is the same. The causes have shifted. And for nearly 700 million people alive today, the problem remains unsolved.