Paleolithic humans ate a far more varied diet than most people assume, combining meat from large and small animals with a wide range of wild plants, including tubers, nuts, seeds, wild grasses, and even insects and honey. The exact mix shifted dramatically depending on geography and climate, with estimated macronutrient ranges of 14% to 35% protein, 21% to 55% carbohydrate, and 12% to 58% fat by calories. There was no single “paleo diet.” There were dozens, shaped by what each environment offered.
Plants Were Central, Not a Side Dish
One of the most persistent misconceptions about Paleolithic eating is that it was dominated by meat. Recent archaeological evidence tells a different story. A 2025 study published in the Proceedings of the National Academy of Sciences analyzed starch grains roughly 780,000 years old, found on stone tools at the site of Gesher Benot Ya'aqov in Israel. Those tools, the earliest known evidence of humans processing plant foods, were used to crack and crush acorns, cereals, legumes, and aquatic plants such as yellow water lily and a now-extinct water chestnut. The carbohydrates in these starchy foods would have been critical for fueling the energy demands of the developing human brain.
Direct evidence from human teeth confirms this picture. Researchers studying dental calculus (hardened plaque) from early modern humans in East Asia, dating between 120,000 and 80,000 years ago, recovered starch grains from acorns, roots and tubers, and wild grass seeds. About 40% of the starch grains came from acorns alone. One individual even had dental cavities likely caused by heavy carbohydrate consumption from those same acorns. This is the earliest direct evidence of a human diet built around these specific plant foods.
By the late Upper Paleolithic, plant gathering had become remarkably sophisticated. At Ohalo II, a 23,000-year-old site in Israel, archaeologists recovered over 90,000 plant remains from 142 different species. Wild wheat and barley were present, but small-grained grasses vastly outnumbered them: 16,000 small grass grains compared to about 2,500 barley grains and just 102 emmer wheat grains. This was broad-spectrum foraging, not selective harvesting, and it pushed back the timeline for significant grass collecting by 10,000 years.
Meat, Fat, and the Top of the Food Chain
Plants played a bigger role than the stereotype suggests, but animal foods were still a major calorie source for most Paleolithic groups. Isotope analysis of Neanderthal remains consistently shows nitrogen isotope (δ15N) values 3 to 5 parts per thousand higher than those of contemporary herbivores, placing them at or above the level of other top carnivores. In every case studied, Neanderthals obtained most or all of their dietary protein from large herbivores.
The animals themselves were different from anything on a modern menu. In North America, Clovis groups (around 13,000 years ago) hunted megafauna. Of the 15 archaeological sites where Clovis spear points have been found alongside extinct animals, 80% involved mammoth. Across Eurasia, the prey list included horses, bison, reindeer, aurochs, and red deer, depending on the region and era. As large animals declined, diets shifted toward smaller game, fish, shellfish, and birds.
Wild game carried a nutritional profile very different from modern livestock. It was leaner, with a higher proportion of structural fats and a fat composition that bore little resemblance to grain-fed beef or pork. Organ meats, marrow, and brain were prized calorie-dense foods that provided fat-soluble nutrients difficult to get from muscle meat alone.
Honey, Insects, and Overlooked Calories
Insects were likely a consistent, if hard to trace, part of the Paleolithic diet. They preserve poorly in the archaeological record, but indirect evidence points to their importance. In the Magdalenian cave of Les Trois Frères in southwestern France, dating to roughly 17,000 to 12,000 years ago, a cave grasshopper was engraved on an animal bone. The famous Altamira cave in Spain, with paintings going back 36,000 years, depicts what researchers interpret as edible insects and wild bee nests. Given their abundance and nutritional density (many insects are 40% to 70% protein by dry weight), early humans almost certainly ate them opportunistically.
Honey was another high-value food. Wild honey and honeycomb would have provided one of the few concentrated sugar sources available. Modern hunter-gatherer groups like the Hadza of Tanzania obtain up to 15% of their calories from honey during peak season, and there is no reason to think Paleolithic foragers passed up a similar opportunity when they found it.
Cooking Changed Everything
Raw wild plants are tough to digest. Many tubers and seeds pass through the gut with most of their calories locked away. Cooking broke that barrier. Experiments show that cooking starchy foods increases the energy your small intestine can extract by 28% to 109%, depending on the food. Across starches, proteins, and plant fats, a reasonable estimate is that cooking boosted net calorie gain by about 30%.
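To make those percentages concrete, here is a back-of-the-envelope sketch of what the quoted figures imply for a single food item. The function and the example values are illustrative only, using the ranges cited above, not a nutritional model:

```python
# Back-of-the-envelope arithmetic for the cooking figures quoted above.
# The 30% average and the 28%-109% starch range are the estimates from
# the text; "raw_kcal" is a hypothetical usable-energy value for one food.

def cooked_kcal(raw_kcal: float, gain: float = 0.30) -> float:
    """Net calories extracted after cooking, given a fractional gain."""
    return raw_kcal * (1 + gain)

# A tuber yielding 100 usable kcal eaten raw:
print(round(cooked_kcal(100)))        # ~30% average estimate -> 130 kcal
print(round(cooked_kcal(100, 0.28)))  # low end of the starch range -> 128 kcal
print(round(cooked_kcal(100, 1.09)))  # high end -> 209 kcal, more than double
```

Even at the low end, the gain compounds across every starchy meal, which is why a roughly 30% average matters so much for a forager's daily energy budget.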
The timeline of this shift is still debated. Hard archaeological evidence of regular fire use only becomes common around 400,000 years ago. At Qesem Cave in Israel, evidence of repeated fire use extends back about 420,000 years, while at nearby Tabun Cave, habitual fire use appears by 350,000 to 320,000 years ago. Genetic evidence complicates the picture, however. Genes involved in processing cooked food show signs of positive selection in humans, Neanderthals, and Denisovans, which suggests that adaptation to cooked diets began before these lineages split, an event dated to roughly 550,000 to 765,000 years ago.
This matters because cooking didn’t just make food taste better. It fundamentally expanded the menu. Tough roots, bitter acorns, and fibrous tubers that would have been marginally useful raw became reliable calorie sources once they could be roasted or heated on hot stones. Cooking may have been what made a plant-heavy diet viable in the first place.
How the Diet Varied by Region
Paleolithic diets were shaped by latitude and local ecology more than by any universal human preference. Groups living in tropical and subtropical environments had year-round access to fruits, tubers, nuts, and leafy greens, with carbohydrates making up a larger share of calories. Groups in colder northern environments, where plant growth was seasonal and limited, relied more heavily on animal fat and protein for much of the year. Arctic-adapted populations may have gotten over 50% of their calories from fat.
A 2023 analysis in the American Journal of Clinical Nutrition looked at cross-cultural variation in these patterns and found that real Paleolithic macronutrient ranges were markedly wider than what the popular “paleo diet” recommends. The actual spread, 14% to 35% protein, 21% to 55% carbohydrate, and 12% to 58% fat, reflects a species that was extraordinarily flexible in what it could eat and thrive on. That flexibility, not any single food ratio, is the defining feature of the Paleolithic diet.
Wild Foods vs. Modern Foods
Even when Paleolithic foods overlap with modern categories, the nutritional content was dramatically different. Wild fruits were smaller, more fibrous, and far less sweet than anything in a grocery store. A wild apple ancestor is closer in sugar content to a modern carrot than to a Honeycrisp. Wild game was lean and carried a different fat profile than feedlot meat. The diet provided abundant protein, high fiber, and micronutrients from a wide variety of plant and animal sources, but virtually no refined sugars, no dairy, and no grain-based starches in the form we know today.
The carbohydrates Paleolithic humans ate came packaged with fiber, water, and cell walls that slowed digestion. Even honey, the most concentrated sugar source available, was seasonal and required significant effort to obtain. The result was a diet that could be surprisingly high in carbohydrates in some environments while still producing a very different metabolic effect than modern processed foods.