Food has changed dramatically over the past century, and the shifts go far deeper than new products on grocery shelves. Fruits and vegetables have lost 25 to 50% of their nutritional density in the last 50 to 70 years. Ultra-processed foods, which barely existed before 1900, now make up more than half the average American diet. Meanwhile, portion sizes have quietly grown, crop diversity has collapsed, and the balance of fats in everyday meals has tilted sharply. Here’s what actually happened and why it matters for what ends up on your plate today.
Fruits and Vegetables Are Less Nutritious
The apple you eat today is not the same apple your grandparents ate. Common fruits and vegetables, including tomatoes, potatoes, apples, oranges, and bananas, have lost roughly 25 to 50% of their nutritional density over the past half-century. The declines span nearly every mineral that matters: sodium has dropped 29 to 49%, calcium 16 to 46%, iron 24 to 27%, copper 20 to 76%, and zinc 27 to 59%. Potassium and magnesium have each fallen by roughly 16 to 24%.
Tomatoes illustrate the scale of the problem. Between 1975 and 1997 alone, tomato protein content dropped by about 73%, iron by 82%, and vitamin C by roughly 52%. These aren’t small fluctuations. They represent a fundamental shift in what you get from produce.
Three forces drive this decline. Modern crop breeding prioritizes yield, size, and pest resistance over nutrient content. Soil depletion from intensive farming means plants pull fewer minerals from the ground. And atmospheric changes, particularly rising carbon dioxide levels, cause plants to produce more starch and sugar relative to protein and minerals. What’s striking is the pace: researchers estimate that only about 20% of this nutritional dilution happened in the first 70 to 80 years of industrial agriculture, while the remaining 80% has occurred in just the last 30 to 40 years. The trend is accelerating.
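The "accelerating" claim follows from simple arithmetic on the figures above. A quick back-of-envelope sketch, using midpoints of the ranges in the text (75 years for the early period, 35 for the recent one) as assumptions:

```python
# Back-of-envelope: how much faster is the recent nutrient decline?
# Assumptions taken from the text: ~20% of the total dilution occurred
# over the first ~75 years of industrial agriculture, ~80% over the
# last ~35 years (midpoints of the stated ranges).
early_share, early_years = 0.20, 75
late_share, late_years = 0.80, 35

early_rate = early_share / early_years  # share of total dilution per year
late_rate = late_share / late_years

print(f"Recent decline is roughly {late_rate / early_rate:.0f}x faster per year")
```

On these assumptions, the recent per-year pace works out to roughly nine times the early pace, which is what "accelerating" means here in concrete terms.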
The Rise of Ultra-Processed Food
Before 1900, ultra-processed foods were rare. People ate meals assembled from recognizable ingredients: whole grains, fresh or preserved produce, meat, dairy, and simple pantry staples like flour, sugar, and salt. Processing existed, but it meant things like canning, smoking, or fermenting.
Today, ultra-processed products account for more than 50% of the calories in the average American diet. These are foods engineered in factories using ingredients you wouldn’t find in a home kitchen: emulsifiers, flavor enhancers, colorants, modified starches, and hydrogenated oils. Think soft drinks, packaged snacks, instant noodles, frozen meals, and most fast food. The shift happened gradually through the 20th century as refrigeration, industrial chemistry, and mass distribution made it cheaper and easier to manufacture food that stayed shelf-stable for months.
This matters because ultra-processed foods tend to be calorie-dense but nutrient-poor. They’re engineered to be highly palatable, which makes them easy to overeat. And they’ve displaced the whole foods that once provided the vitamins and minerals now declining in produce. It’s a double hit: the whole foods have less nutrition, and people eat fewer of them.
What Happened to Wheat
Wheat tells a useful story about how selective breeding reshapes food. Wild wheat species had protein contents in the 16 to 28% range. Over thousands of years of domestication, and especially after the mid-20th-century push for high-yielding dwarf wheat varieties, breeders selected for larger seeds with more starch. Bigger seeds meant more calories per acre, but the protein content steadily declined as starch content rose.
Modern, heritage, and ancient wheat varieties actually have fairly similar total gluten content, and all of them contain proteins that are problematic for people with celiac disease. But there are meaningful differences in how the body responds to them. Clinical trials have found that older wheat varieties tend to produce less inflammation than modern ones. The reasons aren’t fully understood, but the protein structures differ. Modern bread wheat carries a particular genome (the D genome) that contains more of the protein sequences most active in triggering celiac reactions. Older diploid and tetraploid species have fewer of these sequences, though not few enough to be safe for people with celiac disease.
The practical takeaway: modern wheat isn’t dangerous for most people, but it is a different food than what humans ate for most of agricultural history, both in its protein-to-starch ratio and in how it interacts with the immune system.
Crop Diversity Has Collapsed
Since 1900, roughly 75% of plant genetic diversity in agriculture has been lost. According to the Food and Agriculture Organization of the United Nations, farmers worldwide abandoned their local varieties and traditional landraces in favor of genetically uniform, high-yielding cultivars. Where a region might once have grown dozens of distinct varieties of rice, corn, or potatoes, it now grows one or two.
This matters for two reasons. First, genetic uniformity makes the food supply vulnerable. A single disease or pest capable of attacking the dominant variety can wipe out an entire harvest across a wide area, as happened when Panama disease nearly destroyed the Gros Michel banana in the mid-20th century. Second, the lost varieties often carried unique nutritional profiles, flavors, and resilience traits that can’t easily be recovered. The food system became more productive but also more fragile and more monotonous.
Plates Got Bigger, and So Did Portions
In the 1960s, a standard dinner plate measured about 25 to 27 centimeters (10 to 10.5 inches) across. Today’s plates typically run 28 to 30 centimeters (11 to 12 inches). That might sound trivial, but going from a 9-inch plate to a 12-inch plate can lead you to serve yourself 25 to 30% more food without noticing. A meta-analysis of 56 studies found that doubling plate size leads to a 41% increase in the amount of food consumed on average.
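Part of why a seemingly small diameter change matters is that a plate's surface area grows with the square of its diameter. A minimal sketch, using the 25 cm and 30 cm diameters from the text:

```python
import math

def plate_area_cm2(diameter_cm: float) -> float:
    """Area of a circular plate, computed from its diameter."""
    return math.pi * (diameter_cm / 2) ** 2

# A typical 1960s plate vs. a typical modern plate (diameters from the text)
old = plate_area_cm2(25)  # ~10 inches
new = plate_area_cm2(30)  # ~12 inches

increase = (new - old) / old * 100
print(f"Surface area increase: {increase:.0f}%")  # a 5 cm wider plate has ~44% more area
```

So a plate only 20% wider offers about 44% more surface to fill, which is why diameter comparisons understate how much extra food a larger plate invites.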
The psychology behind this is consistent: most people eat about 92% of what they serve themselves, and larger plates make portions look smaller, so you pile on more. People using bigger plates don’t feel more satisfied afterward. They simply eat more. Conversely, research shows that dropping from a 12-inch plate to a 10-inch one cuts food served and consumed by about 29%. The change in dinnerware over the decades isn’t the sole cause of rising calorie intake, but it’s one of those quiet environmental shifts that nudge eating habits without anyone making a conscious choice.
The Fat Balance Has Shifted
Your body needs both omega-6 and omega-3 fatty acids, but the ratio between them matters. Up until about 100 years ago, the typical human diet provided these fats at a ratio of roughly 4:1, omega-6 to omega-3. The modern Western diet has pushed that ratio to approximately 20:1 in favor of omega-6.
The shift happened because of changes in cooking oils, animal feed, and food processing. Vegetable oils high in omega-6 (soybean, corn, sunflower) became cheap and ubiquitous in the 20th century. At the same time, animals raised on grain instead of pasture produce meat, eggs, and dairy with higher omega-6 and lower omega-3 levels. This imbalance promotes chronic inflammation and has been linked to higher rates of autoimmune diseases, asthma, and allergies. It’s not that omega-6 fats are inherently harmful. The problem is the lopsided ratio, which is almost entirely a product of how the modern food system sources and processes fat.
Food Additives Multiplied
Before 1958, any substance added to food in the United States was presumed safe until someone proved otherwise. That year, the law flipped: the FDA began requiring safety approval before a new additive could be used. But the law carved out broad exceptions for the hundreds of substances already in use, and today roughly 2,600 to 2,700 substances are intentionally added to foods. Of those, only about 400 fall under the formal regulatory definition of “food additive” and require the most rigorous safety review. The rest occupy exempted categories, such as substances “generally recognized as safe” (GRAS).
A century ago, the list of things added to food was short: salt, sugar, vinegar, smoke, and a handful of spices and preservatives. The explosion in additives tracks directly with the rise of ultra-processed food. Many of these substances serve functions that have nothing to do with nutrition or safety, like extending shelf life, improving texture, or making a product look more appealing. The sheer number means that the average person now consumes a complex cocktail of chemicals daily that simply didn’t exist in the food supply a few generations ago.