Scientists have been spectacularly wrong about some of the most fundamental questions in medicine, physics, chemistry, and biology. What makes science unusual isn’t that its practitioners never make mistakes, but that the system is designed to eventually catch and correct them. The history of scientific error is, paradoxically, the history of scientific progress. Here are some of the most consequential examples.
Fire Releases a Substance (It Doesn’t)
For most of the 18th century, chemists believed that everything flammable contained an invisible substance called “phlogiston,” a universal fire element proposed by the German scientist Georg Ernst Stahl. The logic seemed airtight: when charcoal burned, it lost weight, so clearly something was escaping into the air. The less residue left behind, the more phlogiston a material supposedly contained. Breathing was explained the same way, as a process of releasing phlogiston from the body.
The French chemist Antoine Lavoisier dismantled this idea through a series of careful weighing experiments in the 1770s. When he heated metals in air, the resulting material weighed more, not less. Phosphorus and sulfur also gained weight when burned. Something was being added, not lost. By 1777, Lavoisier proposed that combustion was actually a reaction with a component of air he identified as oxygen. In 1783, he declared flatly that “Stahl’s phlogiston is imaginary” and backed it up by decomposing water into oxygen and hydrogen. The entire framework of chemistry had to be rebuilt from scratch.
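Lavoisier's argument was, at bottom, mass bookkeeping: weigh everything before and after, and the oxygen shows up on the scale. A minimal sketch of that arithmetic for his famous mercury experiment, using modern molar masses that Lavoisier of course did not have:

```python
# Conservation-of-mass bookkeeping for Lavoisier's mercury experiment:
# mercury heated in air forms mercuric oxide (2 Hg + O2 -> 2 HgO).
# Molar masses are modern values (g/mol), used here for illustration.
M_HG = 200.59
M_O = 15.999

def calx_mass(mercury_g: float) -> float:
    """Mass of HgO produced when `mercury_g` grams of mercury fully oxidize."""
    moles_hg = mercury_g / M_HG        # each mole of Hg binds one mole of O
    return moles_hg * (M_HG + M_O)     # the product carries the added oxygen

start = 100.0                          # grams of mercury
end = calx_mass(start)
print(f"{start:.1f} g Hg -> {end:.2f} g HgO (gain: {end - start:.2f} g)")
```

Under phlogiston theory the product should weigh *less* than the starting metal; the scale says otherwise, because the oxide carries the mass of the absorbed oxygen.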
Disease Comes From Bad Air
For centuries, stretching back to Hippocrates, physicians believed that diseases like cholera were caused by “miasma,” noxious vapors rising from rotting garbage, sewage, and swamps. The theory had a compelling internal logic: epidemics struck hardest in filthy, low-lying neighborhoods where the air smelled worst. People living at lower elevations in London had higher cholera mortality, which miasmatists attributed to a greater concentration of airborne organic particles closer to the ground.
The miasma theory was wrong about the mechanism but accidentally right about some of the solutions. Cleaning up sewage and garbage did reduce disease, just not for the reasons anyone thought. It was John Snow, working during London’s cholera outbreaks in the 1850s, who began to suspect that contaminated drinking water was the real culprit. His epidemiological detective work, tracing cases back to specific water pumps, helped crack open the door for germ theory: the idea that specific microorganisms cause specific diseases. Ironically, the miasma framework’s emphasis on mapping disease patterns helped Snow develop the very methods he used to disprove it.
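Snow's core technique was a rate comparison: group cholera deaths by water source and normalize by the number of households served, so that districts of different sizes can be compared fairly. A minimal sketch of that calculation, with illustrative placeholder numbers rather than Snow's actual 1854 figures:

```python
# Rate comparison in the style of Snow's water-supply analysis.
# The houses/deaths figures below are illustrative placeholders,
# not Snow's actual data.
districts = {
    "Company A (sewage-contaminated intake)": {"houses": 40_000, "deaths": 1_263},
    "Company B (upstream intake)":            {"houses": 26_000, "deaths": 98},
}

for name, d in districts.items():
    rate = d["deaths"] / d["houses"] * 10_000   # deaths per 10,000 houses
    print(f"{name}: {rate:.0f} deaths per 10,000 houses")
```

Normalizing by households is what turns raw death counts into an argument: a roughly tenfold difference in rate between two suppliers serving similar neighborhoods is hard to explain with airborne vapors.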
Life Springs From Mud
The idea of spontaneous generation held that living organisms could arise from non-living matter. It wasn’t a fringe belief. Aristotle observed eels, frogs, and tiny fish emerging from muddy riverbanks each spring and concluded that the sun’s energy interacted with mud to create new life without parents. For nearly two thousand years, this explanation was widely accepted. People believed maggots came from rotting meat and mice from piles of grain.
Louis Pasteur ended the debate in the 1860s with an elegantly simple experiment. He placed nutrient broth in specially designed “swan-neck” flasks, with long curved necks that allowed air in but trapped airborne particles in the bend. After boiling, the broth in these flasks stayed sterile indefinitely; microbes appeared only when a flask was tilted so the broth touched the dust caught in the neck, or the neck was snapped off. The air could reach the broth, but the microorganisms riding on dust particles couldn’t. Life came from life, full stop.
The Universe Stands Still
When Albert Einstein applied his general theory of relativity to the universe as a whole, the math kept pointing toward a universe in motion, either expanding or contracting. Einstein didn’t like that. The prevailing assumption was that the cosmos was eternal and static, so he added a fudge factor to his equations, the cosmological constant, tuned precisely to counterbalance gravity and hold everything in place. His model described a finite, closed, spherical universe frozen in equilibrium.
The model was shaky from the start. The British astrophysicist Arthur Eddington demonstrated that Einstein’s static solution was unstable: the slightest perturbation would cause the universe to start expanding or collapsing. Then in the late 1920s, Edwin Hubble observed that distant galaxies were systematically moving away from us, with their light shifted toward the red end of the spectrum. The universe was expanding. Meanwhile, the Russian physicist Alexander Friedmann had already found solutions to Einstein’s own equations that described a dynamic universe. Einstein reportedly called the cosmological constant his “biggest blunder.”
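What Hubble measured is captured by a simple linear relation, now called Hubble’s law: recession velocity is proportional to distance, v = H₀·d. A sketch using a modern value of H₀ (an assumption here; Hubble’s own 1929 estimate was roughly seven times larger):

```python
# Hubble's law: v = H0 * d. H0 ~= 70 km/s per megaparsec is a modern
# round-number value; Hubble's original 1929 estimate was ~500 km/s/Mpc.
H0 = 70.0            # km/s per Mpc (assumed value)
C = 299_792.458      # speed of light, km/s

def recession_velocity(distance_mpc: float) -> float:
    """Recession velocity (km/s) of a galaxy at the given distance."""
    return H0 * distance_mpc

def redshift(distance_mpc: float) -> float:
    """Approximate redshift z ~ v/c, valid only for v << c."""
    return recession_velocity(distance_mpc) / C

d = 100.0  # Mpc
print(f"A galaxy {d:.0f} Mpc away recedes at {recession_velocity(d):.0f} km/s "
      f"(z ~ {redshift(d):.4f})")
```

The proportionality is the key point: doubling the distance doubles the recession speed, which is exactly the signature of uniform expansion rather than galaxies fleeing some central point.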
Continents Don’t Move
In 1912, the German meteorologist Alfred Wegener proposed that the continents had once been joined in a single landmass and had since drifted apart. He pointed to the jigsaw-puzzle fit of South America and Africa, matching fossils on opposite sides of the Atlantic, and similar rock formations on now-distant coastlines. The evidence was suggestive, but the scientific establishment rejected him for decades.
The core objection was mechanical: Wegener couldn’t explain what force could possibly push entire continents through solid ocean floor. He proposed that Earth’s rotation created a “pole-fleeing force” that broke the ancient supercontinent apart, but physicists calculated that the actual forces involved were far too weak. He also suggested the sun and moon’s gravity could explain the Americas drifting westward. That was rejected too. Without a plausible engine, the geological community dismissed the whole idea.
Vindication came in the early 1960s, when oceanographers discovered mid-ocean ridges, magnetic anomalies in the seafloor running parallel to those ridges, and deep trenches near continental margins. Harry Hess and Robert Dietz published the seafloor spreading hypothesis: convection currents in Earth’s mantle were pushing new crust up at the ridges and pulling old crust down at the trenches. The continents weren’t plowing through the ocean floor. They were riding on top of moving plates. Wegener had the right answer but the wrong mechanism, and it cost him his reputation during his lifetime.
Lobotomy as Breakthrough Medicine
In 1949, the Portuguese neurologist António Egas Moniz received the Nobel Prize in Physiology or Medicine for developing the lobotomy. The Nobel Committee described his first procedure, performed in 1935, as “one of the most important discoveries ever made in psychiatric medicine.” The operation involved severing connections in the brain’s frontal lobes and was used on patients with schizophrenia, depression, and anxiety disorders.
Lobotomized patients were indeed more manageable, which was the primary metric of success in an era with no effective psychiatric medications. But they were also left with devastating, irreversible personality changes. Contemporaries described them as “mental invalids” and “drooling zombies.” Tens of thousands of lobotomies were performed in the United States alone before chlorpromazine and other antipsychotic drugs arrived in the mid-1950s, making the procedure obsolete almost overnight. It remains one of the most troubling Nobel Prizes ever awarded.
Ulcers Come From Stress
Until the early 1980s, the medical consensus held that stomach ulcers were caused by stress, spicy food, and excess acid production. Treatment focused on antacids, dietary changes, and stress management. The idea that bacteria could even survive in the stomach’s harsh acid environment seemed absurd.
In 1982, two Australian researchers, Barry Marshall and Robin Warren, identified a previously unknown bacterium (later named Helicobacter pylori) in stomach biopsies from ulcer patients. The medical community was skeptical, so Marshall took a dramatic step: he drank a broth containing the bacteria and promptly developed gastritis. The link between H. pylori infection and peptic ulcers was subsequently confirmed through antibiotic treatment studies and large epidemiological investigations. Marshall and Warren received the Nobel Prize in 2005. Ulcers went from being a chronic lifestyle condition to something curable with a course of antibiotics.
Linus Pauling’s Wrong DNA Model
Linus Pauling was one of the greatest chemists of the 20th century, a Nobel laureate who had already solved the structure of protein alpha-helices. When he turned his attention to DNA in 1953, he proposed a triple-helix model with the phosphate groups forming the core and the bases pointing outward. It was a significant error with a straightforward chemical flaw: each phosphate group carries a negative charge, and packing that many negative charges together in the center of the molecule would create enormous repulsive forces, literally driving the structure apart. James Watson and Francis Crick, working with Rosalind Franklin’s X-ray data, got it right with their double helix, bases inward, just weeks later.
Thalidomide Is “Completely Safe”
Between 1957 and 1962, the sedative thalidomide was marketed in 46 countries under various brand names and advertised as completely safe, including for pregnant women experiencing morning sickness. It became one of the world’s best-selling drugs. There was no adequate testing for effects on fetal development before it went to market.
Over 10,000 children were born with severe malformations, including shortened or missing limbs, before two clinicians, Widukind Lenz in Germany and William McBride in Australia, independently identified thalidomide as the cause in 1961. The drug was withdrawn from the UK in November 1961 and from most of the world by 1962. The true number of affected babies will likely never be known. The disaster directly led to stricter drug testing regulations in many countries, particularly the requirement to test for effects on pregnancy before approval.
Dietary Fat Causes Heart Disease
Starting in 1961, the American Heart Association recommended that all Americans reduce their saturated fat intake, replacing animal fats with polyunsaturated vegetable oils to protect against heart disease. The U.S. Senate’s 1977 Dietary Goals formalized this into a target of limiting saturated fat to about 10% of total calories. By 1990, that cap was embedded in the official Dietary Guidelines for Americans and remained there through every subsequent edition.
The evidence behind this consensus has eroded substantially. Over the past decade, more than 20 review papers from independent research teams have concluded that saturated fats have no significant effect on heart attacks, strokes, cardiovascular mortality, or total mortality. A major international study (PURE) found that saturated fat was not associated with heart attack risk and was actually linked to lower stroke risk and lower total mortality. A 2021 review in the Journal of the American College of Cardiology, co-authored by four members of previous dietary guidelines committees, found “no robust evidence” that population-wide caps on saturated fat will prevent cardiovascular disease. An analysis of the studies used by the 2020 Dietary Guidelines Advisory Committee found that 88% did not support a link between saturated fat and heart disease. The guidelines haven’t fully caught up with the science.
How Science Catches Its Own Mistakes
Every example above follows a similar arc: a confident consensus, accumulating contradictory evidence, resistance from the establishment, and eventual correction. This process isn’t automatic. Someone has to actively challenge the prevailing view, collect new data, and endure the pushback.
The primary mechanism for self-correction is replication. When independent researchers repeat a study using the same methods and get different results, confidence in the original finding erodes. A single failed replication raises questions; a string of them eventually forces the field to discard the original result. Peer review acts as a filter before publication, though findings sometimes reach the public before formal review is complete. More recently, researchers have added tools like reproducibility checks, in which the original data are reanalyzed to verify the reported numbers, and systematic screening for statistical inconsistencies in published papers.
These mechanisms work, but they work slowly. Continental drift took 50 years to gain acceptance. The germ theory of disease needed decades to displace miasma. The saturated fat consensus held for over half a century before serious cracks appeared. Science corrects itself not because scientists are uniquely open-minded, but because the system eventually forces confrontation with evidence that can’t be explained away.

