How Has Medicine Changed Over the Last 100 Years?

A century ago, a case of pneumonia could easily kill you, surgeons operated with limited anesthesia and crude tools, and a diagnosis of diabetes in a child was a death sentence. The transformation since then has been so sweeping that a physician from the 1920s would barely recognize modern medicine. Global life expectancy has roughly doubled over that span, driven not by a single breakthrough but by overlapping revolutions in antibiotics, surgery, imaging, vaccines, genetics, and digital technology.

Infections Went From Death Sentences to Treatable Illnesses

The single biggest shift in the last 100 years is the conquest of infectious disease. In 1900, the leading killers worldwide were pneumonia, tuberculosis, and gastrointestinal infections. Today, the world’s top cause of death is heart disease, responsible for about 13% of all deaths globally. That flip from infection to chronic disease tells the story of modern medicine in a single statistic.

Penicillin, discovered in 1928 and mass-produced by the mid-1940s, drove much of this change. After penicillin became widely available in 1947, mortality from infections it could treat (pneumonia, scarlet fever, syphilis) dropped by 58%. The drug didn’t just save individual lives. In Italy, where detailed records were kept, longstanding regional gaps in death rates from these diseases closed rapidly once penicillin reached the population. Antibiotics turned formerly lethal infections into problems that could be solved with a short course of pills.

That success created its own challenge. Overuse of antibiotics has fueled drug-resistant bacteria, making some infections harder to treat today than they were 30 years ago. But the net effect is still overwhelmingly positive: diseases that filled hospital wards in the 1920s are now minor medical events for most people.

Vaccines Eliminated Entire Diseases

Smallpox killed roughly 3 out of every 10 people who contracted it. Thanks to a global vaccination campaign, the last natural case occurred in 1977, and the World Health Assembly officially declared the disease eradicated in 1980. It remains the only human disease ever fully wiped out.

Vaccines developed over the past century now protect against dozens of diseases, including polio, measles, diphtheria, tetanus, and more recently hepatitis B and HPV. The result is visible in infant mortality data: in 1916, about 101 of every 1,000 babies born in the United States died before their first birthday. By 2015, even the worldwide infant mortality rate had fallen to roughly 32 per 1,000, and in wealthy countries it had dropped below 5. Vaccines weren’t the only reason for that decline (better sanitation, nutrition, and hospital care all played roles), but they were among the most powerful of those tools.

Surgery Became Smaller and Safer

Early 20th-century surgery meant large incisions, long hospital stays, high infection risk, and painful recoveries. The introduction of minimally invasive techniques, particularly laparoscopic surgery beginning in the 1980s, changed what an operation looks like for patients.

The numbers are striking. In a comparative study of laparoscopic versus traditional open surgery, patients who had the less invasive approach spent an average of 2.1 days in the hospital, compared to 4.4 days for open surgery. Surgical site infections within 30 days were roughly half as common (about 5% versus 9%). Most telling for patients: those who had laparoscopic procedures returned to normal activities in about 6 days, while the open surgery group took an average of 13 days. What once required weeks of bed rest now often means going home the next day.

Beyond laparoscopy, robotic-assisted surgery has added even more precision, allowing surgeons to operate through tiny incisions with enhanced visualization and control.

Imaging: Seeing Inside the Body

For the first half of the 20th century, the X-ray (discovered in 1895) was essentially the only way to see inside a living person, and it could show only bones and other dense structures clearly. The diagnostic landscape changed dramatically in the 1970s and 1980s with two inventions that are now cornerstones of medicine.

The first CT scan of a patient’s brain took place in 1971, giving doctors the ability to see soft tissue in cross-sectional slices. MRI followed shortly after: the first images were produced in 1973, and the first full human body scan happened in 1977. By the early 1980s, MRI machines were being installed in hospitals. These technologies let doctors detect tumors, blood clots, torn ligaments, and brain abnormalities without making a single incision. Conditions that previously required exploratory surgery to diagnose can now be identified in a 30-minute scan.

Chronic Disease Replaced Infection as the Main Threat

As antibiotics, vaccines, and sanitation eliminated many infectious killers, people lived long enough to develop the diseases of aging. Heart disease, stroke, cancer, diabetes, and Alzheimer’s disease now dominate the list of leading causes of death worldwide. This shift, sometimes called the epidemiological transition, is one of the defining features of modern medicine.

The response has been a massive expansion of treatments for chronic conditions. Blood pressure medications, cholesterol-lowering drugs, chemotherapy, radiation therapy, and targeted cancer treatments all emerged over the past century. Cancer survival rates have improved dramatically for many types, though the disease remains a leading killer. Heart disease mortality has fallen significantly in wealthy countries thanks to a combination of better drugs, lifestyle interventions, and procedures like stenting and bypass surgery.

Insulin: From Animal Organs to Genetic Engineering

The story of insulin captures the arc of pharmaceutical progress in miniature. Before the 1920s, a type 1 diabetes diagnosis in a child was essentially fatal. The discovery that insulin from cow and pig pancreases could control blood sugar changed that overnight, but the supply chain was fragile. Producing insulin required enormous quantities of animal organs, and impurities in the product caused immune reactions that made some patients resistant to the drug.

Advances in purification during the 1970s reduced those adverse reactions significantly. Then in 1978, scientists used recombinant DNA technology to produce biosynthetic human insulin, a version made by bacteria programmed with the human insulin gene rather than extracted from animals. The FDA approved this product, Humulin, in 1982. It was the first medical product of any kind made using genetic engineering. The shift solved supply problems, reduced immune reactions, and opened the door to further insulin variants designed to act faster or last longer in the body.

Organ Transplantation Went From Impossible to Routine

The first successful human organ transplant took place in Boston in 1954, when a man received a kidney from his identical twin brother. The recipient went on to live another eight years, marry, and have two children. At the time, it was a medical miracle. Today, transplantation is a standard, if still complex, part of medicine. Since the U.S. Organ Procurement and Transplantation Network began keeping records in 1988, over 800,000 transplants have been performed in the United States alone.

The key breakthroughs that made this possible were drugs that suppress the immune system enough to prevent organ rejection without leaving the patient defenseless against infection. Those medications, developed primarily in the 1970s and 1980s, turned transplantation from a procedure that only worked between identical twins into one that could use organs from unrelated donors.

AI and Digital Tools Are Reshaping Diagnosis

The most recent wave of change involves artificial intelligence and digital health. AI systems trained on medical images have shown they can match or exceed human specialists in certain diagnostic tasks. In studies comparing AI to board-certified dermatologists at classifying skin lesions, AI models outperformed the specialists in 63% of cases. For detecting malignant skin lesions specifically, AI systems achieved accuracy rates between 85% and 97%. When dermatologists worked alongside AI rather than competing against it, their diagnostic precision improved further.

Telemedicine, once a niche technology, surged during the COVID-19 pandemic. Among developed countries, virtual consultations jumped from 11% of all doctor visits in 2019 to 21% in 2020. While usage has settled somewhat since then, remote visits remain far more common than they were before the pandemic, expanding access for people in rural areas or with mobility limitations.

What Ties It All Together

The pattern across every branch of medicine is the same: problems that were invisible became detectable, conditions that were fatal became manageable, and procedures that were brutal became precise. A century ago, doctors relied heavily on physical examination and intuition. Today, they have imaging that reveals structures smaller than a millimeter, drugs engineered at the molecular level, and algorithms that can scan thousands of images in seconds.

The pace of change has also accelerated. It took nearly two decades to go from the discovery of penicillin to mass production. The first COVID-19 vaccines went from genetic sequencing of the virus to emergency authorization in under a year. Each generation of tools makes the next generation faster to develop, which means the next 100 years of medicine will likely look even less like today than today looks like 1925.