Penicillin was the most important medicine to make the leap from laboratory curiosity to mass-produced, life-saving drug during World War II. Although Alexander Fleming discovered the mold’s bacteria-killing properties in 1928, it took the pressures of wartime to turn penicillin into a medicine that could actually reach patients. Several other medical breakthroughs also emerged from the war years, including the first antibiotic effective against tuberculosis, dried blood plasma for transfusions, and the first licensed flu vaccine.
Penicillin: From Bedpans to Industrial Tanks
For over a decade after Fleming’s discovery, penicillin remained a lab oddity. Scientists knew it could kill bacteria in a petri dish, but nobody had figured out how to produce it in useful quantities. That changed in the summer of 1941, when Oxford scientists Howard Florey and Norman Heatley visited the United States and convinced American officials to take the compound seriously.
What followed was one of the most remarkable manufacturing sprints in history. The War Production Board coordinated 21 companies, 5 university research groups, and multiple government agencies to scale up production. In just five years, they went from growing crude penicillin in bedpans and milk bottles to fermenting highly refined penicillin in 10,000-gallon tanks. Between May and June of 1944 alone, monthly production increased by more than 250 times. By January 1945, U.S. factories were producing 4 million sterile packages of penicillin per month. Two months later, the drug was released for civilian use.
The key to this speed was an unusual wartime policy: the government lifted restrictions on scientific exchange that would normally protect trade secrets, allowing companies and researchers to freely share breakthroughs with one another.
How Penicillin Changed Battlefield Survival
The numbers tell the story clearly. The death rate among hospitalized wounded soldiers dropped to about 3.9% in World War II, compared to 8% in World War I. Pneumonia deaths plummeted even more dramatically. Between 1917 and 1918, over 17,000 American soldiers died of pneumonia. Between the U.S. entry into World War II and March 1945, only 70 soldiers died from pneumonia and its complications.
Gangrene, one of the most feared complications of battlefield wounds, dropped to roughly 1.5 cases per 1,000 soldiers treated with penicillin. Among prisoners of war who received only sulfa drugs (because penicillin was reserved for active troops), gangrene struck as many as 30 per 1,000. Wounds treated with sulfa drugs failed to heal properly 23% of the time, compared to 17% for penicillin-treated wounds. For soldiers with femur fractures, the infection death rate was just over 1% with penicillin versus more than 8% with standard treatments of the era.
Sulfa Drugs Came First
Penicillin gets the most attention, but sulfonamide drugs were actually the first antibacterial agents widely used in the war. From 1941 onward, the U.S. Army issued sulfonamide powder sachets in every soldier’s personal first-aid kit. When wounded, a soldier (or his buddy) would sprinkle the powder directly onto the open wound to prevent bacterial infection. A standard medic’s kit by 1944 included three packets of eight sulfadiazine tablets each.
Sulfa drugs saved countless lives, but they had real limitations. They were less effective than penicillin, carried more side effects, and couldn’t touch many of the infections penicillin could handle. Once penicillin became available in large quantities, sulfa drugs were gradually pushed into a supporting role.
Streptomycin: The First Weapon Against Tuberculosis
In 1943, while the penicillin effort was ramping up, a team at Rutgers University led by Selman Waksman discovered streptomycin. This was a landmark because it filled a gap penicillin couldn’t: tuberculosis was the world’s leading infectious killer at the time, and penicillin had no effect on it. Streptomycin was the first practical antibiotic that could fight the tuberculosis bacterium, and it also worked against plague, cholera, typhoid, and several other infections that penicillin missed. Millions of tuberculosis recoveries have since been credited to streptomycin.
Dried Blood Plasma for the Battlefield
Before World War II, blood transfusions required fresh, refrigerated whole blood, making them nearly impossible in combat zones. Charles Drew, a surgeon and researcher, developed a technique to dry and reconstitute plasma, dramatically extending its storage life and making it shippable across oceans.
Drew’s work was first tested at scale through the “Blood for Britain” campaign in 1940, which collected over 14,500 blood donations and shipped more than 5,000 liters of plasma solution from New York hospitals to Britain. The process involved separating plasma by centrifuge, adding an antibacterial preservative, running bacterial tests, and diluting with saline before sealing and packing. After that campaign’s success, the American Red Cross launched a pilot program in early 1941 to mass-produce dried plasma for U.S. military personnel, with Drew as assistant director. Dried plasma became standard equipment for combat medics and is estimated to have saved thousands of lives by making transfusions possible close to the front lines.
Morphine Syrettes and Standardized Pain Relief
World War II also introduced a new way to deliver pain medication on the battlefield: the morphine syrette. Shaped like a tiny toothpaste tube with a needle attached, it let medics, or even wounded soldiers themselves, inject a precise 10-milligram dose of morphine subcutaneously without any training in syringe technique. The user broke the seal, inserted the needle under the skin, and squeezed the flexible tube to deliver the dose. By 1944, a standard U.S. Army medic kit included two boxes of five morphine syrettes each. This was the first time pain management was truly standardized for individual soldiers in the field.
Atabrine for Malaria and the First Flu Vaccine
In the Pacific theater, malaria was a bigger threat than enemy fire in some campaigns. Japan had captured most of the world’s quinine supply early in the war, forcing the Allies to rely on a synthetic alternative called Atabrine. It worked, but soldiers hated it. Atabrine frequently caused diarrhea, headaches, and nausea, and it temporarily turned skin bright yellow. Compliance was such a problem that the military launched poster campaigns featuring a fictional mosquito character to remind troops to take their pills.
The war years also produced the first licensed influenza vaccine. Developed by Thomas Francis and Jonas Salk at the University of Michigan with U.S. Army support, the inactivated flu vaccine was tested for safety and effectiveness on military personnel and approved for military use in 1945, then licensed for civilian use the following year. Salk, of course, would go on to develop the polio vaccine a decade later.
Why the War Mattered for Medicine
World War II didn’t just produce individual drugs. It created a new model for how medicines get developed. The penicillin project proved that government coordination, shared research, and industrial-scale production could take a promising compound from the lab bench to millions of patients in just a few years. That collaborative framework became the template for postwar pharmaceutical development. The war also turned clinical trials, standardized field medicine, and mass vaccination into routine practice, all of which shaped the modern medical system that followed.

