Blood transfusions became a routine medical procedure during World War II, when the American Red Cross collected 13.4 million pints of blood from 6.6 million donors between 1941 and 1945. But the path from experimental curiosity to everyday medicine stretched across more than a century, shaped by a handful of breakthroughs that each solved a specific problem standing in the way.
Early Experiments in the 1800s
The British obstetrician James Blundell performed the first documented human-to-human blood transfusion in 1818, reporting “temporary success” after giving blood to a dying patient. His method was crude: he collected venous blood in a cup and transferred it with a syringe. Blundell chose human blood deliberately, having observed in animal experiments that cross-species transfusions failed. His work proved the concept was viable, but without any understanding of why some transfusions killed patients and others didn’t, the procedure remained a desperate last resort for decades.
Throughout the rest of the 1800s, transfusions were performed sporadically and unpredictably. Physicians knew fatal reactions happened but blamed them on technical errors or patient frailty. The prevailing belief was that all human blood was essentially the same. That assumption kept transfusion medicine stuck in place for nearly a century.
Blood Types Changed Everything in 1901
The single most important breakthrough came in 1901, when the Austrian scientist Karl Landsteiner identified the ABO blood group system. He showed that human blood falls into distinct types, and mixing incompatible types causes the red blood cells to clump together, a reaction that can be fatal. This explained, for the first time, why some transfusions succeeded while others killed. Landsteiner’s discovery earned him a Nobel Prize and the title “Father of Transfusion Medicine.”
With blood typing, doctors could now match donors and recipients before a transfusion. The procedure went from a gamble to something approaching reliable medicine. But a major practical obstacle remained: blood clots within minutes of leaving the body, meaning donor and patient had to be side by side, often with their blood vessels surgically connected.
Storing Blood Became Possible Around 1915
The clotting problem was solved between 1914 and 1916, when Albert Hustin in Brussels, Luis Agote in Buenos Aires, and Richard Lewisohn in New York independently showed that sodium citrate in low concentrations could safely prevent blood from clotting. Lewisohn, at Mount Sinai Hospital, worked out the optimal dose, and Richard Weil demonstrated that citrated blood could be refrigerated and stored, publishing a landmark paper in January 1915. This was a quiet revolution: for the first time, blood could be collected, treated, and stored for later use. The donor no longer needed to be in the same room as the patient.
The timing was critical. World War I had just begun, and battlefield medicine desperately needed a way to get blood to wounded soldiers quickly. In 1917, U.S. Army Captain Oswald Robertson put the new science into practice, administering 22 transfusions of cross-matched, citrate-treated, cold-stored blood to injured soldiers near the front lines. Some of that blood had been stored for up to 26 days. Robertson demonstrated that stored blood worked as well as a fresh, direct transfusion, effectively creating the first mobile blood depot.
The First Blood Bank Opened in 1937
Despite the wartime proof of concept, peacetime medicine was slow to adopt organized blood storage. That changed in 1937, when Bernard Fantus established the first hospital blood bank at Cook County Hospital in Chicago. Fantus coined the term “blood bank” itself, envisioning a system where donations could be deposited and withdrawn as needed, much like money in a financial institution. His model gave hospitals a way to keep a reliable supply on hand rather than scrambling to find a compatible donor each time a patient needed blood.
Other hospitals quickly followed. By the late 1930s, the infrastructure for routine transfusion was finally in place: blood could be typed, stored safely, and distributed from a centralized supply. The missing piece was scale.
World War II Made Transfusions Routine
The massive scale came with World War II. The American Red Cross launched a national blood collection program that ultimately gathered 13.4 million pints of blood from 6.6 million donors, spending nearly $16 million on the effort before the program ended in September 1945. This was blood banking on an industrial scale, with collection centers across the country, standardized processing, and a supply chain that delivered blood products to military hospitals worldwide.
The war did more than save soldiers’ lives. It trained thousands of medical professionals in transfusion techniques, built public trust in blood donation, and created the organizational systems that civilian hospitals would adopt after the war ended. By the late 1940s, blood transfusion had shifted from an extraordinary intervention to a standard part of hospital care in the United States and Europe.
Post-War Refinements in Safety
Transfusions were common by the 1950s, but they weren’t yet as safe as they are today. Landsteiner, working with Alexander Wiener, had identified the Rh factor in 1940, adding another layer of compatibility testing. Rh-negative patients who received Rh-positive blood could develop severe reactions, and Rh incompatibility between an Rh-negative mother and her Rh-positive fetus could cause hemolytic disease of the newborn. Routine Rh testing became standard alongside ABO typing, further reducing transfusion reactions.
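The compatibility rule that emerged from these two discoveries is simple enough to express in a few lines of code: donor red cells are safe only if they carry no ABO antigen the recipient lacks, and Rh-positive cells go only to Rh-positive recipients. Here is a minimal sketch of that rule (the function name and type-string format are illustrative only, not any real clinical system):

```python
# A minimal sketch of red-cell compatibility under the ABO/Rh rules
# described above. Type strings like "O-" are the familiar shorthand;
# this is an illustration, not clinical software.

ABO_ANTIGENS = {"O": set(), "A": {"A"}, "B": {"B"}, "AB": {"A", "B"}}

def red_cells_compatible(donor: str, recipient: str) -> bool:
    """Return True if the donor's red cells carry no antigen the
    recipient's immune system would attack."""
    d_abo, d_rh = donor.rstrip("+-"), donor.endswith("+")
    r_abo, r_rh = recipient.rstrip("+-"), recipient.endswith("+")
    abo_ok = ABO_ANTIGENS[d_abo] <= ABO_ANTIGENS[r_abo]  # antigen subset rule
    rh_ok = (not d_rh) or r_rh  # Rh+ cells only into Rh+ recipients
    return abo_ok and rh_ok

# O- is the universal red-cell donor; AB+ the universal recipient.
assert red_cells_compatible("O-", "AB+")
assert not red_cells_compatible("A+", "O-")
```

This subset logic is why O-negative blood is called the universal red-cell donation and AB-positive patients are called universal recipients.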
The biggest safety concerns that remained were infectious diseases transmitted through donated blood. Screening evolved in waves over several decades. Testing donated blood for hepatitis B became standard practice in the early 1970s. After HIV was identified as the cause of AIDS, antibody screening for the virus was implemented in 1985. Hepatitis C screening followed in the early 1990s, and more sensitive nucleic acid testing, which detects viral genetic material directly, was added for HIV and hepatitis C in the early 2000s. Each new test closed a gap that had previously put recipients at risk.
Blood Transfusions Today
According to the National Blood Collection and Utilization Survey, roughly 10.3 million units of red blood cells were transfused in the United States in 2023, along with about 2.2 million units of platelets and 1.9 million units of plasma. Those numbers have been declining slightly, down about 4% for red blood cells compared with 2021, as surgical techniques have improved and doctors have adopted more conservative transfusion guidelines.
The procedure today bears almost no resemblance to Blundell’s syringe-and-cup method from 1818. Donated blood is separated into components so that a single donation can help multiple patients. Every unit is typed, cross-matched, and screened for infectious diseases. The risk of a serious transfusion reaction is extremely low. What took over a century of incremental discoveries, from blood types to anticoagulants to disease screening, has become one of the most routine and reliable procedures in modern medicine.