History of Blood Transfusion: A Complete Timeline from 1492 to Modern Medicine
Trace the complete history of blood transfusion medicine from Pope Innocent VIII in 1492 through blood typing, blood banking, and HIV screening milestones.
Key Takeaways
- The first recorded transfusion attempt occurred in 1492 — it failed, and the patient died
- Karl Landsteiner's 1901 discovery of blood groups A, B, and O made safe transfusion possible
- The first U.S. hospital blood bank opened in Chicago in 1937, established by Bernard Fantus, who coined the term "blood bank"
- The Rh factor was discovered in 1939–1940, preventing countless fatal transfusion reactions and cases of hemolytic disease of the newborn
- HIV screening of donated blood began in 1985, transforming transfusion safety
Why Does the History of Blood Transfusion Matter?
Understanding how transfusion medicine evolved explains both why today’s blood supply is remarkably safe and why specific safeguards — blood typing, disease screening, cross-matching — exist at all. Each advance came in response to a failure, often a fatal one. The story runs from desperate medieval experiments to one of modern medicine’s most refined and life-saving procedures.
What Were the Earliest Transfusion Attempts? (1492–1678)
The first recorded transfusion attempt occurred in 1492, when Pope Innocent VIII reportedly received blood from three young men as treatment for illness. It didn’t help — the pope died by the end of that year, and the three donors also perished.
In the 1660s, physicians began experimenting more systematically. Animal-to-human transfusions were attempted across Europe, driven by the theory that animal blood might be purer or more vigorous than human blood. The results were predictably grim. Fatal reactions were common, and by 1678, animal-to-human transfusions were outlawed in France, England, and Italy. The field stalled for over a century.
How Did James Blundell Change Transfusion Medicine? (1818)
The modern era of transfusion began in 1818, when British physician James Blundell performed the first successful documented human-to-human blood transfusion. Working in London, Blundell used a syringe to transfer blood from a husband to his wife, who was suffering from postpartum hemorrhage.
Blundell went on to perform 10 transfusions between 1818 and 1829, with about half of them succeeding. He correctly concluded that only human blood should be used in human patients — a principle that had eluded physicians for centuries. By the 1840s, transfusion was being used to treat hemophilia, and antiseptic techniques in surgery improved overall outcomes through the 1860s.
When Did Blood Typing Transform Transfusion Safety? (1901–1912)
The single most important advance in transfusion history was Karl Landsteiner’s discovery of blood groups in 1901. Working in Vienna, Landsteiner identified that human blood fell into three groups — A, B, and O — based on the presence or absence of specific antigens on red blood cells. A fourth group, AB, was identified in 1902.
This explained why some transfusions worked and others caused fatal reactions: transfusing incompatible blood types triggered an immune attack on the donated cells. Cross-matching techniques, introduced in 1907, allowed physicians to test compatibility before transfusing. By 1912, blood type matching had become a formal part of transfusion practice, drastically reducing reaction rates.
How Was Blood Storage Made Possible? (1914–1918)
Before the early 1900s, transfusions required direct vein-to-vein transfer — donor and recipient had to be in the same room, often the same bed. Two developments during World War I changed that permanently.
In 1914–1915, researchers discovered that sodium citrate prevented blood from clotting, making storage and transport possible for the first time. Building on this, Francis Peyton Rous and J.R. Turner developed a citrate-glucose solution that kept blood viable for several days. By 1917, the first blood depot had been established on the Western Front, allowing stored blood to be shipped to field hospitals and used in mass casualty situations.
When Did the First Blood Banks Open? (1930s)
The world’s first fully functional blood bank opened in Leningrad in 1932. In the United States, the first hospital blood bank opened at Cook County Hospital in Chicago in 1937, established by Dr. Bernard Fantus — who also coined the term “blood bank.” The concept spread rapidly, and by 1950, approximately 1,500 hospital blood banks were operating across the United States.
What Was the Rh Factor Discovery? (1939–1940)
In 1939–1940, Karl Landsteiner (the same scientist who discovered ABO types) and Alexander Wiener identified the Rh factor — a second antigen system on red blood cells. Individuals who carry the Rh antigen are Rh-positive; those who don’t are Rh-negative.
The Rh discovery solved a persistent medical mystery: why some transfusions between ABO-compatible donors and recipients still caused severe reactions, and why some infants were born with severe hemolytic disease. Rh incompatibility had been silently killing patients and newborns. Routine Rh testing became standard within a few years and remains one of the most critical checks before any transfusion.
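Taken together, the ABO and Rh systems define the compatibility rule used in cross-matching: donor red cells must carry no antigen the recipient lacks. A minimal sketch of that rule is below — an illustration of the logic only, not clinical software; the function and dictionary names are invented for this example.

```python
# Illustrative sketch of the ABO/Rh red-cell compatibility rule -- not clinical software.

# Antigens present on red cells for each ABO group.
ABO_ANTIGENS = {"O": set(), "A": {"A"}, "B": {"B"}, "AB": {"A", "B"}}

def red_cells_compatible(donor: str, recipient: str) -> bool:
    """Return True if donor red cells can safely be given to the recipient.

    Blood types are strings like 'O-', 'A+', 'AB+'.
    Rule: every antigen on the donor's red cells (ABO and Rh) must also be
    present in the recipient, so the recipient has no antibodies against it.
    """
    d_abo, d_rh = donor[:-1], donor[-1]
    r_abo, r_rh = recipient[:-1], recipient[-1]
    abo_ok = ABO_ANTIGENS[d_abo] <= ABO_ANTIGENS[r_abo]
    rh_ok = d_rh == "-" or r_rh == "+"  # Rh-negative cells lack the Rh antigen
    return abo_ok and rh_ok

print(red_cells_compatible("O-", "AB+"))  # True: O-negative is the universal red-cell donor
print(red_cells_compatible("A+", "O-"))   # False: recipient has anti-A antibodies, and Rh mismatches
```

This is why O-negative blood is stocked for emergencies: its red cells carry neither A, B, nor Rh antigens, so the subset check passes for every recipient.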
How Did Component Therapy Revolutionize Treatment? (1940s–1970s)
World War II accelerated blood banking dramatically. Dr. Charles Drew developed techniques for large-scale plasma preservation, enabling massive blood drives and the supply chains that sustained Allied forces. After the war, the technology used to separate plasma from whole blood was extended to isolate individual blood components: red cells, platelets, clotting factors, and plasma.
Component therapy — giving patients only the specific component they need rather than whole blood — became standard practice from the 1960s onward. It reduced disease transmission risk, allowed multiple patients to benefit from a single donation, and created specialized products like Factor VIII concentrate for hemophilia patients.
When Did Disease Screening Begin? (1970s–1990s)
The contamination of the blood supply with HIV in the early 1980s — which infected approximately 10,000 hemophiliacs and 12,000 other patients before the cause was identified — prompted urgent action and created the modern era of blood safety.
Key disease screening milestones:
| Year | Screening Development |
|---|---|
| 1970s | Hepatitis B surface antigen testing required |
| 1985 | HIV-1 antibody testing begins |
| 1990 | Hepatitis C antibody screening introduced |
| 1992 | HIV-2 antibody screening added |
| 1996 | p24 HIV antigen testing added, shortening the detection window |
| 1999 | Nucleic acid amplification testing (NAT) study begins for HIV and hepatitis C |
Each layer of screening further reduced the already-small but real risk of transfusion-transmitted disease. By 1999, the U.S. blood supply was safer than at any point in history — a direct result of failures that forced systemic change.
What Does Modern Transfusion Medicine Look Like Today?
Today’s blood transfusion involves multiple overlapping safety checks: donor health screening, 11–12 mandatory laboratory tests on every donated unit, ABO and Rh cross-matching, and computer-verified identity checks to prevent wrong-patient errors. Nucleic acid testing can detect viral infections in donor blood within days of exposure rather than weeks.
The field continues to advance with research into blood substitutes, pathogen reduction technologies, and extended antigen matching for patients with chronic transfusion needs. But the foundation was laid by the long sequence of discoveries detailed above — each one built on a predecessor’s failure, each one saving lives that couldn’t be saved before.
For a patient-facing guide on what to expect from a modern transfusion, see What to Expect from a Blood Transfusion.
Frequently Asked Questions
When was the first blood transfusion performed?
The first recorded attempt was in 1492, on Pope Innocent VIII; the first successful documented human-to-human transfusion was performed by James Blundell in London in 1818.
Who discovered blood types?
Karl Landsteiner discovered the A, B, and O blood groups in 1901; the fourth group, AB, was identified in 1902.
When did blood banking begin?
The world's first fully functional blood bank opened in Leningrad in 1932; the first U.S. hospital blood bank opened at Cook County Hospital in Chicago in 1937.
When did HIV testing of blood donations begin?
HIV-1 antibody testing of donated blood began in 1985, with HIV-2 antibody screening added in 1992 and p24 antigen testing in 1996.
Disclaimer: This article is for informational purposes only and does not constitute medical advice. Always consult a qualified healthcare provider for diagnosis and treatment recommendations.