History of Blood Banking: From 1901 Discovery to Modern Safety Standards

Trace the history of blood banking from Karl Landsteiner's 1901 blood type discovery through the first blood banks, wartime innovations, and modern nucleic acid testing.


Key Takeaways

  • Karl Landsteiner discovered the ABO blood group system in 1901, making safe transfusion scientifically possible for the first time
  • Bernard Fantus coined the term “blood bank” in 1936 when he established the first U.S. hospital blood bank at Cook County Hospital in Chicago
  • By 1999, nucleic acid amplification testing allowed direct virus detection in donated blood — closing the window period that had caused HIV infections

Blood transfusion seems like an obvious idea. Two bodies, same basic fluid — just move it from one to the other. Yet for most of human history, attempts to transfuse blood killed far more people than they saved. The history of blood banking is the story of how medicine finally figured out why, and built an entire scientific infrastructure around a single life-saving insight.

The Problem Before the Discovery

Early transfusion attempts — dating to the 1600s — were frequently fatal. Physicians didn’t understand why some transfusions succeeded while others triggered rapid death. They blamed contamination, the use of animal blood (early experiments ranged from dog-to-dog transfusions to giving patients lamb’s blood), or failures of technique.

The real reason was invisible to nineteenth-century medicine: blood isn’t universally compatible. Giving someone blood of the wrong type triggers an immune response in which the recipient’s antibodies attack the donor’s red blood cells, causing them to clump and rupture in the bloodstream. The result is hemolytic shock — often fatal.

Without knowing this, there was no way to make transfusion reliably safe.

1901: The Discovery That Changed Everything

Karl Landsteiner, an Austrian physician working in Vienna, published a deceptively simple observation in 1901. When he mixed red blood cells from different people with serum from different donors, sometimes the cells clumped (agglutinated) and sometimes they didn’t. The pattern wasn’t random.

Landsteiner identified three distinct patterns and named the responsible blood groups A, B, and O. In 1902, his colleagues Decastello and Sturli identified a fourth group — AB. The ABO blood typing system was complete.

The clinical implication was immediate: transfuse compatible blood types and the clumping wouldn’t happen. Transfuse incompatible types and you’d trigger a potentially fatal reaction. For the first time, blood typing gave physicians a way to match donors and recipients before transfusion.

Landsteiner received the Nobel Prize in Physiology or Medicine in 1930. His 1901 paper is widely considered one of the most consequential publications in medical history.

1914–1916: Solving the Storage Problem

Knowing blood types was necessary but not sufficient. Blood still had to move from donor to recipient directly — vein to vein, in real time. There was no way to collect blood in advance and store it.

The breakthrough came with anticoagulants. In 1914, researchers discovered that sodium citrate prevented blood from clotting without harming red cells — a critical finding that separated collection from transfusion in time. Blood could now be drawn into citrate solution and stored.

In 1916, Francis Rous and J.R. Turner introduced a citrate-glucose solution that kept red blood cells viable for several days. Glucose provided energy for the metabolically active cells; citrate prevented clotting. The combination made storage feasible for the first time.

These developments transformed transfusion from a two-person procedure requiring donor and recipient to be in the same room simultaneously into something that could be planned, managed, and scaled.

1932: The First Blood Bank

With blood typing solved and short-term storage possible, the logical next step was a centralized repository — a place where donated blood could be collected, stored, and distributed on demand.

In 1932, the first fully functional blood bank opened in a Leningrad hospital. The concept — drawing blood from donors in advance, storing it, and dispensing it as needed — was genuinely novel. Blood was being treated like a bankable resource for the first time.

1936: The Term “Blood Bank” Is Born

Four years later, the idea crossed the Atlantic. Dr. Bernard Fantus, a physician at Cook County Hospital in Chicago, established the first hospital blood bank in the United States in 1936. Fantus is credited with coining the term “blood bank” itself — the banking metaphor captured exactly what made the concept work: deposits made by healthy donors in advance, withdrawals by patients in need.

Cook County’s blood bank demonstrated that a large urban hospital could manage its own blood supply systematically, reducing dependence on last-minute donor appeals and family members rushing to donate for individual patients.

1940: Drew’s Plasma Work and Wartime Scale

Two developments in 1940 accelerated blood banking enormously.

First, Landsteiner and Alexander Wiener discovered the Rh blood group system — identifying the Rh positive and negative designations that complete the blood type description still used today. This solved a second compatibility problem: Rh-incompatible transfusions could cause reactions even between ABO-matched donors and recipients.

Second, Dr. Charles Drew completed landmark research at Columbia Presbyterian Hospital documenting techniques for long-term blood plasma preservation. Plasma — the liquid component of blood separated from its cells — was more stable than whole blood, could be freeze-dried, and didn’t require ABO matching. It could be stored for weeks and shipped internationally.

Drew’s work had immediate wartime significance. As World War II escalated, the Allied forces needed enormous quantities of plasma for battlefield casualties. The U.S. and British military established large-scale plasma collection programs, collecting millions of units from civilian donors. Blood banking went from hospital experiment to national infrastructure.
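The two compatibility systems, ABO (1901) and Rh (1940), reduce red cell matching to a single rule: donor cells must carry no antigen the recipient lacks. Here is a minimal Python sketch of that rule; the antigen table and function name are illustrative, and this is a simplified teaching model rather than a clinical crossmatch, which also involves antibody screening.

```python
# Antigens present on red cells for each ABO/Rh type.
# A recipient's immune system attacks any donor antigen the recipient lacks.
ANTIGENS = {
    "O-": set(),         "O+": {"Rh"},
    "A-": {"A"},         "A+": {"A", "Rh"},
    "B-": {"B"},         "B+": {"B", "Rh"},
    "AB-": {"A", "B"},   "AB+": {"A", "B", "Rh"},
}

def red_cell_compatible(donor: str, recipient: str) -> bool:
    """Donor red cells are safe when they introduce no antigen
    the recipient's blood lacks (subset check on antigen sets)."""
    return ANTIGENS[donor] <= ANTIGENS[recipient]

print(red_cell_compatible("O-", "AB+"))  # O-negative: universal red cell donor
print(red_cell_compatible("A+", "O-"))   # incompatible: introduces A and Rh antigens
```

The subset check captures why O-negative blood (no A, B, or Rh antigens) can be given to anyone, while AB-positive recipients (all three antigens) can receive from anyone.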

1947: Standards and Organization

The American Association of Blood Banks (AABB) formed in 1947, establishing the first national standards for blood collection, testing, storage, and transfusion. Before the AABB, blood banking practices varied widely between institutions. Standardization was essential for safety at scale — ensuring that a unit of blood collected in one city met the same quality requirements as one collected in another.

1950: Plastic Bags and Expansion

By 1950, approximately 1,500 hospital blood banks operated across the United States — reflecting how thoroughly the concept had embedded itself in hospital medicine within just fourteen years of Cook County’s pioneering effort.

The same period brought a practical improvement that’s easy to overlook: plastic bags replaced glass bottles as blood containers. Glass containers required careful sterilization, broke under stress, and created air pressure complications during collection. Plastic bags were lighter, flexible, could be heat-sealed, and could easily be subdivided into smaller bags for component separation. This seemingly mundane change was a significant enabler of component therapy.

1972–1977: Federal Regulation

The AIDS crisis of the 1980s is the most famous instance of blood supply contamination, but regulatory concern predated it. The FDA began formally regulating blood banking in 1972 and established systematic inspection standards for blood banks in 1977.

The regulatory framework required blood banks to maintain records, follow standardized testing protocols, and demonstrate process controls. It was the beginning of treating blood as a regulated biological product rather than simply a clinical service.

1980s: The HIV Crisis and Its Consequences

The HIV epidemic exposed how badly the blood supply could fail. Before reliable HIV testing existed, an unknown number of blood transfusion recipients — and nearly half of the U.S. hemophilia population who relied on pooled plasma-derived clotting factors — were infected.

The response was a transformation of blood safety practices:

  • Donor screening questionnaires targeting behavioral risk factors
  • ELISA testing for HIV antibodies (introduced 1985)
  • Heat treatment of plasma-derived clotting factors to kill viruses
  • Growing push toward recombinant clotting factors from non-blood sources
  • Leukoreduction to reduce CMV transmission

The HIV crisis permanently changed how the public and regulators thought about blood safety. It also drove the growth of autologous donation — banking your own blood before elective surgery — as patients sought to minimize any transfusion risk.

1999: The Era of Nucleic Acid Testing

Even after HIV antibody testing was introduced in 1985, a significant vulnerability remained: the “window period.” Newly infected donors might not yet have developed detectable antibodies, meaning infected blood could pass antibody screening. Window period infections were responsible for ongoing HIV and hepatitis C transmissions through the 1990s.

By 1999, nucleic acid amplification testing (NAT) — which directly detects viral genetic material rather than the body’s antibody response — was implemented across U.S. blood centers. NAT narrowed the window period dramatically: the window for HIV detection dropped from 22 days to about 11 days; for hepatitis C, from 70 days to about 10 days.

The 1999 introduction of NAT represents the most recent major safety milestone in blood banking history. Today’s blood supply is tested for HIV-1 and -2, hepatitis B and C, HTLV-I and -II, syphilis, West Nile virus, Trypanosoma cruzi (Chagas disease), and Zika virus.

The Modern System

A century after Landsteiner’s discovery, blood banking has become a sophisticated global infrastructure. The U.S. alone collects approximately 14 million units of whole blood annually, processed into some 20 million components, distributed through hundreds of blood centers to thousands of hospitals.

The challenges that remain aren’t primarily scientific — they’re logistical and social. Chronic shortages of platelets and rare blood types persist. Aging donor populations and declining donation rates create periodic supply crises. The blood products that modern medicine depends on still require continuous human participation to exist.

Understanding this history explains why blood banking professionals describe what they do in terms of public trust. The entire system — from donor to patient — runs on voluntary cooperation, scientific standards, and regulatory oversight built up slowly over a century of hard-won lessons.

Frequently Asked Questions

Who discovered the ABO blood groups?
Austrian physician Karl Landsteiner discovered the first three human blood groups — A, B, and O — in 1901 by observing that mixing blood from different people sometimes caused clumping. A fourth type, AB, was identified by his colleagues in 1902. Landsteiner received the Nobel Prize in Physiology or Medicine in 1930 for this foundational work.
When was the first blood bank established?
The first fully operational blood bank opened in 1932 at a hospital in Leningrad (now St. Petersburg), Russia. The first U.S. blood bank was established in 1936 by Dr. Bernard Fantus at Cook County Hospital in Chicago — who also coined the term “blood bank” to describe the concept of storing donated blood for future use.
What role did World War II play in blood banking history?
World War II was the pivotal moment for blood banking. The enormous demand for blood during combat operations drove rapid expansion of donor programs and blood collection infrastructure. Dr. Charles Drew's 1940 work on plasma preservation enabled blood to be shipped overseas. By the war's end, large-scale blood banking had moved from experimental to essential infrastructure.
When did the FDA start regulating blood banks?
The FDA began formally regulating blood resources in 1972 and established inspection standards for blood banks in 1977. This regulatory framework followed growing recognition that blood transfusion carried infectious disease risks that required systematic oversight — a concern that would be validated by the HIV crisis of the 1980s.
Sources (2)
  1. American Association of Blood Banks History
  2. Red Cross History of Blood Banking

Disclaimer: This article is for informational purposes only and does not constitute medical advice. Always consult a qualified healthcare provider for diagnosis and treatment recommendations.