Introduction

Damage control resuscitation (DCR) is a bundle of care, first described by Holcomb et al., aimed at reducing death from hemorrhage in patients with severe traumatic bleeding. DCR principles include compressible hemorrhage control; hypotensive resuscitation; rapid surgical control of bleeding; avoidance of the overuse of crystalloids and colloids; prevention or correction of acidosis, hypothermia, and hypocalcemia; and hemostatic resuscitation (blood-based resuscitation) [1]. Remote damage control resuscitation (RDCR) is defined as the prehospital application of DCR concepts. The term RDCR was first published by Gerhardt and has been disseminated by the Trauma Hemostasis and Oxygenation Research (THOR) Network [2, 3].

The number and severity of wounded in the wars in Afghanistan and Iraq, coupled with the collection of clinical data, inspired renewed thinking about the optimal methods to improve outcomes for casualties with traumatic hemorrhagic shock. Motivation to reassess the standard resuscitative approach for severe bleeding came from retrospective studies supporting the earlier use of blood products, including whole blood [4,5,6,7], and from data by Eastridge indicating that the majority of casualties succumb to their wounds before reaching any medical facility with an advanced resuscitation capability, and that the overwhelming majority of these patients (>90%) die from hemorrhage [8]. Advanced life-saving interventions performed in this pre-medical treatment facility (MTF) phase of care can improve outcomes by delivering a casualty with survivable injuries to the surgeon [9, 10].

The history of DCR and RDCR begins well before the inception of these terms; the concepts behind their principles stretch far back into the past. This chapter provides an outline of this history, limited to the fluid resuscitation aspect of DCR/RDCR.

1600s

The history of fluid resuscitation starts with the discovery of the circulatory system. Until that point there were no interventions on the circulatory system, as no one had yet conceived of the blood as being in “circulation”; it was incorrectly assumed that blood was produced in the liver and consumed in the periphery.

In 1628, William Harvey, an English physician educated in Italy at the University of Padua as a student of Hieronymus Fabricius and later at the University of Cambridge, published Exercitatio Anatomica de Motu Cordis et Sanguinis in Animalibus, translated as “An Anatomical Exercise on the Motion of the Heart and Blood in Living Beings” and commonly called De Motu Cordis (On the Motion of the Heart and Blood). This was the first complete, well-researched description of the circulatory system, including both the pulmonary and the systemic circulation. The concept contradicted Galen and the accepted understanding of the age. Harvey calculated cardiac output and demonstrated that the liver could not possibly produce the volume of blood required, as had previously been thought. This bold insight set the stage for new ideas about the treatment of hemorrhage.

Harvey’s description of the circulatory system was rapidly accepted, and it was not long before interventions via the circulatory system were envisioned. The first intravenous (IV) injections were administered by Christopher Wren and Robert Boyle in 1656 in Oxford. An animal bladder was attached to a goose quill, and wine, ale, and opiates were injected into dogs. A mixture of opium and alcohol produced the first IV anesthesia with full recovery; the concept was not carried into clinical practice, and an early chance for pain-free surgery was lost.

Richard Lower conducted research on the cardiopulmonary system and was the first to describe the change in blood after exposure to air via the lungs. In 1666, Lower reported the first blood transfusion. More specifically, Lower showed that transfusion could be used as a life-saving treatment for exsanguination: he bled a dog to the point of death and then saved the animal with a whole blood transfusion from another, larger dog. In 1667, blood was first transfused from animal to man by Jean Baptiste Denis and by Lower. It must be noted that the transfusion of blood from a lamb into man was not a treatment for hemorrhage but rather for “madness.” After much medical and theological debate, the practice of transfusion was banned by the French and later by the Pope. Transfusion thus fell into disrepute and the practice faded, although the theory was passed on.

1700s

Just as it was important to identify and describe the circulatory system, it was equally important to identify and describe the condition of hemorrhagic shock; this has proven particularly difficult because of the complexity of the pathology. In 1731, the French surgeon Henri François Le Dran, in a publication titled Observation de Chirurgie, described the collapse of vital functions, ending in death, after a patient was hit by a missile. He called it secousse, which translates from the French as “shock” [11].

1800s

In 1817, Dr. John Henry Leacock showed in cat and dog transfusions that blood was species-specific and argued for human-to-human transfusion. He wrote:

The consequences of haemorrhages where the functions are not dangerously affected, do not of course, require transfusion, since other remedies will suffice. But when the danger is imminent, and the common means are ineffectual, as when a parturient woman trembles on the brink of the grave from uterine haemorrhage, or when a soldier is at the point of death from loss of blood, what reason can be alleged for not having recourse to this last hope, and for not attempting to recruit the exhausted frame and turn the ebbing tide of life.

This quote carries a clear message of the urgency of resuscitation after severe hemorrhage.

In 1818, James Blundell performed the first human-to-human transfusion. Blundell had postulated that transfusion could be used to treat postpartum hemorrhage and had researched transfusion in animals. In 1829, Blundell published in The Lancet the first successful resuscitation of a woman from postpartum hemorrhage. He performed ten transfusions over the next ten years. Blundell also improved the technique and equipment for transfusion, using a syringe to conduct vein-to-vein transfusions.

Blundell noted that vein-to-vein transfusions were impractical due to clotting and that removal of air was essential. Attaching the donor’s artery to the recipient’s vein had proven successful in Lower’s experiments but required skill and time. To resolve this problem, Prevost and Dumas suggested the use of defibrinated blood in 1821: the blood was allowed to clot, usually by stirring, the clots were removed, and the remaining “defibrinated” fluid could be used. Others sought an anticoagulant; J. Neudorfer recommended sodium bicarbonate as an anticoagulant in 1860, and Dr. Braxton Hicks attempted a solution of sodium phosphate but was unsuccessful [12].

In 1849, C.H.F. Routh reviewed all the blood transfusions published to that date in an article entitled “Remarks, statistical and general on transfusion of blood,” published in the Medical Times. He was able to find only 48 recorded cases of transfusion, of which 18 had a fatal outcome. This gave a mortality of approximately 1 in 3, which was reported as being “rather less than that of hernia, or about the same as the average amputation” [12].

In 1865, Louis Pasteur recognized that bacterial and fungal contamination causes putrefaction, and in 1867, Joseph Lister introduced antiseptic methods to combat the dangers of infection. As a result of these discoveries, the problem of infection in transfusion moved toward a potential solution as sterilization of instruments and antiseptic techniques began to be introduced.

Crystalloids and Colloids

Another important development in fluid resuscitation started in 1831. William Brooke O'Shaughnessy examined cholera patients in Edinburgh and postulated that the disease resulted in hypovolemia and electrolyte loss; O'Shaughnessy then experimented on dogs with saline. In 1832, Thomas A. Latta administered salt solution to cholera victims and published details in The Lancet: “The very remarkable effects of this remedy require to be witnessed to be believed. Shortly after the commencement of the injection the pulse, which was not perceptible, gradually returns, … the whole countenance assumes a natural healthy appearance” [13].

In 1885, Sydney Ringer strove to achieve optimal electrolyte concentrations for organs, creating Ringer’s solution. In 1896, Ernest Starling described colloid osmotic pressure (Starling’s principle) and the importance of plasma proteins as colloids; this paved the way for the development of colloids.

American Civil War 1861–1865

In 1850, Samuel D. Gross made one of the first descriptions of wound shock: “the rude unhinging of the machinery of life” [14, 15].

Two whole blood transfusion attempts on wounded active duty soldiers were made by Union surgeons and reported in the War Department’s Medical and Surgical History of the War of the Rebellion. Surgeon E. Bentley reported a successful transfusion given to Private G. P. Cross at Grosvenor Branch Hospital, Arlington, Virginia, on August 15, 1864, and Assistant Surgeon B. E. Fryer reported another given to Private J. Mott, on whom he operated at Brown Hospital in Louisville, Kentucky, in August 1864 [16, 17].

Franco-Prussian War 1870–1871

Battlefield Transfusions

In 1865, Dr. J. Roussel of Geneva first conducted a whole blood transfusion using direct arm-to-arm transfusion with a device he had developed called the “transfuseur,” for the treatment of a patient suffering from hemorrhage. The apparatus was described in the Gazette des hôpitaux in 1867. Roussel later stated that it was unfortunate that the device and procedure were not more widely utilized during the Franco-Prussian War, although they did see some use.

In 1867, Roussel claimed 16 successful whole blood transfusions out of 35 performed for the treatment of a variety of conditions. In 1882, in Paris, he reported on a total of 60 whole blood transfusions performed since 1865 in Switzerland, Austria, Russia, Belgium, England, and France. Roussel’s transfuseur apparatus was subsequently officially adopted for use by the French Army and apparently used in times of war.

Developments were also made in the equipment needed to conduct whole blood transfusions. Blundell used syringes made specially for him for the vein-to-vein transfusion process; he later developed two new devices, the “impellor” and later the “gravitator.” Many other devices were invented and tried. In 1873, Dr. J.H. Aveling described a device he had invented for vein-to-vein whole blood transfusion, consisting of two cannulas joined by a bulb pump and a one-way valve to ensure the correct direction of flow; he described the device as small enough to be carried in a pocket. In 1872, Aveling had attended a lady, aged 21 years, “in extremis” from postpartum hemorrhage. She received 60 drachms of blood from her coachman and apparently soon recovered, certainly enough to reportedly remark that she was dying! Dr. Aveling added in his report that “the mental improvement of the patient was not as marked and rapid as I anticipated, but this was perhaps due to the quantity of brandy she had taken” [12].

In the United States, between 1873 and 1880, the milk of cows and goats was tried as a blood substitute. T.G. Thomas and J. S. Prout supported this treatment because of the problems with blood transfusion due to its “tendency to coagulation.” By 1878, J.H. Britton, writing in the New York Medical Record, predicted that transfusion using milk would entirely supersede transfusion of blood [12].

The Spanish-American War 1898

The first descriptions of wound shock as something separate from the injury itself came from the American Civil War, and it was during the Spanish-American War of 1898 that wound shock was first associated with sepsis; however, wound shock was still seen as distinct from hemorrhage [18].

The Anglo-Boer War 1899–1902

In 1900, during the Anglo-Boer War, British surgeons used strychnine and saline to treat shock. Porter described the treatment: “I wanted to pump in strychnine as before, but Cheyne was playing about with 3 or 4 drop doses. The man was very bad and looked like dying so I got 10 drops and gave it. Cheyne was astonished and said it was a very big dose, but I said the patient wanted it. Then Cheyne thought he would try transfusion, and put one and half pints of salt water into a vein” [19].

In 1900, the US Surgeon General recommended that patients in a state of shock be given normal salt solution rectally and subcutaneously, along with 1/60 grain of strychnine, and be covered with blankets and kept warm [20].

1900s

Physiology: Blood Groups

In 1900, Karl Landsteiner, while experimenting with the mixing of whole blood from different people, found that some blood agglutinated, some lysed, and some was unaffected. In 1901, he found that this effect was due to red blood cells coming into contact with incompatible antibodies in the blood serum. He labeled the blood groups according to agglutination as A, B, and C; the latter was later renamed O. Landsteiner also found that whole blood transfusion between persons of the same blood group did not lead to the destruction of blood cells, whereas this occurred between persons of different blood groups [21]. A fourth main blood type, AB, was found by A. von Decastello and A. Sturli.

Transfusion: Avoiding Transfusion Reactions

In 1907, Ludvig Hektoen recommended blood cross matching, the mixing of donor and recipient blood to determine compatibility. Ruben Ottenberg performed the first cross-matched and typed whole blood transfusion, and Ottenberg later recognized blood group O as the universal donor.

In 1908, the French surgeon Alexis Carrel devised a way to prevent blood clotting. His method involved joining an artery of the donor directly to a vein of the recipient with surgical sutures; this was a highly skilled and complex procedure available only to experienced surgeons.

In 1913, Dr. Edward Lindeman, at Bellevue Hospital in New York, revolutionized blood transfusion by using syringes and cannulas to transfuse whole blood instead of directly connecting the donor’s and recipient’s blood vessels [22]. In 1914, the first transfusion using citrated whole blood was performed by Professor L. Agote. In 1915, Richard Lewisohn used sodium citrate as an anticoagulant, transforming the transfusion procedure from direct to indirect and opening the possibility of storage, and Richard Weil demonstrated the feasibility of refrigerated storage of such anticoagulated blood. In 1916, Peyton Rous and J.R. Turner Jr. found that adding dextrose to the citrate extended the storage time to 4 weeks.

In 1916, W. Bayliss, a professor of general physiology at University College London, contributed a lecture to the Physiological Society; his abstract was published in the Journal of Physiology. The abstract detailed that animal models that received salt solutions after bleeding had only a transitory recovery; the effect was sustained, however, when 5% gelatin or gum acacia was added. Interestingly, gum acacia contains a moderate amount of calcium and magnesium salts, which are cofactors in hemostasis [23].

WWI 1914–1918

In 1915, Oswald Hope Robertson traveled to Europe as a medical student and performed the first whole blood transfusion of the war at a volunteer hospital in Paris. After his graduation later that year, he worked with P. Rous at the Rockefeller Institute. In 1917, Robertson joined the Harvard Medical Unit with Roger Lee at Base Hospital No. 5 from Boston; it was Lee who had sent Robertson to work with Rous at the Rockefeller Institute. Robertson was tasked with investigating the treatment of shock; he initiated direct transfusions and wrote to Rous with an idea for larger-scale collection and storage. In 1917, he tested donors and used only type “O” universal donors, as suggested by Lee; the donors were also tested for disease. He collected blood via venipuncture into glass bottles containing anticoagulant, cooled the blood in ice chests, stored it for up to 28 days, and moved it to where it would be needed. He personally administered blood to the wounded under fire and was awarded the Distinguished Service Order for bravery. Robertson also taught the techniques to other instructors responsible for transfusion and resuscitation training. In 1918, O.H. Robertson published his findings in the British Medical Journal [24].

In 1915–1916, Captain Ernest Cowell and Captain John Fraser began measuring soldiers’ blood pressures and recorded that, in wounded men with classic symptoms of shock, the average systolic blood pressure (SBP) was 90 mmHg; they labeled this primary shock. In a second group, which initially showed no signs of shock but in whom the blood pressure later dropped to 70–90 mmHg, the condition was called secondary shock. If the blood pressure continued to decline and fell to 50–60 mmHg or below, the men died.

In 1916, Captain L. Bruce Robertson from Toronto, who had recently trained with Lindeman in New York, used direct whole blood transfusions with no blood typing or cross matching in the field. He published “The transfusion of whole blood: a suggestion for its more frequent employment in war surgery” in the British Medical Journal, in which he stated: “the additional blood often carries the patients over a critical period and assists his forces to rally to withstand further surgical procedures.” Robertson published his experiences of resuscitation transfusions in 1917 in the British Medical Journal, and in 1918, in the Annals of Surgery, he described 36 cases of transfusion, including 3 fatal hemolytic transfusion reactions [24].

In 1917, after the Medical Research Council Shock Committee meeting, Bayliss recommended 5% gum acacia in a 3% sodium bicarbonate solution; this proved difficult to manufacture, and after further testing it was agreed to place 6% gum acacia in a 0.9% saline solution. Reports were circulated that gum acacia and Ringer’s solution were capable of saving lives at the front. In 1918, Colonel Elliott and Captain Walker reported that gum-saline succeeded if infused on arrival at the Casualty Clearing Station, but if treatment was delayed for more than 8 hours, a blood transfusion was better.

In 1917, the Investigation Committee on Surgical Shock and Allied Conditions of the Medical Research Council was formed, with Starling as the first chair, followed by Bayliss. The committee was established to examine the treatment of shock and requested an update on the use of whole blood from Captain Oswald Hope Robertson. Both cold-stored and warm whole blood were transfused to casualties in WWI.

In 1917, Bayliss traveled to France and met Captain Fraser and Captain Walter B. Cannon of the US Army Medical Corps, Higginson Professor of Physiology at Harvard Medical School. Cannon conducted autopsies to test the theory that wound shock was caused by blood pooling in the great veins of the abdomen and found this to be untrue. He began investigations of blood plasma with a Van Slyke blood gas analyzer and was able to show a correlation between blood pressure and acidosis in wound shock: the lower the blood pressure, the greater the acidity of the plasma.

On August 17, 1917, at its first meeting, the MRC Special Investigation Committee on Surgical Shock and Allied Conditions published the first definition of wound shock: “a condition of circulatory failure due to deficient entry of blood into the heart.”

The Medical Research Council Shock Committee urgently tried to discover the cause of shock and a potential treatment. Cannon was convinced that high acid levels in the blood caused wound shock and that an alkali treatment was needed. H.H. Dale disagreed and suggested a more complex pathology: “namely, that substances with similar activity (to histamine) absorbed from wounds involving injury to tissues, in conjunction with hemorrhage, exposure to cold, and so forth, could well determine the onset of shock.” Dale argued that the treatment of shock should include whole blood transfusion [25].

In 1918, Cannon was named Director of Surgical Research at the Medical Laboratory at Dijon; there he trained resuscitation teams in the physiology of shock and its resuscitation, with a strong emphasis on hypothermia management, which he had learned from working on the front line with Cowell and Fraser. Cannon requested and received the assistance of O.H. Robertson in his research. In 1918, the US Army Medical Department adopted whole blood transfusion with citrated blood to combat shock in the American Expeditionary Forces.

Geoffrey Keynes developed “field durable” equipment that enabled whole blood transfusions to be carried out in the field outside of established medical facilities. In the field, the only way to transfuse casualties was from another soldier to the casualty, and Keynes’ equipment enabled regulation of the flow of blood between the donor and the patient.

Post-WW1

In November 1918, the Royal Army Medical Corps convened a conference of surgeons and pathologists in Boulogne to evaluate treatments for shock and hemorrhage. The final conclusion was that whole blood was probably superior, but colloids warranted further investigation; reactions to gum acacia were also reported.

After the war, the MRC Shock Committee also independently reviewed the evidence from the war and declared “that in all cases of hemorrhage with shock, transfusion of unaltered whole blood or citrated blood is the best treatment yet available” [26, 27].

Major W. Richard Ohler stated after the war, “hemorrhage is the important single factor in shock and the amount of hemorrhage defines the amount of shock; when, therefore, the need is for oxygen carrying corpuscles, no other intravenous solution will serve the purpose.”

In 1921, Percy Lane Oliver, Secretary of the Camberwell Division of the British Red Cross, established the first emergency donor panel, some 20 donors willing to give blood at short notice in London hospitals. Oliver called it the British Red Cross Blood Transfusion Service. In 1922, it was used 13 times; word spread, and by 1925 the service was used 428 times. Sir Geoffrey Keynes was appointed as medical adviser to the organization. Similar systems were adopted in other countries, France, Germany, Austria, Belgium, Australia, and Japan being among the first. At the first Congress of the International Society of Blood Transfusion, held in Rome in 1935, it was declared: “It is to the Red Cross in London that the honor is due to having been the first, in 1921, to solve the problem of blood donation by organizing a transfusion service available at all hours, and able to send to any place a donor of guaranteed health, whose blood has been duly verified.” In 1937, Bernard Fantus of the Cook County Hospital in Chicago established the first US civilian blood bank, in which whole blood was collected in bottles and stored in a refrigerator for up to 10 days [28].

In 1932, Alexis F. Hartmann and M.J.C. Senn suggested a 1/6 molar sodium lactate solution to replace part of the sodium chloride in Ringer’s solution; they showed that the lactate was metabolized in the liver, making sodium available to combine with available anions. The use of the solution meant that the amount of chloride could be reduced, limiting hyperchloremic acidosis [29].

In 1929, Professor Vladimir Shamov of Kharkiv, USSR, reported the experimental use of cadaveric blood transfusion and the absence of toxicity. In 1930, the Russian surgeon Sergei Yudin, familiar with the work of Shamov, transfused his first patient, stating: “My first experience was with the case of a young engineer who slashed both of his wrists in a suicidal attempt. He was brought to our hospital pulseless and with slow, jerky respiration. Transfusion with 420 cc. of blood taken from the cadaver of a man, aged 60, who had been killed in an automobile accident just six hours before, promptly revived him” [30]. Later that year, in September, Yudin reported on his first seven cadaveric transfusions at the fourth Congress of Ukrainian Surgeons in Kharkiv. By 1932, Yudin had reported 100 transfusions with cadaveric blood kept for up to 3 weeks, and in 1937 he reported over 1,000 uses of cadaveric blood in The Lancet [28].

Spanish Civil War 1936–1939

By 1936, Frederic Duran-Jorda had created a transfusion service in Barcelona to meet the growing demand for blood transfusions; later that year, Norman Bethune visited the facility and then set up a similar service based out of Madrid called the Servicio canadiense de transfusión de sangre. In 1914, Bethune had suspended his medical studies and joined the Canadian Army’s No. 2 Field Ambulance to serve as a stretcher-bearer in France; he was wounded by shrapnel and, after recovering, returned to Toronto to complete his medical degree. Based on his experience in WWI, he organized a mobile transfusion service, stating: “Why bring the bleeding men back to the hospital when the blood should travel forward to them?” During the Spanish Civil War, 28,900 donors donated 9000 liters of whole blood. Donors were X-rayed for TB, and their blood was tested for syphilis and malaria. Six donations of whole blood were mixed and filtered, placed in 300 ml glass jars, and stored at 2 °C for up to 15 days. With the advent of blood fractionation, plasma could be separated from whole blood and was used for the first time in this war to treat the battle wounded. In 1938, Duran-Jorda fled to the United Kingdom and worked with Dr. Janet Vaughan at the Royal Postgraduate Medical School at Hammersmith Hospital to create a system of national blood banks in London.

Pre-WWII

In 1934, Alfred Blalock proposed four categories of shock: hypovolemic, vasogenic (septic), cardiogenic, and neurogenic. Hypovolemic shock, the most common type, results from loss of circulating blood volume due to loss of whole blood (hemorrhagic shock), plasma, interstitial fluid, or a combination [31].

In 1938, the Medical Research Council established four blood depots in London. Later, in the autumn, the War Office also created the British Army Blood Transfusion Service and the initial Army Blood Service Depot (ABSD) in Bristol under the control of Dr. Lionel Whitby. The service also set up a plasma-drying facility that produced 1200–1400 units a week.

WWII 1939–1945

Transfusion: UK Army Blood Transfusion Service

In 1938, Brigadier Lionel Whitby was appointed Director of an autonomous UK Army Blood Transfusion Service (ABTS). Unlike in WWI, when blood was obtained from fellow soldiers, the plan now relied on central civilian collection and a distribution network. The service was organized on three levels: (1) the Army Blood Service Depot (ABSD), which produced all wet and dried products, crystalloids, grouping sera, and blood collection and administration equipment, and provided training; (2) Base Transfusion Units, which were chiefly concerned with distribution in each theater of operations; and (3) Field Transfusion Units, which worked in forward areas.

Plasma for Britain

In 1940, Dr. Charles R. Drew, a surgeon and researcher who had developed techniques for preserving liquid plasma, supervised the “Blood for Britain” program, which collected blood and shipped plasma to treat those wounded during the Blitz. To encourage donation, Drew pioneered the use of refrigerated vehicles as mobile donation centers.

Research

On May 31, 1940, US Surgeon General Magee appointed Professor Walter B. Cannon of Harvard University as Chairman of the US National Research Council Committee on Shock and Transfusion. On November 3, 1941, this committee agreed “that it had been the consensus of the group that [US] Armed Forces should use whole blood in the treatment of shock wherever possible”; the results of that discussion were not made official until 2 years later, on November 17, 1943 [32].

Cannon also introduced the term “homeostasis” to describe the equilibrium maintained in the internal environment and is credited with the first proposal to use deliberate hypotension to reduce internal hemorrhage until surgical control could be established [33].

Plasma: Fractionation

In 1940, Edwin Cohn, a professor of biological chemistry at Harvard Medical School, developed cold ethanol fractionation, the process of breaking down plasma into components and products. Albumin, gamma globulin, and fibrinogen were isolated and became available for clinical use. John Elliott developed the first blood container, a vacuum bottle extensively used by the Red Cross [34]. In 1941, Isidor Ravdin treated victims of the Pearl Harbor attack with Cohn’s albumin for blood loss and shock [34].

Transfusion: The United States’ Need for Whole Blood

In 1941, as US troops arrived in the United Kingdom, the United States reported that it was not able or prepared to supply US-donated blood to Europe or Africa.

On June 28, 1941, the first Conference on Shock was conducted by the Subcommittee on Shock, 6 months before the United States entered the war. Treatment recommendations included control of hemorrhage with early application of a tourniquet, the application of heat to reverse hypothermia, and analgesia. Regarding fluid therapy, the recommendation was that when shock was imminent or present, blood, plasma, or albumin should be injected as promptly as possible, and that in massive hemorrhage whole blood was preferable to blood substitutes.

In 1943, pressure grew on the United States to supply whole blood during D-Day planning: the Allied planning group was shocked to be told that the U.S. would not sanction the transport of any whole blood from the United States to Great Britain; logistical problems and the efficacy of human plasma were cited as the reasons for the U.S. obduracy [35].

In March 1943, US Army Colonel Edward D. Churchill arrived for duty as Chief Surgical Consultant to the North African and Mediterranean theater of operations. Churchill conducted a study on the resuscitation of shock and released a report stating that plasma was a first aid measure in support of whole blood, which was the first-line treatment for resuscitation of battlefield casualties; that whole blood was the only agent that prepared casualties for surgery and decreased mortality by reducing infection; and that inadequate resuscitation with whole blood resulted in organ damage. There was a widespread misconception among US military medical leadership that plasma was as effective as whole blood [36]. Churchill, incensed by the US Surgeon General’s position on blood products, briefed a New York Times reporter with the aim of publicizing the need for military blood banks [37]. In 1943, Colonel Elliott C. Cutler’s memorandum to Brigadier General Paul R. Hawley, Chief Surgeon, European Theater of Operations, stated that “Brigadier Whitby tells me that the use of wet plasma has practically been given up, and transfusion (of whole blood) used in its stead in the British Army” [38].

Colonel Frank S. Gillespie (Liaison Officer for the United Kingdom in Washington, DC) remarked:

I have often wondered at the physiological differences between the British and American soldier. The former, when badly shocked, needs plenty of whole blood, but the American soldier, until recently, has got by with plasma. However, I seemed to observe a change of heart when I was in Normandy recently and found American surgical units borrowing 200–300 pints of blood daily from British Transfusion Units, and I'm sure they were temporarily and perhaps even permanently benefited by having some good British blood in their veins.

In December 1943, the second Conference on Shock was held. Dr. E. I. Evans commented on the therapeutic effects of whole blood and blood substitutes in shock: “One of the chief problems is concerned with supplying whole blood in forward areas. Somewhere along the planning line somebody seems to have forgotten that plasma lacks oxygen-carrying power.” Evans stated that this led to the wounded not surviving surgery.

Crystalloids: WWII

In WWII, crystalloids were mainly used for dehydration and electrolyte imbalance, or when plasma or whole blood was not available.

Colloids: WWII

In WWI, use of gum acacia had resulted in toxic reactions and edema. During WWII, other colloids were researched for effectiveness, namely, gelatin, pectin, fish gelatin, amino acids, and oxidized cotton.

In the 1940s, dextran was being investigated by the United Kingdom, the United States, and Sweden. In 1942, A. Grönwall and the Swedish biochemist B. Ingelman suggested using hydrolyzed dextran as a plasma substitute. A Swedish pharmaceutical company adopted the project in 1943. In 1944, under the direction of the surgeon G. Bohmansson, extensive clinical trials were initiated at the Regional Hospital in Örebro. By 1947, about 4 years after the innovation, a 6% solution of a dextran fraction had been approved for clinical use in Sweden and, shortly thereafter, in the United Kingdom.

Transfusion Transmissible Disease

In 1942, batches of yellow fever vaccine and plasma contaminated with hepatitis virus were linked to cases of viral hepatitis. Between 1942 and 1945, around 200,000 cases were reported. This identified the disease as a matter of prime importance to the Armed Forces during World War II, and it became evident that effective methods of screening for, treating, and preventing hepatitis in soldiers were urgently required.

Post-WWII

Review

In 1945, the Conference on Shock and Transfusion drew the following conclusions: plasma was best used far forward; whole blood was essential and rendered the casualty fit for surgery; large wounds required large transfusions; speed of administration was essential; and a reduced resuscitation volume was advocated for central nervous system and chest injuries.

In 1949, W. Rankin , who had served in the US Army in both world wars, reviewed his experience as an Army General and Director of the Surgery Division of the US Army in World War II and cited four factors as being most important in the reduction of mortality and morbidity rates for battle injuries in World War II: (1) the availability of excellently trained young surgeons who could perform surgery in combat areas; (2) improved methods of resuscitation, including the ready availability of whole blood and plasma; (3) the availability of antibiotics and chemotherapeutic agents used as adjuncts to surgery; and (4) improved evacuation along the chain of care.

As a result of those improvements in care, the percentage of combat casualties dying of wounds was reduced to 3.3% from the World War I percentage of 8.1%. Furthermore, the mortality rates of patients with life-threatening wounds of the head, chest, and abdomen were reduced to approximately one-third of the rates in World War I [39].

Korean War 1950–1953

In 1950, 5 years after WWII, the US military blood program had been discontinued. There had, however, been a review of this state of affairs, and a new policy had been drawn up but not implemented. On July 3, 1950, within days of the onset of hostilities, responsibility for collecting and distributing blood in the Far East Command was assigned to the 406th Medical General Laboratory in Tokyo, and on July 7, blood was delivered to the first hospital unit arriving in Korea. Military personnel in Japan and many Japanese civilians donated blood. Only low-titer (anti-A and anti-B <256) group “O” whole blood was collected, to reduce the logistical burden of typing and cross matching recipients. Some 39,000 units were collected; however, this was insufficient to meet the needs of the casualties.

As in WWII, the American Red Cross was again asked to become the collecting agency for the US military. The agency had a blood collection program in operation to supply civilian hospitals in the United States and could build upon it; this too proved insufficient.

The Armed Forces Blood Program and a National Blood Program were set up and remained in operation until the end of active fighting in Korea. Some 400,000 units of whole blood were transfused by the end of the war.

Massive Transfusion of Group “O” Problems

Massive transfusions of low-titer group O whole blood given to recipients of other blood groups resulted in the virtual replacement of the recipient’s cells with cells of group O. The recipient’s plasma sometimes contained antibodies against red cells of their own hereditary blood group, and gradual hemolysis of native red cells by transfused antibodies was observed. The presence of anti-A and anti-B antibodies from type O whole blood, however, sometimes made it impossible to crossmatch the patient, and severe reactions sometimes occurred when type-specific whole blood was given after large transfusions of low-titer O whole blood. In the light of these observations, it was recommended that after transfusions of low-titer group O whole blood, no change should be made to blood of another group until at least 2 weeks had elapsed from the last group O whole blood transfusion [40].

Plastic Collection Bags

In 1950, Carl Walter and W.P. Murphy Jr. introduced plastic bags for whole blood collection; this important development made transport of blood easier and more efficient during wartime.

Need for Whole Blood

In 1951, at a meeting of the Subcommittee on Shock, Committee on Surgery, National Research Council (NRC), Dr. Walter L. Bloom stated: “It is interesting, and somewhat depressing, to note in various reports of conferences concerning the blood and blood-derivatives program in the Korean War how quickly the World War II experience seemed to have been forgotten and how the tendency was again evident to concentrate on agents other than whole blood in the management of combat and other casualties.” He went on to add “that the entire philosophy of plasma expanders was questionable. The limitations of these substitutes should be defined, and they should be considered as suitable for emergency use only. The first need of combat casualties was for whole blood.” A review of use showed that an average of two-and-a-half pints of whole blood were used for every casualty wounded in action [41].

Transfusion Risks

In 1952, only 4 major hemolytic reactions resulting in acute renal failure were reported out of the 50,000 whole blood transfusions administered [42].

Plasma Problems

The Army, in need of a fluid therapy agent to stabilize casualties during evacuation to a medical treatment facility, faced a difficult decision because using plasma risked hepatitis, and the risk had increased since WWII. In 1951, the incidence of hepatitis after plasma transfusion was 21%, and sterilization techniques had proved unsuccessful. On August 20, 1953, Department of the Army Circular No. 73 directed that, because of the risk of serum hepatitis, the higher cost, and the need to use it for the production of specific globulins, plasma would not be used “to support blood volume” unless dextran was not available [40].

Serum Albumin

In 1951, with the increased need for volume expanders, 50,000 units of outdated serum albumin were obtained from the Navy and transferred to the San Francisco medical depot for shipment to Korea. Technically, outdated serum albumin proved satisfactory. One of its advantages was that the small size of the units made it possible for corpsmen to load their pockets with it. Also, serum albumin did not freeze, as reconstituted plasma did. Albumin heated for 10 hours at 60 °C carried no risk of hepatitis. Albumin could be made from contaminated plasma, which meant that a large quantity could be obtained from the available plasma no longer considered fit for use because of the risk of transmission of hepatitis [40].

Dextran

By 1950, the Swedish experience with dextran had reached 200,000 cases. In the 10 years of its use, there had been no postmortem evidence of tissue damage, and reactions were fewer than with the use of either blood or plasma. A compilation of articles from the literature by Pharmacia showed impressive use of dextran by reliable investigators in Denmark, Finland, and Holland, as well as in Sweden. There was some evidence from the use of Swedish and British dextran of local and systemic allergic reactions; these were thought to be worse with the higher molecular weight dextran, the US-produced dextran being of lower molecular weight. On October 1, 1952, at a meeting of the Subcommittee on Shock, it was reported that 125 units of dextran had been used in Korea, with good clinical results and no significant reactions. A 6-month study had been started in Air Force installations in the United States. Dextran was used in increasing amounts until the end of the Korean War. In September 1953, a hitherto undescribed consequence of dextran injection was reported: a prolongation of the bleeding time, occurring within 3–9 hours after dextran had been given [40].

Vietnam War 1955–1975

In 1965, no formal military blood program existed in Vietnam. Transfusion requirements were met with shipments of 10 units of group O blood from Japan approximately every 10 days, supplied by the 406th Mobile Medical Laboratory, Camp Zama, Japan. The decision was made to ship only low-titer group O whole blood; later, group A was added, and by 1966 all types of whole blood were utilized to meet demand. In Vietnam, only low-titer group O whole blood was used far forward. From 1967 to 1969, some 230,323 units of whole blood were transfused, and 24 hemolytic transfusion reactions were reported [43]. The Vietnam War was the first major wartime engagement for the Armed Services Blood Program (ASBP). Over the course of the conflict, the program collected nearly 1.8 million units of blood in support of troops in Vietnam. It was the first time that every unit of whole blood used to support the war was voluntarily donated by military personnel, their dependents, and civilians employed at military installations, rather than obtained through civilian organizations.

Aggressive fluid resuscitation during the Vietnam War with red blood cells, plasma, and crystalloid solutions allowed patients who previously would have succumbed to hemorrhagic shock to survive. Renal failure became a less frequent clinical problem and vital organ function was better sustained, but fulminant pulmonary failure, termed “Da Nang lung” or acute respiratory distress syndrome (ARDS), appeared as an early cause of death after severe hemorrhage.

Acute Coagulopathy in Trauma

Miller et al. published data in 1971 from the war in Vietnam showing coagulation defects after massive transfusion; these were treated unsuccessfully with fresh frozen plasma (FFP) and then with whole blood, which was successful in limiting the bleeding tendency [44].

Post-Vietnam War

Crystalloids

In the 1970s, additional studies by Shires et al. demonstrated that a prolonged period of hemorrhagic hypotension was associated with the development of microvascular injury and a marked extracellular fluid (ECF) deficit, caused by loss of interstitial fluid from the extravascular space, which could be corrected only by the administration of isotonic crystalloids in volumes two to three times the estimated blood loss. This was the basis of the well-known “3 to 1” dogma for the treatment of hemorrhagic shock, which was adopted by Advanced Trauma Life Support (ATLS) for the treatment of trauma casualties. It was recommended that the early treatment of hemorrhagic shock include primarily the control of external bleeding and the early intravenous administration of 2000 ml of crystalloids through a large-bore catheter [45].
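As a purely illustrative sketch of the arithmetic behind this historical dogma (and emphatically not clinical guidance), the short Python snippet below encodes the 3:1 replacement ratio and the 2000 ml initial bolus described above; the function name and the example blood-loss value are hypothetical.

```python
# Illustrative sketch of the historical "3 to 1" crystalloid dogma described above.
# Not clinical guidance; the function name and example values are hypothetical.

ATLS_INITIAL_BOLUS_ML = 2000      # early ATLS initial crystalloid bolus
CRYSTALLOID_TO_BLOOD_RATIO = 3    # "3 to 1" replacement dogma

def historical_crystalloid_volume_ml(estimated_blood_loss_ml: float) -> float:
    """Crystalloid volume the historical 3:1 dogma would have suggested."""
    return CRYSTALLOID_TO_BLOOD_RATIO * estimated_blood_loss_ml

if __name__ == "__main__":
    loss_ml = 1500  # hypothetical estimated blood loss
    print(f"Initial bolus: {ATLS_INITIAL_BOLUS_ML} ml")
    print(f"3:1 dogma total: {historical_crystalloid_volume_ml(loss_ml)} ml")
```

The sketch makes the scale of the dogma obvious: even a modest estimated blood loss translated into several liters of crystalloid, which is precisely the overuse criticized below.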

The Shires philosophy of resuscitation in patients with traumatic bleeding was misapplied and led to the overuse of crystalloids, to the detriment of patients with severe bleeding, who commonly received 5–10 L of crystalloids before any blood product administration [46].

These outcomes had in fact been predicted by Shoemaker in 1976, when he challenged the notion that the interstitial compartment required resuscitation and instead emphasized the need for whole blood to treat significant bleeding when the hematocrit fell below 30%. The overuse of crystalloids occurred despite a call for moderation by Moore and Shires as early as 1967. In their editorial, Moore and Shires stated, “Blood should still be replaced during major operative surgery as it is lost. The use of balanced salt solutions appears to be a physiological adjunct to surgical trauma, not a substitute for blood.” Subsequent research has demonstrated that a crystalloid-based resuscitation strategy leads to increased inflammation and vascular permeability compared to whole blood [46].

Rise of Blood Components

By the 1980s and 1990s, the accepted ATLS treatment for hemorrhagic shock was aggressive use of crystalloids and colloids together with component therapy. RBCs were to be used for patients who continued to bleed actively after 2 L of fluid had been given. Plasma and platelets were indicated if the patient was still bleeding after the RBCs were given and there was a laboratory abnormality indicating impaired coagulation or a low platelet count, respectively.

From the 1990s onward, evidence started to mount that this strategy might not be optimal. In 1990, Kaweski et al. published “The effect of prehospital fluids on survival in trauma patients.” In 1992, Krausz et al. published “‘Scoop and run’ or stabilize hemorrhagic shock with normal saline or small-volume hypertonic saline.” In 1994, Bickell et al. published “Immediate versus delayed fluid resuscitation for hypotensive patients with penetrating torso injuries.” In 2004, Blumenfeld et al. published “Prehospital fluid resuscitation in trauma: the IDF-MC Consensus Panel Summary.” These and other papers began to question the strategy of aggressive crystalloid use.

Somalia

Use of Fresh Whole Blood in Mogadishu Experience and How It Affected Iraq/Afghanistan Wars

In 1993, US forces in Mogadishu, Somalia, were faced with a shark attack victim from the 10th Mountain Division who required bilateral lower extremity amputations and massive transfusion. With limited plasma and infrequent resupply of packed red blood cells, Colonel Denver Perkins initiated an emergency donor panel whole blood collection. The immediate resolution of clinical coagulopathy and the improved physiology were apparent to all. This highlighted the efficacy of whole blood and led to around 120 units of whole blood being collected and more than 80 units transfused during the Black Hawk Down crisis. The event led to the inclusion of whole blood training for deploying forward surgical teams at the Joint Trauma Training Center starting in 1999, and to a section on whole blood in the 2003 edition of the Emergency War Surgery handbook (personal communication, J.B. Holcomb).

Tactical Combat Casualty Care

In 1996, Butler et al. published “Tactical combat casualty care in special operations” which led to the establishment of TCCC guidelines; this initial guideline recommended 1000 ml of Hespan for a casualty in shock with bleeding controlled [47].

2000 To Present

2000

RDCR and DCR

After 9/11 and the commencement of the “War on Terror,” coalition forces faced conflict in Afghanistan and Iraq. The initial resuscitation strategies were similar to those of the 1990s, with the United Kingdom and the United States using clear fluids, with a hypotensive resuscitation strategy based on maintaining a radial pulse, forward and during evacuation, and blood components in medical treatment facilities.

During the conflicts, as casualty numbers and the severity of injuries increased, mainly due to a rise in the use of improvised explosive devices (IEDs), the concept of damage control resuscitation was resurrected from the shock wards of WWII and synthesized with emerging resuscitation strategies. The concept can be envisioned as the resuscitation of a patient to increase the chance of survival to, and survival of, damage control surgery [47]. Remote damage control resuscitation is the use of DCR concepts in the prehospital setting and as early as possible in the evacuation chain [2]. Aspects of RDCR, beginning with the prehospital use of RBCs, were effectively practiced by the UK MERT platform in Afghanistan after 2006 and by US forces as part of their “Vampire Missions,” which transfused patients during transport starting in 2012.

Lethal Triad

Increasingly, resuscitation strategies target the lethal triad of hypothermia, acidosis, and coagulopathy. The realization that this combined pathology has an impact on mortality has its roots in WWI. The modern concept was coined in 1982 by the American Trauma Society, which proposed the “bloody vicious cycle” of acidosis, hypothermia, and coagulopathy as an important cause of death in coagulopathic patients in the early stage of trauma. This term was gradually replaced by other terms, such as “lethal triad” and “iatrogenic trauma coagulopathy,” and is also the theoretical basis for damage control resuscitation [48].

2003

Acute Traumatic Coagulopathy

In 2003, Brohi et al., in a retrospective study of over 1800 patients, showed that just over 24% had significant coagulopathy and that this group had a threefold higher mortality. Brohi called the pathology acute traumatic coagulopathy. Later, in 2007, Brohi identified that studies had shown this coagulopathy to be present on admission to hospital and independent of injury severity score. He argued that the driver of this pathology is hypoperfusion causing activation of the protein C pathway and fibrinolysis. Resuscitation strategies started to target this coagulopathy [49].

2004

Resurgence of Whole Blood

In 2004, the 31st Combat Support Hospital in Baghdad began using ABO type-specific whole blood as a salvage therapy when patients were near death. Experiential data indicating that whole blood reversed shock and coagulopathy more effectively than RBCs of advanced storage age and plasma encouraged earlier use of whole blood [50]. In October 2004, a massive transfusion guideline was developed that incorporated the early use of warm fresh whole blood (ABO specific) and blood components in a 1:1:1 ratio until whole blood was available [50]. During 2004, the 31st CSH also incorporated rapid screening tests for HIV, HCV, and HBV for ABO-specific whole blood collected from donors. Results of these rapid tests were available within 5 minutes and were obtained before collection of the unit of whole blood was complete [51].

ABO-specific whole blood was used instead of low-titer group O whole blood because, at this time, the AABB standards for whole blood stated that it must be ABO specific when transfused. The lessons learned from WWII through the Korean and Vietnam Wars regarding the efficacy and safety of low-titer group O whole blood were lost in the 1980s to 2000s, and concern regarding the mild-to-moderate risk of incompatible plasma led the blood banking community to write standards that required whole blood to be ABO specific [50].

2005

Platelets

In 2005, the US Army for the first time made apheresis platelets, stored at 22 °C for 5 days, available in Baghdad and soon thereafter expanded their availability to all other combat support hospitals [52].

2007

Component Therapy

The optimal use and ratios of components in the resuscitation of hemorrhagic shock were questioned, with a trend toward increased use of plasma. In 2007, Borgman et al. published “The ratio of blood products transfused affects mortality in patients receiving massive transfusions at a combat support hospital,” recommending early and increased use of plasma with red blood cells in a 1:1 ratio. This retrospective study evaluated 246 patients who received massive transfusion and reported an independent association between a higher ratio of plasma to RBCs and survival; there was also a lower risk of death from hemorrhage in patients transfused with higher plasma to RBC ratios [4].
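As a purely illustrative sketch of the kind of per-patient ratio calculation underlying such retrospective analyses (the records and the grouping threshold below are hypothetical, not the study’s actual data or methods), a plasma:RBC ratio might be computed as follows:

```python
# Hypothetical illustration of computing plasma:RBC transfusion ratios;
# the records and the 0.8 grouping threshold are invented for this sketch.

from dataclasses import dataclass

@dataclass
class TransfusionRecord:
    plasma_units: int
    rbc_units: int

def plasma_rbc_ratio(record: TransfusionRecord) -> float:
    """Plasma:RBC ratio for one massively transfused patient."""
    return record.plasma_units / record.rbc_units

patients = [TransfusionRecord(2, 10), TransfusionRecord(8, 10), TransfusionRecord(9, 9)]
for p in patients:
    ratio = plasma_rbc_ratio(p)
    group = "high ratio (approaching 1:1)" if ratio >= 0.8 else "low ratio"
    print(f"plasma:RBC = {ratio:.2f} -> {group}")
```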

2008

ATLS

The 8th edition of the ATLS manual was changed to reflect developing strategies in resuscitation: “Balancing the goal of organ perfusion with the risks of re-bleeding by accepting a lower than normal blood pressure has been called ‘Controlled resuscitation’, ‘Balanced Resuscitation’, ‘Hypotensive Resuscitation’ and ‘Permissive Hypotension’. The goal is the balance, not the hypotension. Such a resuscitation strategy may be a bridge to but is also not a substitute for definitive surgical control of bleeding” [53].

2009

Data on WFWB

Spinella et al. published data indicating that ABO-specific warm fresh whole blood is independently associated with improved survival for patients with combat-related traumatic injuries and may improve 30-day survival [7].

Emergency Donation of WB

In 2009, Steve Williams of Royal Caribbean Cruises implemented a fresh whole blood transfusion protocol using onboard guests and crew as volunteer donors [54].

Dried Plasma

French medical personnel in medical treatment facilities began using dried, pathogen-reduced, pooled plasma, which was first reported in 2009 [55].

2011

Data on WFWB

Nessen et al. published data indicating that warm fresh whole blood (WFWB) was associated with improved survival at two facilities; this was the first manuscript from the Afghanistan/Iraq wars to provide data that the use of group O whole blood in non-group O patients is safe and effective [42].

In 2011, the THOR Network was established by Strandenes and Spinella. It is an international multidisciplinary network of providers, ranging from medics to basic scientists, with the goal of improving outcomes for patients with life-threatening traumatic bleeding. It initially focused on whole blood but rapidly expanded to include all aspects of resuscitation for patients with traumatic hemorrhagic shock [56].

2012

Dried Plasma

In 2012, US Special Forces medics began to use freeze-dried plasma provided by the French military (personal communication with Andre Cap).

2013

Whole Blood Storage

Pidcoke et al. published findings that cold storage of whole blood at 4 °C maintains adequate hemostatic function for at least 14 days. These findings were confirmed by Strandenes in Norway in 2015 [57, 58].

The Norwegian Naval Special Operations Commando started bringing cold-stored whole blood on missions in the Gulf of Aden [58].

Dried Plasma

The Israel Defense Forces implemented the use of freeze-dried plasma (FDP) at the point of injury (POI) [59].

2014

TCCC Guidelines Change

The revised TCCC guidelines ranked resuscitation fluids and placed whole blood as the optimum fluid for hemorrhagic shock [60].

The THOR Network advocated for the resurrection of cold-stored low-titer group O whole blood (LTOWB) to improve its availability and safety compared to the use of warm fresh whole blood [61].

Field Transfusion

Strandenes et al. published “Emergency whole-blood use in the field: a simplified protocol for collection and transfusion,” presenting the Norwegian Naval Special Operations Commando’s unit-specific RDCR protocol, which includes field collection and transfusion of warm fresh whole blood [62].

Dried Plasma

The Norwegian Helicopter Emergency Medical Service began using a German freeze-dried plasma product for civilian casualties [63].

Platelets

In 2014, the US Army Blood Research Program, led by Dr. Andre Cap, began extensive in vitro studies convincingly demonstrating that apheresis platelets stored at 4 °C have superior hemostatic function compared to platelets stored at 22 °C and are not irreversibly activated, as previously presumed [64].

2015

The Norwegian Helicopter Emergency Medical Service located in Bergen started transporting low-titer group O whole blood on every mission for civilian casualties [65].

The University of Pittsburgh became the first civilian trauma center to bring back low-titer group O whole blood after its disappearance in the 1970s following the end of the Vietnam War [66].

The Norwegian Armed Forces transported cold-stored LTOWB to military facilities in Afghanistan (personal communication with CDR Geir Strandenes).

ROLO Program

In 2015, the 75th Ranger Regiment’s Ranger Group O Low Titer (ROLO) Whole Blood Program was developed and initiated in concert with international multidisciplinary civilian and military providers of the Trauma Hemostasis and Oxygenation Research (THOR) Network to bring emergency blood transfusion from the hospital environment to the battlefield. Thanks in large part to LTC Andre Cap, Chief of Blood Research at the Army Institute of Surgical Research; LTC Ethan Miles, Command Surgeon, 75th Ranger Regiment; and LTC Jason Corley, Deputy Director of the Army Blood Program, the ROLO Whole Blood Program went from concept to implementation at the unit level in only 18 months [67].

2017

Cold Platelets

In 2017, the US Army began transfusing apheresis platelet units stored at 4 °C based on their superior hemostatic function compared to platelets stored at 22 °C [68, 69]. The storage duration for the 4 °C platelets started at 3 days, with the plan to extend it over time as data were collected.

Blood Failure

Bjerkvig et al. published on the concept of “blood failure.” The link between oxygen debt and traditional organ failure has long been recognized; Bjerkvig argued for consideration of failure in two additional, linked, and very dynamic organ systems, the endothelium and the blood, both very sensitive to oxygen debt. The degree of damage to the endothelium is largely modulated by the degree of oxygen debt. Hypoperfusion causes oxygen debt and is believed to begin a cascade of events leading to acute traumatic coagulopathy (ATC). This combination of oxygen debt-driven endothelial damage and ATC might be considered collectively as “blood failure.” The article presents the implications of oxygen debt for remote damage control resuscitation strategies, such as permissive hypotension and hemostatic resuscitation [70, 71].

2018

AABB Standards for Low-Titer Group O Whole Blood

The THOR Network petitioned the AABB to accept the use of low-titer group O whole blood for patients with severe bleeding of any etiology. A few months later, Standard 5.15.1 in the 31st edition of the AABB standards was changed, allowing the use of low-titer group O whole blood. After this change in standards, many civilian trauma centers internationally began to adopt the use of cold-stored LTOWB for patients with life-threatening bleeding [72].

Whole Blood Transfusions

From 2003 to 2018, over 10,000 units of ABO-specific warm fresh whole blood were transfused. Between 2017 and 2018, over 300 units of cold-stored LTOWB, stored at 2–6 °C, were transfused by the US military.

Conclusion

DCR and RDCR will continue to evolve as new evidence, research on the pathophysiology of hemorrhagic shock, technological advances, and drug development emerge. It is in looking back that we understand the path that has led us to where we are now. It is essential that the hard lessons learned from lives lost do not have to be learned again, as has so often been the case in the past.