3.1 Introduction

Vaccine history is inextricably linked with the histories of microbiology and immunology; evolution of the latter disciplines parallels the ongoing quest of humankind to understand the fundamental basis of life. How our species survives in the hostile world that surrounds us has been a source of fascination since the beginning of recorded time. Injury and infection likely exacted a heavy toll as our early hominid ancestors descended from the trees and adopted a predatory lifestyle on the African plains; death from bleeding and wound infections undoubtedly plagued early humans (Opal 2003). Epidemic disease, however, probably played a minor role in shaping the primitive human immune system. Instead, the primary determinants of lethality for the small, scattered bands of hunter-gatherer populations of Homo sapiens were starvation, predation, and hypothermia.

Our collective fate was radically altered approximately 8,000–10,000 years ago, when a highly developed immune system became a major selective advantage. Inhabitants of the “Fertile Crescent”, in the modern-day Middle East, first successfully domesticated plants and animals, irrevocably altering human history. Domestication of plant and animal species had four major impacts: reduction in the risk of starvation; establishment of fixed dwellings close to fields for farming; improved nutrition, with extended fecundity in women and more successful childbearing; and proximity to animals, with the attendant risk of transmission of zoonoses to humans.

Adaptation from a nomadic, hunter-gatherer existence to a stable agrarian society with ample food supplies spawned a massive human population explosion. Division of labor followed, resulting in the blossoming of civilization, science, innovation, government, and the arts (Diamond 1999). The rapid expansion of densely populated human habitations with poor sanitation, absent sewage disposal, proximity to domesticated animals, and no understanding of how infectious diseases spread created favorable conditions for epidemics. Since that time, waves of epidemics have been recorded and continue unabated today. Strong selection pressures created by repeated infections have promoted highly evolved innate and acquired immune systems in humans.

Although the successful domestication of animals greatly benefited humankind as a ready source of food, transportation, and work, it also exposed humans to a large set of infectious agents that were enzootic in these animal species. Crossing species barriers is a difficult process for pathogens; however, once it is accomplished, the pathogen enjoys unfettered access to a new host species, unencumbered by any preexisting immunologic experience, resulting in epidemic disease. Ancient examples abound – endemic camelpox in domesticated camels became human smallpox, bovine rinderpest became epidemic human measles, bovine tuberculosis became human tuberculosis, and swine influenza became human influenza. More recent historical examples include the cross-species adaptation of human immunodeficiency virus (HIV) from simian immunodeficiency virus (SIV) of non-human primates (Kalish et al. 2004); spongiform encephalopathy from sheep to cattle and on to humans as variant Creutzfeldt-Jakob disease (Stevens et al. 2006); avian influenza from water fowl (Herzog et al. 2004); and severe acute respiratory syndrome (SARS) from civet cats (Margaret et al. 2004).

Other disease transmission factors also became important in evolving human societies. Peri-domestic rodent populations, sustained by the enormous amounts of refuse generated by large population centers, developed into efficient reservoirs for infections such as epidemic typhus and plague. Large population densities of humans in fixed, farming communities provided the essential substrate for efficient airborne transmission of respiratory pathogens and for a sufficient number of partner exchanges to maintain sexually transmitted diseases as well (Sherman 2007). With the acceptance of the germ theory of disease, novel modalities succeeded in protecting human populations, primarily through enhanced sanitation, public health efforts, and, as vaccine science evolved, vaccination. The fundamental historical events that gave rise to the fields of microbiology, immunology, and infectious diseases will be described in this chapter (Fig. 3.1).

Fig. 3.1 Major Milestones in Microbiology

3.2 Early Concepts of Contagion and Protection

Epidemic, transmissible diseases were documented in the recorded histories of early yet advanced civilizations. Ancient Hebrew texts refer to “plagues” that beset the Pharaohs in Egypt more than 1,000 years before the birth of Christ; the Greeks and Romans each experienced cataclysmic outbreaks that had profound impacts on their respective empires. With each of these “plagues,” enlightened observers noted the phenomenon of resistance upon re-exposure to the same disease process. The Greek historian Thucydides recorded such observations regarding smallpox, and there is evidence that the Chinese exploited this knowledge in the sixteenth century in their practice of variolation (Leung 1996).

Much later, the intercontinental exchange of people and pathogens during the age of exploration to Africa and the New World in the sixteenth and seventeenth centuries dramatized the concept that some form of “natural resistance” to disease was often intrinsic to native populations yet lacking in the newly exposed (Diamond 1999). Africans, forcibly exported to America as slaves, were noted to be more resistant to tropical diseases such as yellow fever and malaria than sharecroppers of European descent. This was most evident upon first arrival in the colonies, during an acclimatization period landowners called “seasoning.” European farmers died in droves from sickness and disease in the Southern colonies, as did captured Native Americans transported from New England and elsewhere to work in the fields there and in the Caribbean (Morgan 1975), thus furthering the African slave trade as an economic expediency for the rapid expansion of a healthy labor force.

Indigenous Amerindian peoples were highly susceptible to smallpox, first introduced into the New World by the Spanish Conquistadors in the early 1500s. Cortez and Pizarro unwittingly took advantage of this phenomenon to subjugate the Aztec and Inca Empires, respectively. In 1763 Lord Jeffrey Amherst, commander of British troops in North America during the French and Indian War, took this knowledge a step further, using smallpox as a biological weapon against the hostile Native American forces in Pennsylvania. Blankets were deliberately contaminated with the scabs of smallpox victims and left for the Indians in wintertime. Whether acquired from the fomites or via human-to-human transmission, smallpox devastated the Indians who had sided with the French forces, contributing to the British victory (Diamond 1999).

Back in the Old World a dramatic epidemic of another kind was underway. Shortly after Columbus’ first return voyage in 1493, an epidemic of “great pox” occurred throughout much of Europe. “Great pox” aptly described the clinical appearance of the cutaneous lesions of secondary syphilis, in contradistinction to the familiar appearance of smallpox. While it is possible, even likely, that some of Columbus’ crew contributed to the spread of syphilis throughout Europe, they were likely the vector, rather than the original source of infection. Skeletal remains found in both Britain and Greece and dated well before Columbus made his famous voyage carry the unmistakable stigmata of the osseous forms of tertiary syphilis. It is likely that syphilis existed in Europe prior to Columbus in relatively rare and localized forms, arriving from the Mediterranean via trade routes established centuries earlier. After the defeat of the Islamic Moors in the Battle of Granada in 1492, a Papal order closed all leprosaria, institutions that probably housed numerous, misdiagnosed, syphilitic patients within their confines. Release of these highly infectious individuals, coupled with the rampant prostitution practices of the time, likely contributed to the spread of the disease across Europe. This newly recognized and highly virulent form of syphilis continued to be epidemic into the first half of the sixteenth century (Sherman 2007).

The Renaissance brought forth the Age of Enlightenment with its remarkable advances in science and the arts, adding to the major advances that had already occurred in the first and early second millennia AD in China, India, Persia, and the Islamic world. Although the fundamental principles of the scientific method were originally described by the Franciscan friar Roger Bacon in 1269, multiple factors limited the work of scientists and intellectuals during the ensuing 400 years. For microbiology and immunology, the major impediment was the lack of tools and techniques to adequately study microscopic events.

Using his powers of observation and knowledge of epidemics, the Italian physician Girolamo Fracastoro, or Hieronymus Fracastorius, wrote a treatise on the germ theory of disease entitled “De Contagione” in 1546. Fracastorius correctly surmised that tiny, free-living organisms, which he referred to as “seeds of disease,” existed in nature. Despite their being invisible to human eyes, he postulated that these disease-causing organisms could be transmitted from person to person directly or via fomite intermediaries, thereby spreading contagion (Gensini and Conti 2004). He further proposed that syphilis was caused by such a microscopic organism. In his poem entitled “Syphilis sive Morbus Gallicus” (“Syphilis or the French Disease”) he described in remarkably accurate, yet mythical, poetic detail the clinical consequences of syphilis (Conrad et al. 1995). The Italians blamed syphilis on the French, hence the name “the French Disease”; the French, on the other hand, referred to it as “the Italian Disease.” This pattern of naming the syphilis epidemics after local, political, or religious adversaries continued as the scourge spread throughout the western world and the Middle East (Sherman 2007).

The Dutch textile merchant and self-taught scientist Antonie van Leeuwenhoek (1632–1723) is credited with first identifying microorganisms, or “little animals,” using his newly developed microscope in 1677, thereby confirming Fracastoro’s hypothesis (Corliss 2002). The critical significance of these tiny forms to human health was not fully appreciated until almost 200 years later, when Pasteur and Koch first successfully cultured bacterial organisms from diseased tissues. Despite the technical shortcomings of the period between van Leeuwenhoek (Fig. 3.2) and Pasteur, a number of scientists and physicians correctly hypothesized the existence of microscopic organisms and their contribution to human disease.

Fig. 3.2 Antonie van Leeuwenhoek (Rijksmuseum, The Netherlands)

Regrettably, theories of contagion still lacked the tools for scientific proof, and warnings about disease transmission were therefore largely ignored, often with tragic consequences. The Viennese physician Marcus Plenciz presented a lucid explanation for the clinical observations made up to that time, proposing a germ theory of disease as early as 1762. Subsequently, Jakob Henle, a noted German physician and anatomist, further advanced the germ theory concept in 1840 (Gensini and Conti 2004). Such theories were still ahead of the scientific technologies needed for their validation; however, empiric evidence supporting these ideas mounted dramatically with the seminal observations of two European physicians in the mid-nineteenth century.

3.3 Mounting Evidence for the Germ Theory of Disease

In the early 1840s a young Hungarian obstetrician embarked on a line of scientific investigation, informed by a series of observations, that would eventually revolutionize the concept of disease causation (Wyklicky and Skopec 1983). Ignaz Semmelweis (1818–1865) was a faculty member of the Lying-In Hospital in Vienna, Austria, which consisted of two obstetrical services that alternated admissions on a daily basis. The first service was operated by physicians and medical students; the second by midwives. The mortality rate for puerperal or “childbed” fever was such that one out of ten pregnant women could be expected to die of this dreaded complication shortly after delivery. Semmelweis (Fig. 3.3) observed that the mortality rate was almost tenfold higher in the physician service than in the midwife service (Nuland 1979). He recognized that the putrid odor associated with women dying of puerperal fever was similar to that emanating from corpses during autopsies by the medical faculty and students.

Fig. 3.3 Ignaz Semmelweis (Wellcome Library)

Autopsies were a critically important component of medical education at the time; they were employed as a primary tool to teach anatomy and pathology to medical students. Semmelweis noted that the same malodorous smell was found on the hands of doctors and students moving from the autopsy room to the labor and delivery rooms. He also observed that the death rate from puerperal fever in the physicians’ clinic decreased significantly when the medical students were on vacation and no autopsies were being performed. Lastly, he witnessed the death of one of his close friends, Jakob Kolletschka, a pathologist who died shortly after cutting his finger during an autopsy of a woman who had recently died of puerperal fever. He correctly hypothesized that some form of “putrid matter” must be carried on the hands of physicians during their rounds between the autopsy and birthing tables and might be transmitted to pregnant women, causing this highly lethal peripartum illness (Jones 1970).

Semmelweis made these observations with no formal training in microbiology, as the latter did not yet exist as a distinct area of science. In fact, the germ theory of disease was not taught in medical schools in Europe or elsewhere. Semmelweis found that washing his hands in a dilute, chlorinated lime solution after performing autopsies removed the putrid odor. Based on his empiric observations, but lacking definitive proof of his hypotheses, he boldly introduced a policy whereby all medical students and faculty were required to wash their hands in this solution before having contact with patients. In 1847 Semmelweis showed that the introduction of hand washing between patient contacts reduced the mortality rate from puerperal fever fourfold within a year (Wyklicky and Skopec 1983).

As seen throughout history, innovative ideas that contradict prevailing wisdom are vulnerable to immediate rejection; additionally, Semmelweis was guilty of poor timing. Although he had demonstrated the benefit of a simple intervention, it came at a moment of great geopolitical turmoil and was met with considerable acrimony, much of it politically motivated. He experienced profound, negative professional and personal consequences of his work. By 1848 the concept of revolution was spreading throughout Europe; within the Habsburg Empire, of which Austria and Hungary were a part, the monarchy was at risk of crumbling under the separatist demands of Hungarian nationalists. A wave of political and social conservatism took hold in Austria. When this young, talented Hungarian faculty physician with his radical new ideas about health care came up for reappointment, he was passed over and forced to resign. He returned to Hungary, where his novel prevention strategy against puerperal fever was implemented with success.

Semmelweis also failed to capitalize on his position in Vienna. Because his oratory and literary skills in German were inadequate, it was difficult for him to communicate his ideas effectively to colleagues (Nuland 1979). Additionally, he was by reputation dogmatic and inflexible, traits that further alienated him from his peers. It did not help matters that it took him over a decade to write the definitive review of his investigations into the etiology and prevention of puerperal fever; when the manuscript was finally produced in 1861, it was a rambling, confused report that convinced few of his skeptics and was roundly criticized as poorly formulated and unscientific. Semmelweis countered with a series of harsh diatribes against his critics, essentially accusing his fellow physicians of killing their patients through negligence and intransigence toward his new ideas about hand washing. His behavior in public and private became increasingly erratic; he fell into a deep melancholy, eventually resulting in his involuntary commitment to an insane asylum. The final details of his demise remain shrouded in mystery; he apparently died of bacterial sepsis from injuries sustained when he attempted to escape from this mental institution.

Semmelweis died at the age of forty-seven, never seeing his radical notions regarding transmissible microscopic organisms as the cause of disease, and hand hygiene as its solution, widely acknowledged or appreciated by the medical or scientific communities. He stood firm until his final days: “In a word, the carrier is anything contaminated with decomposed animal organic material that comes in contact with the vaginal tract of the parturient. If I shall be denied the privilege of seeing with my own eyes the conquest of puerperal fever, the conviction that sooner or later this thesis will find acceptance, will cheer my hour of death.” (Wangensteen and Wangensteen 1978).

Epidemiologic evidence of microorganisms as a cause of human disease was being observed in community outbreaks as well as in hospital wards. In the late 1840s and early 1850s, large, community-wide outbreaks of cholera gained much public attention. Massive population expansion into overcrowded, unhygienic urban areas had occurred throughout the nineteenth century as a result of the industrial revolution. Although the flush toilet had been patented in 1819, it was not in widespread use, and the effluent from toilets and public privies was deposited into local rivers, converting municipal sources of drinking water into open sewers.

In 1849, a prominent London physician, John Snow (1813–1858), published a pamphlet in which he speculated that cholera was a waterborne or foodborne intestinal illness (Snow 1855). In so doing he directly challenged the prevailing “miasma theory,” which held that cholera and other diseases resulted from bad air. Such thought was widely accepted at the time through traditional teachings and the influential experimental work of the German chemist Max von Pettenkofer. In 1854 a cholera outbreak occurred in London that provided compelling evidence in favor of Snow’s alternative hypothesis.

Snow (Fig. 3.4) carefully mapped the incident cases of cholera among the residents of downtown London and noted their proximity to public water-drawing sites. He observed that the highest incidence of disease was centered at the corner of Broad and Cambridge Streets, the site of a pumping station for drinking water. The water intake for this pump was drawn from a location in the Thames River just downstream of a large sewer effluent from London. Using interviews of cases and contacts and statistical assessments, methods that would become standard fare for future outbreak investigators but were novel at the time, Snow deduced that the infection was transmitted by contaminated water. As a result of his evidence, the handle was removed from the Broad Street pump, forcing local residents to seek water from other pumping stations. The epidemic, probably already waning, was halted. Snow is appropriately credited as the founding father of the field of epidemiology based on this work. Although he microscopically examined the contaminated water supplies and observed “small, white flocculent particles” that he speculated were the causative agent of cholera (Johnson 2002), he never obtained definitive microbiologic proof.
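Snow's basic analytic move can be expressed as a comparison of attack rates across exposure groups. The sketch below illustrates that arithmetic with invented counts, not Snow's actual data; only the grouping of cases by water source is carried over from the narrative above.

```python
# A minimal sketch of Snow-style reasoning: compare cholera attack rates
# by the water source households used. Counts are hypothetical.

outbreak_data = {
    "Broad Street pump": {"cases": 90, "residents": 1_000},
    "other pumps":       {"cases": 12, "residents": 4_000},
}

for source, d in outbreak_data.items():
    rate = d["cases"] / d["residents"] * 1_000  # cases per 1,000 residents
    print(f"{source}: {rate:.1f} cases per 1,000 residents")

# A sharply higher rate in one exposure group (90.0 vs 3.0 per 1,000 here)
# is the signal Snow read off his map before the pump handle was removed.
```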

Fig. 3.4 John Snow (Wellcome Library)

Snow’s recommendations included a number of other sanitation measures, such as washing the clothes and bed linens of cases, isolating sick people from healthy ones, and boiling water supplies; all of these helped to curtail further cases of cholera. It would take another 30 years before Koch and his colleagues finally isolated Vibrio cholerae, the etiologic agent of this dreaded epidemic disease (Snow 1855; Sherman 2007). Nonetheless, through careful epidemiologic study, Snow had been able to infer the probable etiology of cholera and implement effective public health measures to prevent future outbreaks.

Around the time Semmelweis was making his seminal observations on an obstetrical infection in Vienna, the English botanist and clergyman Reverend Miles J. Berkeley was unraveling the mysterious etiology of another devastating infection with major socioeconomic implications, the potato blight, lending further support to the growing body of evidence in favor of the germ theory of disease. Berkeley, a mycology expert, noted the unmistakable presence of microscopic mold elements in diseased plants in 1846. The potato blight would, over the next few years, lead to the death of one million Irish and to the mass emigration of approximately two million of their countrymen from their homeland, never to return (Sherman 2007). Berkeley’s observations were predictably mocked by the scientific community, as it was generally accepted at the time that the potato blight was due to cold and damp “miasma.” In 1861, the same year that Semmelweis wrote his now famous if flawed paper on puerperal fever, Anton de Bary, a German plant pathologist and mycologist, conclusively proved that the etiology of potato blight was in fact a fungus – Phytophthora infestans (literally “the plant destroyer”) – by essentially following the same lines of scientific reasoning that would set the standard for microbial causation two decades later in a Berlin tuberculosis laboratory.

3.4 Microbiology Comes of Age: Louis Pasteur

The actual inception of microbiology as a distinct science traditionally dates to 1857, when Louis Pasteur (1822–1895) convincingly demonstrated that microorganisms were responsible for the fermentation of fluids, although significant incremental advances had occurred in the period since van Leeuwenhoek’s microscopic observations (Wainwright 2001). Pasteur’s work debunked the extant theory of “spontaneous generation” and showed instead that fermentation, spoiling, or contamination of organic substances was due to the presence of environmental microorganisms (Johnson 2002). With these investigations Pasteur (Fig. 3.5) essentially proved the germ theory of disease and launched the field of modern microbiology.

Fig. 3.5 Louis Pasteur (Institut Pasteur)

Although the germ theory of disease had its renowned proponents, including Jakob Henle and Edwin Klebs, both German physicians and contemporaries of Pasteur, it also attracted many influential detractors. Using early prototype microscopes, van Leeuwenhoek and Robert Hooke had clearly demonstrated the presence of unicellular protozoa and tiny bacteria – the “little animalcules” – as early as 1677 (Gest 2007). Plant pathologists and mycologists had already demonstrated the essential role of microorganisms as the cause of selected diseases in plants. Yet it was still unproven whether microorganisms could actually cause human diseases. Moreover, debate smoldered as to whether these organisms arose spontaneously from substances already present in devitalized tissue or whether they derived from exogenous sources and had to be implanted to cause disease.

One of Pasteur’s foremost contemporary critics was Félix Archimède Pouchet, director of the Natural History Museum in Rouen and one of the main advocates of spontaneous generation. Owing to the scientific and even political importance of the debate, the French Academy of Sciences offered a monetary prize in 1862 to the scientist who could provide definitive evidence to prove or disprove the concept of spontaneous generation. Pasteur accepted the challenge and won the award through a series of elegant and carefully executed experiments that eliminated the possibility of spontaneous generation. He showed that heat sterilization, chemical sterilization, or filtration of air and water could maintain organic materials in sterile condition indefinitely, without any microbial growth (Debré 1998).

Techniques of sterilization and “Pasteurization” of dairy products were soon introduced and undoubtedly saved millions of lives in the period that followed. Pasteur established the Pasteur Institute through a combination of major private financing and public monies. The Institute soon became an international center for microbiology, immunology, and medicine, largely due to the efforts of Louis Pasteur himself.

Pasteur’s work inspired the British surgeon Joseph Lister (1827–1912) to attempt sterile methods to protect the wounds of trauma patients at the orthopedic infirmary in Glasgow, Scotland, in 1867. Realizing that filtering all air or heating the patient to maintain sterility was impractical in the clinic, Lister (Fig. 3.6) investigated the use of chemical disinfectants as a method of preventing wound infections. Based on the discovery by local farmers that carbolic acid decreased the fetid odor of the common fertilizer “night soil” (i.e., human excreta), Lister demonstrated the value of dilute solutions of this chemical in maintaining the sterility of dressings, surgical instruments, and the hands of surgeons caring for injured patients (Harding-Rains 1977).

Fig. 3.6 Joseph Lister (Wellcome Library)

Lister’s findings were favorably received by the scientific community, and the use of sterile technique in the care of surgical patients was adopted as an international standard (Bynum 1994). Lister succeeded in establishing the principles of antisepsis where his predecessors, most notably Semmelweis, had failed, because the germ theory of disease had by this point garnered widespread acceptance through the efforts of Pasteur (Wangensteen and Wangensteen 1978).

Pasteur’s celebrity and stature within the scientific community attracted talent from many parts of the world. He surrounded himself with a large number of dedicated and capable investigators, thereby greatly enhancing the prestige of the Institute that bore his name. A number of his students, assistants, and colleagues made major contributions to the fields of infectious diseases, microbiology, and immunity, including Charles Chamberland, who invented the autoclave and the porcelain filter later used in the discovery of viruses, and who developed a Pasteurella vaccine; Alexandre Yersin, co-discoverer of the plague bacillus; Emile Roux, who discovered diphtheria toxin and antitoxin; Jules Bordet, who discovered the whooping cough bacillus and complement; Ilya Metchnikoff, who discovered the process of phagocytosis and provided the initial descriptions of innate immunity; and Albert Calmette, who discovered cobra antivenin and co-developed Bacille Calmette-Guérin, the first effective tuberculosis vaccine (Debré 1998).

Pasteur used his powers of experimental observation to move the burgeoning field of microbiology to its logical next step – protection against pathogens. In an ironic nod to his own axiom that chance favors the prepared mind, coined earlier in his industrial chemistry career, Pasteur serendipitously discovered the phenomenon of laboratory attenuation of microorganisms and extrapolated his findings into a means of developing targeted vaccines. In 1879, Pasteur observed that after serial passage the chicken cholera bacillus, now known as Pasteurella spp., lost the capacity to kill chickens into which it was injected. Because chickens were in short supply in the laboratory, Pasteur was forced to recycle the same animals in subsequent experiments using freshly passaged and highly virulent strains of bacteria. Remarkably, the chickens previously exposed to attenuated bacilli survived infection with virulent strains, whereas naïve chickens died rapidly upon challenge. He surmised that serial passage of the bacteria at certain elevated temperature ranges and in the presence of oxygen resulted in organisms that could induce resistance to challenge with virulent forms of the same bacteria.

Pasteur recognized that this technique of “artificial attenuation” could replace the need to identify naturally attenuated microorganisms, as Jenner had done with cowpox in milkmaids, and that this phenomenon could revolutionize the concept of vaccines. This finding, perhaps more than any other since Jenner’s, opened up a new epoch in the battle against communicable diseases, one in which the microbiology laboratory performed a pivotal function. With this technology, Pasteur rapidly developed successful vaccines against anthrax in 1881 and rabies in 1885.

3.5 Robert Koch and the Berlin School of Microbiology

Even as Paris was fast becoming the center of research in the nascent field of microbiology, a country doctor from Prussia was beginning his career in microbiology essentially as a weekend hobby. Robert Koch (1843–1910) studied medicine at the University of Göttingen, where he came under the influence of the notable Professor of Anatomy Jakob Henle, an early proponent of the germ theory of disease, and learned the importance of careful animal experimentation in understanding disease causation. In the 1870s, as a district medical officer in the Prussian town of Wollstein, Koch (Fig. 3.7) began his investigations into the etiology of anthrax in sheep; this marked the beginning of a distinguished career in scientific research (Brock 1988). He identified anthrax bacilli in the blood of infected sheep and successfully transmitted the infection to healthy experimental animals. Using careful photomicroscopy and detailed drawings, he accurately described the life cycle of anthrax and the process of endospore formation. With the publication of this work in 1876, Koch became a major force in the fledgling field of microbiology.

Fig. 3.7 Robert Koch (Robert Koch Institute)

Koch pioneered a number of laboratory techniques. He employed oil-immersion microscopy to study bacteria; developed new staining methods for bacterial identification; and invented procedures for the isolation of pure bacterial cultures on solid media, the latter facilitated by the use of agar as the solidifying agent in flat “Petri” dishes, named after their inventor, Richard Petri (a colleague of Koch), and still in common use today. To obtain pure growth he insisted upon the use of single colony isolation, “the Koch plate technique” (Kaufmann and Winau 2005), acclaimed even by his rival and eventual antagonist, Pasteur, who was noted to remark, “C’est un grand progrès, monsieur” (Brock 1988).

While serving as a senior medical officer in the Imperial Health Office in Berlin in 1882, Koch discovered the microbial etiology of tuberculosis, perhaps the most important infectious cause of death at the time, making him a household name (Ryan 1992). Using differential staining techniques, careful microscopy, and solid agar methods, Koch isolated the causative agent, Mycobacterium tuberculosis, in pure culture (Dubos and Dubos 1956). It was in this context that he initially proposed a set of criteria that had to be satisfied to infer an etiologic role for a specific bacterial agent in a particular disease. These conditions came to be known as “Koch’s Postulates” and were eventually refined by Koch: the pathogen accounts for the clinical and pathological features of the disease and must be found in every case in which the disease occurs; the pathogen is not found in other diseases as a fortuitous, nonpathogenic parasite; after being isolated from the body and repeatedly passaged in pure culture, the pathogen can induce the disease in animal models; and the same pathogen must be re-isolated from the experimental animal (Brock 1988). These criteria remained the gold standard upon which to judge evidence of microbial disease causation and are still valid to some extent today.

Koch, like Pasteur, surrounded himself with brilliant colleagues and collaborators and simultaneously attracted strong supporters and equally vocal detractors. Contemporary physicians who rejected the germ theory in favor of other theories of disease causation included Max von Pettenkofer, the influential Munich hygienist, and the celebrated cellular pathologist Rudolf Virchow. Pettenkofer espoused the “sanitation theory” of disease, widely supported by social liberals, which held that poor sanitation, unfavorable water and soil conditions, and damp weather generated miasma poisons that subsequently caused illness, primarily in socioeconomically disadvantaged populations. The cure for epidemics was therefore social progress and the elimination of poverty. Virchow, considered the founding father of cellular pathology and the most respected academic physician in Germany during Koch’s era, remained an ardent opponent of the germ theory of infectious diseases; he never completely embraced Koch’s discovery of the tubercle bacillus, despite the overwhelming scientific evidence (Brock 1988; Kaufmann and Winau 2005).

However, the germ theory of disease was embraced by the conservative Prussian government in Berlin, largely because its fundamental premise was that communicable diseases were the consequence of exogenous microorganisms invading the body, circumstances independent of socioeconomics. Pathogen control could then be viewed as achievable through central governmental controls, without having to address all the ills of society. The Prussian Parliament supported Koch’s work with lavish funding for the Koch Institute for Infectious Diseases, which opened in Berlin in 1891 (Brock 1988). The government’s interests were more than altruistic; these were fervently nationalistic times. When Koch’s team succeeded in isolating the Vibrio etiology of cholera in Egypt after Pasteur’s group had failed, the German government hailed it as proof of the superiority of German science over French science, and Koch was welcomed back to Berlin with a hero’s procession (Brock 1988; Kaufmann and Winau 2005).

Whereas Virchow eventually capitulated to at least public acceptance of Koch’s theories (he returned to Berlin after the Pathological Institute was built for him on the grounds of Koch’s Institute), Pettenkofer remained a vocal skeptic of the germ theory, even in the face of overwhelming evidence. He famously ingested a culture of Vibrio cholerae from Koch’s laboratory, claiming that he did not become ill and offering this as proof that the bacillus was not the etiologic agent of cholera. Subsequent reports suggested that Pettenkofer did experience mild diarrhea after this oral challenge, his mild course probably the result of partial immunity from a previous bout of cholera a few years earlier (Brock 1988). History clearly sided with Koch, Pasteur, and their supporters (Sherman 2007).

A serious rift developed between the two great contemporary microbiologists, Pasteur and Koch, during this time. They were each staunch patriots in a period of strident nationalism throughout Europe; enmity between their respective countries was firmly entrenched after the French defeat in the Franco-Prussian War in 1870. But a variety of other philosophical, cultural, and scientific differences existed between these two men. Although competition in the realm of science can be healthy and provide the impetus for discovery, adversarial competition can lead to secrecy and suspicion, thus impeding scientific progress. The tempestuous relationship between Pasteur and Koch vacillated between healthy and unhealthy competition throughout their careers. Fortunately, many of their coworkers were able to maintain more reasoned and collegial professional relationships (Dubos and Dubos 1956; Kaufmann and Winau 2005).

Some of the antagonism between Pasteur and Koch was based on miscommunication. Neither spoke the other’s language, setting the stage for errors in translation. At the Fourth International Congress of Hygiene and Demography in Geneva in 1882, each of these supremely accomplished scientists felt personally insulted by the other’s public remarks; in both instances there appeared to be no malicious intent (Brock 1988). The result was a series of vitriolic verbal and written exchanges that played out during the 1880s at scientific conferences and in the literature. The controversy had largely subsided by 1890, although Koch was conspicuously absent from the world’s celebration of Pasteur’s seventieth birthday in 1892 (Brock 1988).

Aside from their respective issues of national pride, Pasteur and Koch harbored major differences in their styles and scientific approaches (Table 3.1). Pasteur favored a vaccination approach to infectious diseases; Koch believed in a more population-based, public health approach to the problem. Nonetheless, their actions in many ways revealed a mutual, if muted, scientific respect; the Institute in Berlin, for example, was modeled after the Pasteur Institute.

Table 3.1 Fundamental differences between Koch and Pasteur

Koch’s Institute flourished, attracting a superb group of investigators and collaborators to the fields of microbiology and immunotherapy. Notables included Paul Ehrlich, co-discoverer of antibodies, antigens, and chemotherapy for infectious diseases; Richard Pfeiffer, who discovered bacterial endotoxin and the phenomenon of bacteriolysis and played a major role in the development of killed typhoid vaccines; Emil von Behring, discoverer of serum therapy for diphtheria and tetanus; and Shibasaburo Kitasato and Sahachiro Hata, Japanese scientists who made important contributions to serum therapy and the discovery of Salvarsan for the treatment of syphilis, respectively (Brock 1988). Koch’s standard methodologies for bacteriology continue to be used in clinical microbiology laboratories today, and though his classic “postulates” have been revised and revisited on numerous occasions (Relman et al. 1992; Fredricks and Relman 1996), Koch and his scientific rival Pasteur remain the two most influential figures in the history of microbiology (Kaufmann and Winau 2005).

Following on the heels of the landmark discoveries in bacteriology by Pasteur and Koch, advances in other disciplines of microbiology such as mycology, parasitology, and virology came at a rapid pace beginning in the late nineteenth century and continuing throughout much of the twentieth. In the 1870s, Patrick Manson, a Scottish physician working on tropical diseases in the Far East, confirmed the presence of microscopic parasites in mosquito vectors of filariasis. This discovery eventually led the British physician Ronald Ross, working in India, to definitively prove the parasitic nature of malaria and its transmission by mosquitoes in 1896 (Sherman 2007). Contemporaneously, agricultural scientists in the Netherlands and Russia discovered the “filterable agent” responsible for tobacco mosaic disease (Mayer 1886; Ivanowski 1892; Beijerinck 1898). These infectious particles were capable of passing through submicron filters known to capture bacteria; hence, a new discipline within microbiology was founded on submicroscopic entities that did not completely conform to Koch’s well-accepted scientific dogma. The history of virology and advances in laboratory methodologies for cultivating these microorganisms are considered in detail in Chapter 9.

3.6 Modern Advances in Microbiology

The history of microbiology in the twentieth century was dominated by research discoveries in genetics, nucleic acid biochemistry, and molecular biology. Since Charles Darwin’s description of natural selection and variation and Gregor Mendel’s work defining the laws of genetics in the mid-nineteenth century, scientists had sought the biochemical basis for the genes that determine the destiny of life forms on earth. Oswald Avery, Maclyn McCarty, and Colin MacLeod, working at the Rockefeller Institute, identified the “holy grail” of genetics in 1944 with their finding that the “transforming principle,” or genetic material, of Streptococcus pneumoniae was deoxyribonucleic acid (DNA), not protein as previously postulated (Lederberg 1994). This observation led to the elucidation of the structure of DNA in 1953 by James Watson, Francis Crick, Rosalind Franklin, and Maurice Wilkins (Watson 1968), which in turn enabled the deciphering of the genetic code and ushered in the modern era of molecular biology. The first complete genomic sequencing, that of a bacteriophage, was accomplished in 1977 (Sanger et al. 1982); that of a free-living organism, Haemophilus influenzae Rd, was accomplished nearly two decades later (Fleischmann et al. 1995), followed shortly thereafter by the first draft of the human genome in 2001 (Altshuler 1995; Venter et al. 2001). Recent advances, including the development of recombinant DNA technology, the polymerase chain reaction, and monoclonal antibodies, have revolutionized clinical microbiology and permitted the use of non-culture methods to diagnose fastidious or non-cultivatable organisms such as hepatitis C virus, Tropheryma whippelii, and a variety of other organisms that likely contribute to human disease (Fredricks and Relman 1996).
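To make the non-culture idea concrete, the sketch below mimics the core logic of a PCR-based assay in a few lines of Python: a specimen is called positive if both primer sequences flank a target region in the recovered DNA. The template and primer strings are invented for illustration and do not correspond to any real diagnostic assay.

```python
# A minimal sketch of in-silico PCR logic: find a forward primer site and
# the reverse complement of a reverse primer site, then report the
# predicted amplicon length. All sequences here are hypothetical.

def reverse_complement(seq: str) -> str:
    return seq.translate(str.maketrans("ACGT", "TGCA"))[::-1]

def find_amplicon(template: str, fwd: str, rev: str):
    """Return (start, end) of the predicted amplicon, or None if either
    primer site is absent (i.e., the assay would read as negative)."""
    start = template.find(fwd)
    end = template.find(reverse_complement(rev))
    if start == -1 or end == -1 or end + len(rev) <= start:
        return None
    return start, end + len(rev)

template = "TTGACGGCTAGCTCAGTCCTAGGTACAGTGCTAGCCATTAGGCGCGCCTTAA"
fwd, rev = "GGCTAGCTCAGT", "TTAAGGCGCGCC"
hit = find_amplicon(template, fwd, rev)
print("amplicon length:", hit[1] - hit[0] if hit else "no product (negative)")
```

The same detect-by-sequence principle, rather than detect-by-growth, is what allows identification of organisms that resist cultivation.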

3.7 A Brief History of Immunology

Understanding of the basic elements of the human immune response evolved rapidly in parallel with the acceptance of the germ theory of disease. The innate immune system evolved in multicellular organisms to defend against invasion by microorganisms. Adaptive or acquired immunity evolved relatively late in vertebrate evolution, through the acquisition of large retro-transposons within the genome, to accommodate the increasing longevity of complex organisms and to provide long-term immunologic memory against potential pathogens to which the host has had previous immunologic exposure. The mechanisms that underlie the capacity of the host to orchestrate an appropriate immune defense have been the focus of research for generations of scientists. Major milestones in the history of immunology are illustrated in Fig. 3.8.

Fig. 3.8 Milestones in the History of Immunology

The inception of immunology as a distinct discipline has its origin in the late nineteenth century, with the development of the cell-mediated and humoral theories of host defense. Ilya Metchnikoff (1845–1916) is credited with first recognizing phagocytosis as an important cellular defense strategy (Ambrose 2006). Metchnikoff (Fig. 3.9), a comparative zoologist from the Kharkov region of modern-day Ukraine, reasoned that the highly advantageous host defense he observed in starfish mesenchymal cells would be found in higher species as well (Silverstein 2003). Aware of the potential significance of his findings, he changed his career path to human pathology and microbiology. With colleagues at the Pasteur Institute, Metchnikoff confirmed that phagocytosis by neutrophils (“microphages”) and macrophages was an essential part of the innate immune response in humans. He promulgated the idea of cell-mediated immunity as a defense against specific sets of microbial pathogens in 1884.

Fig. 3.9 Elie Metchnikoff (Wellcome Library)

The German physicians Emil von Behring (1854–1917) and Paul Ehrlich (1854–1915), both assistants in Koch’s Institute of Hygiene laboratory in Berlin in 1890, recognized that serum factors prevented lethality from bacterial toxins such as those of tetanus and diphtheria (Jaryal 2001). These factors, termed “antitoxins,” were subsequently shown to be antibodies; Behring (Fig. 3.10) and Ehrlich (Fig. 3.11) demonstrated that protection could be passively transferred from one animal to another using serum alone. This formed the basis for serum therapy of toxin-mediated infectious diseases, a strategy that became widely used by both the Koch and Pasteur groups. Behring was awarded the inaugural Nobel Prize in Physiology or Medicine in 1901 for his work on immune therapy; Ehrlich and Metchnikoff shared the Nobel Prize in 1908 for their descriptions of humoral and cellular immunity, respectively (Silverstein 2005; Gensini et al. 2007).

Fig. 3.10 Emil von Behring (right) (Robert Koch Institute)

Fig. 3.11 Paul Ehrlich (Wellcome Library)

Another fundamental aspect of humoral immunity was discovered in 1896 by Jules Bordet, a Belgian physician working in Metchnikoff’s laboratory at the Pasteur Institute, who first identified a heat-labile serum factor that contributed to the protection induced by antibodies during serum therapy. Ehrlich observed the same property and referred to it as “complement,” to describe its complementary effect on the activity of antibodies (Walport 2001). It would take nearly another century for this phenomenon to be fully elucidated (Pillemer et al. 1956; Super et al. 1989).

One of the fundamental problems facing early immunologists was explaining how a seemingly infinite repertoire of diverse antibodies could be generated to maintain adaptive immunity against the myriad potential human pathogens and their antigens. None of the theories advanced in the early part of the twentieth century adequately explained the experimental observations regarding antibody diversity (Weiser et al. 1969).

Ehrlich first proposed the selection or “side chain” theory to explain antibody diversity. He hypothesized that specialized, inducible cells of the immune system existed with antibody-like molecules on their surfaces. Upon coming in contact with a relevant antigen, cells whose surface side chains bound the antigen with the highest affinity would be selected, become stimulated, and proliferate, releasing antibodies into the circulation. Karl Landsteiner’s work in the early twentieth century questioned whether the human body could plausibly respond in this manner to the full array of potential antigens found in the environment (Figl and Pelinka 2004). The Australian virologist-turned-immunologist F. Macfarlane Burnet proposed an alternative hypothesis in 1956, based on modifications of the theories of the Danish immunologist Niels Jerne. Burnet’s clonal selection theory, describing the activation, clonal proliferation, and subsequent targeted antibody secretion of lymphocytes after binding to a matched antigen, reconciled the experimental observations and was subsequently shown to be the correct explanation for the generation of antibody diversity (Burnet 1957).
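Clonal selection is, at heart, a simple select-and-amplify algorithm, and a toy simulation can make the logic explicit. The sketch below is illustrative only: receptors are random strings, “affinity” is mere character matching, and all numbers are invented. The one essential feature it preserves is that the diverse repertoire exists before the antigen arrives.

```python
# A toy sketch of clonal selection: a pre-formed, diverse repertoire of
# lymphocyte clones; the antigen "selects" the best binders, which then
# proliferate. All parameters are hypothetical.

import random

random.seed(1)
ALPHABET = "ABCDEFGH"

def affinity(receptor: str, antigen: str) -> int:
    # count positions where the receptor matches the antigenic epitope
    return sum(r == a for r, a in zip(receptor, antigen))

# diversity is generated *before* antigen exposure (Burnet's key point)
repertoire = ["".join(random.choices(ALPHABET, k=6)) for _ in range(10_000)]

antigen = "ABCDEF"
best = max(affinity(r, antigen) for r in repertoire)
selected = [r for r in repertoire if affinity(r, antigen) >= best - 1]

# clonal expansion: selected clones proliferate; the rest do not
expanded = selected * 100
print(f"top affinity {best}/6; {len(selected)} clones selected; "
      f"{len(expanded)} cells after expansion")
```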

Relatively rapid progress in elucidating the functional aspects of the human immune system began with the discovery of the origins of B cells and T cells in the 1950s by the Ohio State University graduate students Bruce Glick and Timothy Chang, who serendipitously identified the bursa of Fabricius as the site of antibody formation in chickens (Chang et al. 1955; Glick 1955; Adelman 1967; Ribatti et al. 2006). Soon thereafter Jacques Miller demonstrated that cell-mediated immune responses required thymic conditioning (Ribatti et al. 1965), and that thymectomy depleted the lymphoid organs of lymphocytes and abrogated these responses (Cooper et al. 1966).

The identification of human disease equivalents of the B cell and T cell deficiencies of experimental animals (Stehm and Johnston 2005; Peterson 2007) introduced a new era of cellular immunology (Silverstein 2001). Novel revelations emerged in rapid succession: the essential role of lymphocytes in allograft rejection and the fundamental nature of immune tolerance (Steinman 2007); subtyping and quantitation of T cells and B cells in the mid-1970s (Köhler and Milstein 1975); the role of natural killer cells and regulatory T cells (Sakaguchi et al. 2007); the details of antigen processing (Gordon 2007), presentation, and T cell signaling by macrophages (Zinkernagel and Doherty 1974); the critical interactions between T and B cells (Claman and Chaperon 1969); and the role of dendritic cells in antigen presentation (Steinman and Cohn 1973).

The last decades of the twentieth century witnessed renewed interest in the role of the innate immune system in host defense upon initial encounter with potential pathogens. The discoveries of pro-inflammatory cytokines such as tumor necrosis factor (Carswell et al. 1975) and interleukin-1 (Auron et al. 1984) were major milestones in understanding immune cell signaling and response, a field advanced further by the discovery of the Toll-like receptors (TLRs) in the 1990s (Beutler et al. 2006). The definition of TLR4 as the lipopolysaccharide receptor capped a century-long search for the receptor for bacterial endotoxin and brought immunology full circle, back to its bacteriologic roots (Beutler and Poltorak 2000).

3.8 Summary and Conclusions

Perhaps no other developments have had a greater impact on the health and welfare of humankind than those comprising the history of microbiology and immunology. Over the last 100 years, the mortality burden of infectious diseases has decreased substantially and the average lifespan has increased by over 30 years due to advances in public health, sanitation, vaccines, and anti-infective chemotherapy – all deriving from the sciences of microbiology and immunology (Centers for Disease Control 1999). Further advances are anticipated as the promises of the genomics era in which we now live, of systems biology, and of personalized medicine are realized in the coming decades. A remarkable story of directed inquiry into the fundamental nature of microbes and immune defenses preceded many of the current advances in medicine. Much work remains before the benefits of these discoveries can be applied equally worldwide.