Breakthrough!: How the 10 Greatest Discoveries in Medicine Saved Millions and Changed Our View of the World - Jon Queijo (2010)

Chapter 6. The Scratch that Saved a Million Lives: The Discovery of Vaccines


Clara and Edgar, Part I

Riding on the wave of an explosive sneeze, the microscopic enemy blasts out at 100 miles per hour, a cloud of 40,000 aerosolized droplets that instantly fills the room. Clinging to ocean-sized droplets, the invisibly small microbes drift about for several minutes, patiently awaiting their next victim. The wait is not long. Gently wiping the nose of her dying four-year-old child, Clara complies by the simple act of breathing.

The enemy makes landfall in Clara’s nose and throat and within hours has mobilized into nearby lymph nodes. Entering cells, it converts them into reproductive slaves, forcing them to churn out the enemy’s own offspring. In just half a day, each cell begins releasing a new generation, dozens of new invaders to join the expanding army, infecting more and more cells. Several days later, the enemy enters Clara’s bloodstream.

But as the deadly assault continues, Clara’s own protective army fails to mount a response. The enemy slips quietly by, unrecognized and undeterred...

* * *

The “enemy” is variola virus—smallpox—a fuzzy, brick-shaped microbe so tiny that a bacterium would tower over it like a small house, a red blood cell would dwarf it like a football stadium. Like other viruses, it is so genetically primitive that it exists in the creepy netherworld between the living and the dead. For tens of thousands of years, its ancestors lived in Africa, content to infect only rodents. But about 16,000 years ago, something within its sparse 200 genes mutated and gave birth to a new form—a virus that could infect only humans. From that time onward, the new strain ungraciously thanked its human hosts by killing 30% of those it inhabited.

Over thousands of years, the virus joined its human hosts in their migration out of Africa, into Asia, and eventually Europe. With each sick person capable of infecting five or six others, it traveled easily from culture to culture, triggering waves of epidemics. The first evidence of its presence in humans is seen in skin rashes on Egyptian mummies dating from 1580 BC. The first recorded smallpox epidemic occurred 200 years later during the Egyptian-Hittite war. By 1122 BC, smallpox-like diseases were being reported in ancient China and India...
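The arithmetic behind that spread is stark. As a rough illustration (not from the book) of the text's figure that each sick person could infect five or six others, the case count multiplies with every "generation" of infection:

```python
# Illustrative sketch only: exponential growth when each sick person
# infects about five others (the figure given in the text).
# The five-generation horizon is an arbitrary choice for illustration.
R0 = 5          # new infections per case (per the text)
cases = 1       # start from a single infected person
totals = []
for generation in range(1, 6):
    cases *= R0
    totals.append(cases)
    print(f"generation {generation}: {cases} new cases")
# Five generations later, one case has become thousands.
```

In reality, of course, immunity, death, and isolation slow the growth, but the sketch shows why a single traveler could seed an epidemic.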

Clara and Edgar: Part II

The enemy continues to multiply relentlessly throughout Clara’s body, but only now—two weeks later—does she experience her first symptoms. It begins with a high fever, chills, and exhaustion. Then severe headaches, backache, and nausea. Mercifully, the symptoms abate within days—just in time for the real trouble to begin. Mounting a whole new campaign of terror, the virus begins invading small blood vessels in her skin.

The infamous rash—the “Speckled Monster”—first emerges as small red spots on the tongue and in the mouth. Soon, it appears on her face, and within 24 hours it has spread over her entire body. For the next few weeks, the rash follows a hideous and predictable progression: Flat red spots rise into bumps. The bumps fill with a thick, milky fluid, develop belly button-like depressions, and then evolve into rounded pustules so packed with fluid that they feel like hundreds of beads embedded in the skin. As pustules erupt over the entire body and emit a repugnant smell, the effect is grotesque, as if evil itself were bubbling up from inside. Finally, the pustules dry into a crust and form a scab. When the scabs fall off, they leave a disfiguring landscape, a face pitted by scars...

But all of this assumes that Clara is still alive. In one-third of cases, often when the pustules are so widespread that they touch one another, patients die from an immune system so overwhelmed that it destroys the very tissues it is trying to save. The virus also attacks other parts of the body, leaving many survivors blind and with limb deformities. In the meantime, anyone close to the patient during the contagious rash phase may already be nursing the next generation.

* * *

One of the first and most devastating smallpox epidemics to be recorded, the Antonine plague, began in 165 AD and lasted until 180 AD. It killed from three to seven million people, and some theorize that it contributed to the fall of the Roman Empire. As the centuries rolled on, the virus continued its deadly global march by joining the Crusades and Arab expansion. By the 1500s, it threatened—and actually eliminated—entire civilizations. Brought to the New World by Spanish and Portuguese conquistadors, smallpox killed 3.5 million Aztec Indians and caused the downfall of both the Aztec and Incan empires. In the eighteenth century, smallpox was endemic or epidemic in most major European cities, killing 400,000 people a year, including five reigning European monarchs, and causing up to one-third of all blindness.

Clara and Edgar: Conclusion

Having buried their child a few days earlier, Edgar enters the room to care for his dying wife, Clara. He finds it unbearable to watch her agonizing struggle, recalling how he once suffered from the same illness when he was a child. In the final hours before Clara’s death—just two weeks after her symptoms began—there is an explosive sneeze, and Edgar performs the simple act of breathing in the air. The enemy makes landfall in Edgar’s nose and begins its next invasion.

But this time the virus is not so lucky. Inside Edgar’s body, it is immediately recognized by cells that remember his childhood encounter from long ago. The cells spring to life, multiply, and begin producing a deadly weapon: antibodies. Specialized proteins designed to target and attack this exact invader, the antibodies begin their work. They block the virus from latching onto cells; they stop it from entering cells; they prevent it from replicating in cells, and—for any that manage to survive—they neutralize and help destroy it. In the weeks that follow, Edgar does not experience a single symptom.

* * *

One of the first major clues that smallpox could be defeated was recorded in 910 AD by the Persian physician Rhazes (al-Razi). Considered the greatest physician of the Islamic world, Rhazes not only wrote the first known medical account of smallpox, but noted a curious—and critical—clue: People who survived smallpox were somehow protected against subsequent attacks.

About the same time, writings began to appear in China providing a second key clue: People could actually protect themselves from the disease by taking scabs from a victim, crushing them into a powder, and swallowing or scratching it into the skin. But though this unsavory-sounding practice—called variolation—seemed to work and eventually spread to other parts of Asia, including India, it was not widely adopted, perhaps because of one unfortunate side effect: The risk of accidentally contracting full-blown smallpox and dying.

And so the deadly epidemics continued over the centuries, spreading and periodically erupting around the globe. For 16,000 years, tragic stories like that of Clara and Edgar recurred in countless variations as the smallpox virus conducted its relentless death march through human civilization. Until finally in the late eighteenth century, a country doctor in Gloucestershire, England, performed a curious experiment that would change the world...

May 14, 1796: An historic turn of events

James Phipps, a healthy eight-year-old boy, is brought into a room where a doctor suddenly seizes his bare arm and cuts two shallow incisions into his skin. A cloud of tiny particles—taken from a sore on the hand of a dairymaid infected by a disease called cowpox—instantly fills the shallow wound.

Making landfall near the base of James’ epidermis, the microscopic enemy—the cowpox virus—enters nearby cells and begins replicating. But despite its distant relation to smallpox, this virus poses little danger. Within days, specialized cells in James’ body begin producing antibodies that target and attack the invader. The cowpox virus is quickly defeated, and James experiences only mild symptoms. But as the evidence will later show, James is not only protected from future attacks by cowpox: Because of the virus’s similarity to its deadly cousin, he is now also immune to smallpox.

* * *

Although it would be nearly another 100 years before scientists had even a rudimentary understanding of why it worked, when Edward Jenner inoculated James Phipps in May 1796 with infectious cowpox virus from a lesion on the hand of a dairymaid, he capitalized on clues that had been accumulating for more than 1,000 years. And in so doing, he laid the scientific foundation for one of the greatest breakthroughs in medicine: vaccines.

Vaccines’ clever secret: not fighting, but teaching the body to fight disease

It is fortunate for the human race that the world’s first vaccine was so effective against what many consider to be the world’s worst disease. Few people today remember the threat smallpox once posed to human civilization, but even as late as the 1950s—150 years after the discovery of an effective vaccine—smallpox continued to infect 50 million people yearly, killing two million of them. As the World Health Organization has noted, no other disease of the past or present has approached smallpox in the devastation it has caused the world population.

Since Jenner’s historic milestone 200 years ago, advances in vaccines have followed a long and remarkable road, reflecting the complexity of disease and the intricacies of the human body. Today, vaccines remain one of medicine’s most remarkable approaches to fighting disease for two reasons. First, unlike most treatments, vaccines do not directly attack disease. Rather, they teach the body how to fight a disease by training it to produce its own weaponry—antibodies. Second, in a delicious twist of biological irony, every vaccine is made from the very disease it is designed to fight—typically, a weakened or killed form of the bacteria or virus in question.

While the journey to understanding and creating vaccines was initially slow, soon a dizzying series of milestones would create a growing arsenal of vaccines. Today, vaccines have enabled us to control ten major diseases—smallpox, diphtheria, tetanus, yellow fever, pertussis, Haemophilus influenzae type b, polio, measles, mumps, and rubella (German measles).

But while Edward Jenner is properly credited for his role in the discovery of vaccines, often overlooked is the fact that the first true milestone occurred decades earlier in southern England, when a farmer named Benjamin Jesty took a daring risk to save his family from a local outbreak of smallpox. Jesty led his wife and two young children on a two-mile hike through a patchwork of hedgerows near the wooded slopes of Melbury Bubb and the River Wriggle. And there, in the cow pastures of Farmer Elford, he gathered his family, knelt down by a sick cow, and pulled out a sharp stocking needle...

Milestone #1 What the milkmaid knew: a daring experiment in a cow pasture

In some ways, it’s surprising no one had thought of it before. On the one hand, by the mid-1700s it was common knowledge among country folk that milkmaids who caught relatively mild cowpox were subsequently immune to deadly smallpox. Cowpox was an annoyance with which many farmers were familiar: Sporadically erupting on farms, it caused small pustules to form on the udders of cows and decreased their milk production. This did not please the farmers, nor did the fact that if one of their milkmaids contracted the disease through an open cut, she would soon develop similar pustules on her skin, along with fever, headaches, and other symptoms that forced her to stop working for a few days. Fortunately, the dairymaids quickly recovered, and when they did, they were immune not only to cowpox but—if folklore was to be believed—to smallpox.

On the other hand, variolation—the risky practice of inoculating people with a small amount of live smallpox to protect them against it—had been introduced to England in 1721 and by the mid-1700s was well-known and practiced by many physicians. Yet there remained a critical gap: Few people made the link between what dairymaids knew about cowpox and what doctors knew about smallpox variolation...until Benjamin Jesty’s historic family field trip to a nearby cow pasture.

Benjamin Jesty was a prosperous farmer who, despite a lack of medical training, had a reputation for intelligence and a penchant for innovation. Thus, in 1774, when a smallpox epidemic broke out in Jesty’s community in Dorset, fear for his family’s health set him to thinking. Although not everyone believed—or was even aware—that cowpox might protect a person from smallpox, Jesty had heard the rumors. In fact, years before, two servants, who both had previously contracted cowpox, had told him that they had later survived an outbreak of smallpox, despite caring for two boys with the highly contagious disease. Jesty filed this information away, just as he had filed away something else he had apparently learned from local doctors: the technique of variolation.

And so in the spring of 1774, putting those two facts together, 37-year-old Jesty took a leap of faith no one else had made. With a local smallpox outbreak under way, he led his family on a two-mile walk through the patchwork hedgerows and wooded slopes of Melbury Bubb, entered Farmer Elford’s pasture, and found a cow whose udders were marked by the distinctive sores of cowpox. Jesty then pulled out one of his wife’s stocking needles, dipped its slim point into an open lesion, and did something most people at the time would have considered ill-advised, if not immoral. He inoculated his entire family with the infectious cowpox material: immediately below the elbow of his wife Elizabeth (to avoid her dress sleeve) and above the elbows of his sons, Robert, 3, and Benjamin, 2. Jesty did not inoculate himself since he’d already had cowpox as a youth.

The experiment was nearly a disastrous failure. Within days, Elizabeth’s arm became inflamed and she developed a serious fever; she might have died had she not received care from a doctor. But happily, Elizabeth recovered, and the experiment proved a success. Jesty’s wife and two sons remained free of smallpox for the rest of their lives, despite later exposure to epidemics. What’s more, both sons’ immunity was confirmed when they were later variolated and had no reaction. (A lack of reaction to variolation is evidence that a person is immune to smallpox.)

Unfortunately, when news of Jesty’s experiment got out, it reportedly caused “no small alarm” in the neighborhood, particularly among those who viewed the mixing of substances between humans and animals to be an “abomination” against God. As news spread through the community, whenever Jesty attended local markets he was scorned, ridiculed, and even pelted with mud and stones.

Sadly, despite his successful gamble, Jesty never inoculated another person, and there is no written evidence that Edward Jenner even knew of Jesty’s experiment. Nevertheless, Jesty eventually received credit for his milestone, while Jenner would take the discovery to a whole new level—one that would eventually impact the world.

Milestone #2 From folklore to modern medicine: Jenner discovers the science of vaccination

What motivates a man to discover one of the top ten breakthroughs in the history of medicine? In the case of Edward Jenner, it was not simply a desire to conquer the deadliest disease in human history. Rather, it was the desire to spare others from something that had nearly killed him when he was just eight years old: a poorly conceived—if not downright bizarre—attempt to prevent the disease.

The irony is that when Jenner was variolated in 1757, the procedure had been practiced in England for 35 years and was considered reasonably safe and well accepted. While variolation clearly had its risks—about 1 in 50 people contracted full-blown smallpox from the procedure and died—it was still preferable to the 1 in 3 risk of death faced by those who contracted smallpox naturally. Nevertheless, in a misguided attempt to improve variolation, some physicians had begun devising “preparations,” in which, prior to being variolated, patients were subjected to weeks of purging, enemas, bleeding, fasting, and dieting. The ordeal was so extreme that the preparation itself sometimes proved fatal. Narrowly missing this fate as a child, Jenner later recalled of his six-week regimen, “There was taking of blood until the blood was thin, purging until the body was wasted to a skeleton, and starving on a vegetable diet to keep it so.”
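The chapter's figures make the grim trade-off concrete; a quick back-of-the-envelope comparison (illustrative only, using the approximate risks quoted above):

```python
# Rough comparison of the mortality risks quoted in the text.
variolation_death_risk = 1 / 50  # ~2% died of smallpox caught from the procedure
natural_death_risk = 1 / 3       # ~33% died after catching smallpox naturally
ratio = natural_death_risk / variolation_death_risk
print(f"Natural smallpox was roughly {ratio:.0f}x deadlier than variolation")
# Natural smallpox was roughly 17x deadlier than variolation
```

Seen this way, variolation was a rational gamble for its time, which is why the later discovery of a far safer alternative (vaccination) mattered so much.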

But Jenner’s pain was humanity’s gain. Thanks to his frightful experience, he developed a lifelong aversion to variolation and a powerful motivation to find a better way to prevent smallpox. As with Benjamin Jesty, the pieces of the puzzle came to Jenner gradually over the course of many years. Born in Gloucestershire in 1749, Jenner came upon one of his first major clues when he was just 13 years old. Working at the time as an apprentice to a surgeon, he was intrigued when he overheard a dairymaid boasting, “I shall never have an ugly pockmarked face.” She was referring, of course, to the facial scarring often seen in those lucky enough to survive smallpox. The reason for her confidence? “I shall never have smallpox,” she explained, “for I have had cowpox.”

The dairymaid’s confidence in local folklore made a strong impression on the young Jenner. From that time forward, he had a persistent curiosity about the link between cowpox and smallpox. Unfortunately, this persistence was not shared by peers, as seen early in his career, when Jenner raised the topic one too many times at an informal medical society. According to his friend and biographer Dr. John Baron, “It became so distasteful to his companions [that] they threatened to expel him if he continued to harass them with so unprofitable a subject.”

In 1772, after completing his medical training, 23-year-old Jenner set up a medical practice in Berkeley, Gloucestershire. Around 1780, still intrigued by the connection between cowpox and smallpox, he began collecting case reports of people who had been infected with cowpox and were later found to be immune to smallpox (as seen by their lack of reaction to variolation). In 1788, Jenner made sketches of cowpox lesions he had seen on the hands of infected milkmaids, took them to London to show to several doctors, and discussed his idea that cowpox could protect against smallpox. Most were unimpressed. Similarly, when Jenner later asked some medical colleagues for help in investigating the link, they insisted the idea was ridiculous and merely an old wives’ tale.

But Jenner remained undiscouraged and continued his investigations until he could no longer avoid his milestone destiny: On May 14, 1796, taking the matter into his own hands, Jenner performed his first vaccination on eight-year-old James Phipps, the son of a laborer who occasionally worked for Jenner. Jenner inoculated the boy with infectious cowpox “matter” taken from the hand of a milkmaid named Sarah Nelmes, who had picked up her infection from a cow named Blossom. Like Benjamin Jesty, who had achieved the same milestone 22 years earlier, Jenner’s experiment proved to be a success: When Phipps was variolated six weeks later, and again a few months after that, the lack of reaction showed that he was indeed immune to smallpox. In fact, Phipps went on to live a long life free of smallpox and even had himself variolated some 20 times to prove his immunity.

Yet despite Jenner’s victory, news of his success was no more welcome than that of Jesty’s 20 years earlier. When in 1796 he submitted a paper to the Royal Society describing the Phipps experiment and 13 case histories of people who became immune to smallpox after immunization with cowpox, the paper was promptly rejected due to a lack of sufficient data. What’s more, Jenner’s experiment was deemed “at variance with established knowledge” and “incredible,” and Jenner was warned that he “had better not promulgate such a wild idea if he valued his reputation.”

Jenner could do nothing about the “wild” or “incredible” nature of his idea, but he could gather more data. Unfortunately, he had to wait more than a year for the next outbreak of cowpox, but when it finally occurred in the spring of 1798, Jenner inoculated two more children. He then inoculated several more children from lesions on one of the first two children—the so-called “arm-to-arm” method. When variolation later showed that all of the children were immune to smallpox, Jenner knew his case was proven. But rather than approach the Royal Society again, he self-published his findings in a now-classic 64-page paper, An Inquiry into the Causes and Effects of Variolae Vaccinae, or Cowpox.

Making the leap: from publication to public acceptance

Jenner made many important claims in this paper, including: inoculation with cowpox protected against smallpox; protection could be propagated from person to person by “arm-to-arm” inoculation; and that unlike smallpox, cowpox was not fatal and produced only local, noninfectious lesions. The paper also included Jenner’s first use of the term vaccinae (from the Latin vacca, meaning cow), from which “vaccine” and “vaccination” would be derived.

Yet even with these new findings, Jenner continued to face doubts and derision from his peers. Opposition ranged along many fronts, with some physicians disputing that cowpox was a mild disease, others claiming that vaccination didn’t work when they tried to repeat Jenner’s experiment, and still others opposing vaccination on religious or moral grounds. Perhaps the most bizarre objection came from those who claimed that their attempts at vaccination resulted in patients developing “bovine” characteristics—a notion that led to one cartoon showing vaccinated babies with “cow horns” sprouting from their heads.

Eventually, however, as more credible doctors tried the technique, more positive reports began to emerge. The vaccine seemed to work after all, although debates continued over its effectiveness and safety. In the meantime, Jenner continued his work and published more papers that clarified or revised his views based on new evidence. While Jenner was not always right—he incorrectly believed that vaccination provided life-long protection—the practice of vaccination began to spread surprisingly fast. Within a few years, vaccinations were being administered not only in England, but throughout Europe and soon in other areas of the world. In America, the first vaccinations were given on July 8, 1800, when Benjamin Waterhouse, a professor at Harvard Medical School, inoculated his 5-year-old son, two other children, and several servants. Waterhouse later sent the vaccine to President Thomas Jefferson for distribution to southern states, and Jefferson soon had his entire family and neighbors (some 200 people) vaccinated.

By 1801, Jenner had no doubts about the success of vaccination, as seen when he wrote, “The numbers who have partaken of its benefits throughout Europe and other parts of the globe are incalculable, and it now becomes too manifest to admit of controversy that the annihilation of the Small Pox, the most dreadful scourge of the human species, must be the final result of this practice.”

Although no one in Jenner’s time remotely understood how vaccines worked or what caused smallpox, and though not technically the “first” person to vaccinate a person against smallpox, today historians give Jenner primary credit for this milestone because he was the first to scientifically demonstrate that vaccines can work. Equally important, he gave the world its first reasonably safe way to stop the deadliest disease in human history.

* * *

And yet...despite Jenner’s success, it was soon clear that his vaccine had some serious shortcomings. For one thing, the immunity was not life-long, and no one was exactly sure why. Some scientists speculated that the loss of potency might be due to a concept called “passages,” the progressive weakening that occurred as the vaccine was continually transferred through “arm-to-arm” inoculation. In other words, perhaps the “agent” responsible for conferring immunity somehow lost more and more of its disease-fighting ability each time it was transferred from person to person.

At the same time, Jenner’s vaccine raised other vexing questions. For example, why couldn’t his approach—using a harmless disease to make vaccines against a related dangerous one—be used against all diseases? The answer, as we now know, is that Jenner’s vaccine was a stroke of enormous good fortune. The fact that the smallpox virus happened to have a harmless but closely related cowpox cousin was a quirk of nature, so rare that it is not seen in any other human infection. Indeed, had no other way of making vaccines been found, the story of vaccines would have been a very short one.

Perhaps that’s why vaccine development hit a dead end for the next 80 years. Until finally one scientist—already a key player in the discovery of “germ” theory—made a milestone leap by going on a long vacation...

Milestone #3 A long holiday and a neglected experiment lead to a new concept in vaccines

By the 1870s, Louis Pasteur had already achieved more than his share of medical milestones. Over the previous three decades, he had contributed significantly to the discovery of germ theory through his work in fermentation, pasteurization, saving the silkworm industry, and disposing of the theory of spontaneous generation. But in the late 1870s, Pasteur was poised yet again to make a milestone discovery, this time after receiving a rather inauspicious gift: the head of a chicken.

It was no threat or cruel joke. The chicken had died from chicken cholera—a serious and rampant disease that at the time was killing as many as 90 out of 100 chickens—and the veterinarian who sent the specimen to Pasteur for investigation believed the disease was caused by a specific microbe. Pasteur soon verified this theory: When microbes from the chicken head were grown in a culture and then injected into healthy chickens, the injected chickens soon died of chicken cholera. Though this discovery helped support growing evidence for germ theory, Pasteur’s culture of bacteria soon played an even more profound role—thanks to a combination of neglect and serendipity.

In the summer of 1879, Pasteur went on a long holiday, forgetting about the chicken cholera culture he had created, and leaving it exposed to the air. When he returned from his vacation and injected it into some chickens, he found the culture was not so deadly anymore: When chickens were inoculated with the weakened, or attenuated, bacteria, they got sick but did not die. But Pasteur’s next discovery was even more significant. If the chickens were allowed to recover and then were injected with deadly chicken cholera bacteria, they were now fully resistant to the disease. Pasteur immediately realized that he’d discovered a new way to make vaccines: Inoculation with a weakened form of a microbe somehow enabled the body to fight off its deadly form. Discussing his discovery in an 1881 article in The British Medical Journal, Pasteur wrote, “We touch the principle of vaccination...When the fowls have been rendered sufficiently ill by the attenuated virus... they will, when inoculated with virulent virus, suffer no evil effects... chicken cholera cannot touch them...”

Inspired by his milestone discovery, Pasteur began investigating how this new approach could be used to make vaccines against other diseases. His next success was with anthrax, a disease that was disrupting the agriculture industry by killing from 10% to 20% of sheep. Earlier, Robert Koch had already shown that anthrax was caused by bacteria. Pasteur now began investigating whether anthrax bacteria could be sufficiently weakened to make them harmless, yet still able to stimulate protection in the body if inoculated as a vaccine. He eventually succeeded by growing the bacteria at elevated temperatures. And when some peers doubted his findings, Pasteur soon found an opportunity to prove himself with a dramatic public experiment. On May 5, 1881, Pasteur inoculated 24 sheep with his new attenuated anthrax vaccine. Nearly two weeks later, on May 17, he inoculated them again with a more virulent—but still weakened—vaccine. Finally, on May 31, he injected deadly anthrax bacteria into both the vaccinated sheep and 24 sheep that had not been vaccinated. Two days later, a large crowd of people—including senators, scientists, and reporters—gathered to witness the dramatic results: All of the vaccinated sheep were alive and well, while those that had not been vaccinated were dead or dying.

But perhaps Pasteur’s most famous achievement in this area was his creation of a rabies vaccine, his first vaccine for humans. At the time, rabies was a dreaded and almost invariably fatal disease. Typically caught from the bite of an infected dog, it was met with treatments that ranged from the awful—inserting long heated needles deep into bite wounds—to the horrible—sprinkling gunpowder over the wound and setting it on fire. Although no one knew what caused rabies—the causative virus was too small to see with the microscopes of the day, and it could not be grown in a culture—Pasteur was convinced that the disease was caused by a microbe that attacked the central nervous system. To create his vaccine, Pasteur cultivated the unknown microbe in the brains of rabbits, attenuated it by drying the tissue fragments, and used the fragments in a vaccine.

Although initially reluctant to try the experimental vaccine in humans, on July 6, 1885, Pasteur was compelled to reconsider when nine-year-old Joseph Meister was brought to him with 14 wounds from a rabid dog. Surrendering to the pleas of the boy’s mother, Pasteur administered his new vaccine. The lengthy vaccination—13 inoculations over 10 days—was a success, and the boy survived. And despite some public protests that a deadly agent had been inoculated into a human, within 15 months nearly 1,500 others had received the rabies vaccine.

And so, in just eight years, Pasteur not only achieved the first major advance in vaccination since Jenner’s time—attenuation—but had also created successful vaccines against chicken cholera, anthrax, and rabies. Yet there was one unexpected twist to his milestone work: It wasn’t all about reducing the virulence of a virus. As Pasteur himself later realized, most of the viruses in his rabies vaccine were probably not weakened, but killed. And therein lay the seeds for the next major milestone.

Milestone #4 A new “killed” vaccine for the birds (not to mention cholera, typhoid, and plague)

By the late 1800s, vaccine development was about to benefit from the birth of a new Golden Age, the discovery of bacteria responsible for numerous diseases, including gonorrhea (1879), pneumonia (1880), typhoid (1880-1884), tuberculosis (1882), and diphtheria (1884). During this time, Theobald Smith, a bacteriologist working at the U.S. Department of Agriculture, was assigned to find the microbial culprit responsible for causing hog cholera, a disease that was threatening the livestock industry. While Smith and his supervisor, Daniel Salmon, managed to isolate a causative bacterium, they soon made another far more important discovery: If the microbe was killed by heat and injected into pigeons, the pigeons were then protected against the deadly form of the bacteria. This finding, published in 1886 and soon verified by other researchers, represented a new milestone: Vaccines could be made from killed—not merely weakened—cultures of the causative microbe.

The concept of killed vaccines was a major advance in vaccine safety, particularly for those who opposed the idea of vaccines made from live or attenuated microbes. Other scientists soon began trying to make killed vaccines for other diseases, and within just 15 years, the benefactors extended beyond the world of pigeons to the humans affected by three major diseases: cholera, plague, and typhoid.

In the late 1800s, cholera remained a serious problem throughout the world, despite John Snow’s milestone work in the late 1840s showing that it was transmitted by contaminated water and Robert Koch’s discovery in 1883 that it was caused by a bacterium (Vibrio cholerae). While early attempts to create live and attenuated cholera vaccines showed some success, they were abandoned due, in part, to severe reactions. However, in 1896, Wilhelm Kolle achieved a milestone discovery when he developed the first killed vaccine for cholera by exposing cholera bacteria to heat.

Typhoid was another life-threatening disease caused by bacteria (Salmonella Typhi) and transmitted by contaminated food or water. While it's still unclear who first inoculated a human with a killed typhoid vaccine, in 1896 British bacteriologist Almroth Wright published a paper announcing that a person inoculated with dead Salmonella organisms had been successfully protected against the disease. Wright's killed typhoid vaccine was later used with encouraging results in a field trial of 4,000 volunteers from the Indian Army. Tragically, although Wright's vaccine was later used to vaccinate British troops in the Boer War in South Africa, vaccine opponents prevented many others from being vaccinated, going so far as to dump vaccine shipments overboard from transport ships. The result? The British Army suffered more than 58,000 cases of typhoid and 9,000 deaths.

Plague, a disease that killed millions of people in Europe during the Middle Ages, is usually transmitted by bites from fleas carried by rats. The causative bacterium, Pasteurella pestis (later renamed Yersinia pestis), was discovered in 1894. Two years later, Russian scientist Waldemar Haffkine was working in India on a cholera vaccine when the plague broke out in Bombay. Switching his efforts, he soon created a killed vaccine against the plague and, in 1897, tested its safety by inoculating himself. The gamble paid off, and within a few weeks, 8,000 people were vaccinated.

And so by the twentieth century, just a century after Jenner’s milestone, the vaccine tally now included one “live” vaccine (smallpox), three attenuated vaccines (chicken cholera, anthrax, and rabies), and three killed vaccines (typhoid, cholera, and plague).

Milestone #5 The power of passivity: new vaccines to fight diphtheria and tetanus

In the late 1800s, diphtheria was one of many diseases that took countless human lives, killing as many as 50,000 children a year in Germany alone. Caused by the bacterium Corynebacterium diphtheriae, diphtheria can cause life-threatening swelling of the airways and damage the heart and nervous system. In 1888, scientists discovered that diphtheria bacteria cause their deadly effects by producing a toxin. Two years later, German physiologist Emil von Behring and Japanese physician Shibasaburo Kitasato made a crucial finding: When animals were infected by diphtheria, they produced a powerful substance that could neutralize this toxin—in other words, an antitoxin. This finding was followed by another discovery that led to the next milestone in vaccines: If the antitoxin was removed from the animals and injected into other animals, it not only protected against diphtheria, but could cure the disease if it was already present.

Just one year later, in December 1891, the first child was inoculated with diphtheria antitoxin and, after further refinements, the vaccine went into commercial production in 1894. Although antitoxin vaccines had their limitations, they would soon be developed against other important diseases, including tetanus.

Antitoxin vaccines were a major advance because they represented a new major concept in vaccines: Active versus passive immunity. Active immunity refers to vaccines that stimulate the body to mount its own fight against the microbe, as with the vaccines discussed previously. In contrast, passive immunity involves the transfer of the protective substance from one human or animal to another. Apart from diphtheria and tetanus vaccines, another example of passive immunity is the transfer of antibodies from a mother to her baby during breast feeding. One drawback to passive immunity, however, is that it fades over time, while active immunity is usually permanent.

Von Behring’s work in creating a diphtheria vaccine won him the first Nobel Prize in Physiology or Medicine in 1901. But his milestone would soon lead other researchers to solve a greater mystery that had been lurking since Jenner’s time: Never mind how they were made—live, attenuated, or killed microbes or antitoxins—exactly how did vaccines work?

Milestone #6 An emergence of understanding—and the birth of immunology

Of course, many theories purporting to explain how vaccines might work had been offered over the years. For example, “Exhaustion” theory, held by Pasteur and others, proposed that inoculated microbes consumed “something” in the body until it was depleted and the microbes died off. Another theory, “Noxious Retention,” suggested that inoculated microbes produced substances that inhibited their own development. But both theories shared the false view that the body played no active role in vaccines and was merely a passive bystander as the inoculated microbes caused their own demise. Both theories were eventually abandoned in the face of new evidence and new vaccines, and before long the milestone work of two scientists would not only create a new understanding, but a new scientific field and, in 1908, a shared Nobel Prize.

A flipped perspective leads to the discovery of the immune system

The roots of Elie Metchnikoff’s milestone insight can be traced back to 1883, when the Russian microbiologist performed a landmark experiment in which he observed that certain cells have the ability to migrate through the tissues in response to injury or damage. What’s more, these cells had the ability to surround, engulf, and digest foreign substances, a process Metchnikoff called phagocytosis (from the Greek phago, to devour, and cytos, cell). While it initially appeared that cells used phagocytosis as a way to take up nutrients, Metchnikoff suspected they weren’t simply out for a Sunday brunch. His hunch was supported by a disagreement he had with Robert Koch, who in 1876 described what he thought were anthrax bacilli invading white blood cells. In his milestone insight, Metchnikoff flipped this perspective around: The anthrax bacteria were not invading white blood cells; rather, white blood cells were engulfing and devouring the bacteria. With this insight, Metchnikoff realized that phagocytosis was a weapon of defense, a way to capture and destroy foreign invaders. In short, he had uncovered a cornerstone of the larger mystery of how the body defends itself against disease: the immune system.

By 1887, Metchnikoff had categorized the particle-devouring white blood cells as “macrophages” and, equally important, recognized a key guiding principle by which the immune system operates. In order to function properly, whenever it encounters something in the body, the immune system must “ask” a very basic—but critical—question: Is it “self” or “non-self”? If the answer is “non-self”—whether a smallpox virus, anthrax bacterium, or diphtheria toxin—the immune system may begin its attack.

A new theory helps solve the mystery of immunity and how antibodies are made

Like that of many scientists, Paul Ehrlich’s milestone discovery relied in part on new tools that revealed a world previously unseen. For Ehrlich, a German scientist, these tools were dyes—specific chemicals that could be used to stain cells and tissues and thereby reveal new structures and functions. By 1878, when Ehrlich was just 24 years old, they helped him describe several major cells of the immune system, including various types of white blood cells. By 1885, these and other findings led Ehrlich to begin speculating on a new theory of how cells could take in specific nutrients: He proposed that various “side-chains” on the outside of cells—what we now call receptors—could attach to specific substances and bring them inside the cell.

As Ehrlich developed a greater interest in immunology, he began wondering if his receptor theory could explain how diphtheria and tetanus vaccines work. Previously, as we saw, Behring and Kitasato had discovered that when an animal was infected by diphtheria bacteria, it produced an antitoxin and that this antitoxin could be removed and used as a vaccine to protect others against diphtheria. As it turns out, these “antitoxins” were actually antibodies—specific proteins made by cells to target and neutralize the diphtheria toxin. As Ehrlich performed other pioneering work with antibodies, he pondered how his receptor theory might explain how antibodies worked. And he soon arrived at his milestone insight.

While Ehrlich’s initial side-chain theory suggested that cells had a large variety of receptors on their outsides, each designed to attach to a specific nutrient, he later expanded this theory and proposed that harmful substances—such as bacteria or viruses—could mimic nutrients and also attach to specific receptors. And what happened next, Ehrlich proposed, explained how cells produced antibodies against the foreign invader. With the harmful substance attached to its receptor, the cell could then identify key features on the harmful substance and begin producing huge numbers of new receptors identical to the one that attached to the invader. It was these receptors that then detached from the cell and became antibodies—the highly specific proteins that could seek out, attach to, and inactivate other harmful substances.

Thus, Ehrlich’s theory finally explained how specific foreign invaders, once in the body, could be recognized by cells and trigger them to produce specific antibodies that would seek out and attack the invader. The beauty of this theory was that it explained how the body could produce antibodies against specific diseases, whether the antibodies were made as a response to a prior illness, variolation, or vaccination.

Of course, Ehrlich didn’t get everything right. For one thing, it turned out that not all cells had the ability to attach to foreign invaders and produce antibodies. That critical task was actually accomplished by a specific type of white blood cell—B lymphocytes. What’s more, decades of additional research would be required to explain the many complicated roles played by B cells, and many other cells and substances of the immune system.

Nevertheless, today the milestone discoveries of Metchnikoff and Ehrlich are recognized as two complementary cornerstones of modern immunology and of the long-sought explanation of how vaccines work.

Vaccines of the twentieth and twenty-first centuries: the golden age and beyond

By the end of the nineteenth century, vaccines had truly arrived as a major medical breakthrough. Not only had human vaccines been produced for smallpox, rabies, typhoid, cholera, plague, and diphtheria, but most of the fundamental concepts of vaccinology had been introduced. In fact, the subsequent gains made in vaccines throughout the twentieth century can be viewed as refinements to the basic concepts that were known at the end of the nineteenth century.

Nevertheless, vaccine development made major advances in the early twentieth century, with new vaccines for pertussis (1926), tuberculosis (1927), yellow fever (1935), influenza A (1936), and typhus (1938), along with improved vaccines for diphtheria (1923) and tetanus (1927). In addition, in 1931, American pathologist Ernest William Goodpasture introduced a new technique for growing viruses using fertile hen’s eggs, resulting in a cheaper and safer way to produce vaccines.

The advances continued after World War II, with the so-called Golden Age of vaccine development. In 1949, John Enders and his associates at Boston Children’s Hospital developed a technique for growing viruses in human cells outside a living host; their first efforts not only led to the polio vaccine, but to an explosion of vaccine research and advances that continue to this day. Apart from oral and injected polio vaccines, vaccines developed since World War II include those for measles, mumps, rubella, rotavirus, Japanese and tick-borne encephalitis, Lyme disease, hepatitis A and B, meningitis, pneumonia, and influenza, as well as improved vaccines for typhoid, rabies, cholera, anthrax, and smallpox.

While the list of recent vaccines is dizzying, a brief look at how they are classified gives a fascinating insight into how vaccines are made today—an astonishing contrast to scratching one’s arm with pus from an infected cow.

In the broadest sense, vaccines fall into just two categories: live and inactivated. As we have seen, live, or attenuated, vaccines are made by modifying the disease-producing microbe so it is no longer harmful but still able to stimulate immunity. This category includes both viral and bacterial vaccines, though most live vaccines in the United States today contain attenuated viruses. Today’s attenuated viral vaccines include those for measles, mumps, rubella, zoster, rotavirus, and varicella (chickenpox).

Inactivated vaccines include the killed vaccines discussed earlier, as well as several subcategories that truly hint at the complexity and marvel of today’s vaccines. The two major types of inactivated vaccines are whole vaccines and fractional vaccines. Whole vaccines are made from entire killed bacteria or viruses and include: 1) viral vaccines against hepatitis A, rabies, and influenza and 2) bacterial vaccines against pertussis, typhoid, cholera, and plague. Fractional vaccines are where things get interesting. They include three major types: 1) subunit vaccines (made from parts of the disease-causing microbe, such as current vaccines against hepatitis B, influenza, human papillomavirus, and anthrax); 2) toxoid vaccines (modified anti-toxin vaccines, such as improved vaccines against diphtheria and tetanus); and 3) polysaccharide vaccines (made from sugar molecule chains on the surface of certain bacteria, with examples including vaccines against pneumonia and meningitis).

Finally, the new category of recombinant vaccines refers to vaccines made with genetic engineering technology. With genetic engineering, scientists can identify the specific gene in a bacterium or virus that produces a protein that triggers an immune response. The culprit gene is then inserted into yeast cells, which are coaxed to produce large amounts of that protein. The protein is then used to make a vaccine. When the vaccine is administered, it provokes an immune response—that is, causes the body to make antibodies against the protein. Thus, the same antibodies that are produced against the genetically engineered protein will also act against the bacterium or virus whose gene originally produced that protein. Genetically engineered vaccines available in the United States include the hepatitis B and human papillomavirus (HPV) vaccines.

The view from today: concerns, transformations, and hope

Today, many health experts consider the discovery of vaccines to be the greatest breakthrough in the history of medicine. They point out, for example, that vaccines have prevented more deaths, permanent disability, and suffering than any other medical discovery or intervention. In fact, some note that with the exception of safe water, no other factor—not even antibiotics—has equaled vaccines in saving human lives.

But apart from the lives saved, vaccines have transformed our lives and view of the world in several profound ways. First, vaccine advances in the 1800s contributed significantly to the discovery and acceptance of germ theory—the paradigm-shattering realization that diseases are often caused by invisibly small bacteria and viruses, not evil spirits or religious forces. Second, vaccines opened our eyes to a new world inside our bodies, the immune system, and thus the first true insights into how the body fights disease. Third, vaccines showed us that medicine doesn’t always have to involve the blunt force of drugs or surgery. Rather, vaccines teach the body to protect itself—by inoculating a person with a harmless form of the very disease being prevented. And finally, vaccines put a new twist on the issue of personal responsibility: with contagious disease, one’s decision about whether or not to be vaccinated extends beyond individual health concerns to the health of the entire community.

This last point—whether or not one chooses to be vaccinated—is important, and it is emotionally supercharged by those who resist being “treated” for a disease they don’t actually have, fearing that the treatment itself could cause the disease. While some concerns about safety are justified, anti-vaccination movements—which have been more or less ongoing since the eighteenth century—can create their own dangers. By arousing fears based on scientifically unfounded claims, such movements often cause people to avoid safe vaccinations and thus increase the risk of epidemic disease.

One recent example is the concern that thimerosal, a mercury-containing preservative used in some vaccines, might cause autism. In 1999, despite a lack of evidence that thimerosal is harmful, the FDA asked pharmaceutical companies to remove the preservative from vaccines. Although many studies have subsequently found no evidence that thimerosal causes neurodevelopmental problems or autism in children, publicity from the removal and the spread of false information by anti-vaccination groups raised sufficient fears among many parents that they stopped vaccinating their children. A 2007 article in the New England Journal of Medicine pointed out the dangers of such scenarios with regard to influenza, which each year causes hundreds of thousands of hospitalizations and about 100 deaths of children. Yet, “All of the negative media attention has made many parents reluctant to have their children receive this vaccine...” And thus, the author argues, “By choosing not to vaccinate their children, these parents have elevated a theoretical (and now disproved) risk above the real risk of being hospitalized or killed by influenza.”

While the risk of adverse effects is always a legitimate concern, experts note that most well-conducted scientific studies have not supported the idea that vaccines cause rare serious adverse events. In addition, a large body of scientific evidence has discounted purported associations between vaccines and such diseases as multiple sclerosis and autism, the latter once blamed on the measles, mumps, and rubella (MMR) vaccine. As health experts often point out, avoiding vaccination can pose real dangers to the larger population because of the so-called “herd effect,” which refers to the fact that the more people that are vaccinated, the better the overall population is protected. Conversely, those who refuse vaccination create gaps in the community defense, offering microbes a free ride to continue their contagious spread.
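The "herd effect" can even be put into rough numbers. If each sick person infects some average number of others (epidemiologists call this R0; recall that a smallpox victim could infect five or six others), an outbreak fizzles out once the fraction of immune people exceeds 1 − 1/R0. Here is a back-of-the-envelope sketch; the calculation is a standard simplification, not a claim from this chapter:

```python
# Back-of-the-envelope herd-immunity sketch (a standard simplification).
# If each case infects R0 others on average, an outbreak dies out once
# the immune fraction of the population exceeds 1 - 1/R0.

def herd_immunity_threshold(r0: float) -> float:
    """Fraction of the population that must be immune to halt spread."""
    if r0 <= 1:
        return 0.0  # the disease already fails to sustain itself
    return 1.0 - 1.0 / r0

# Smallpox: each sick person could infect five or six others.
for label, r0 in [("smallpox, R0 of 5", 5.0), ("smallpox, R0 of 6", 6.0)]:
    print(f"{label}: {herd_immunity_threshold(r0):.0%} immune needed")
```

With R0 between 5 and 6, roughly 80 percent or more of a population must be immune before smallpox transmission collapses—which is why every unvaccinated person leaves a real gap in the community's defense.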

Apart from safety concerns, real or imagined, vaccines continue to offer an exciting potential for new and better advances in the future. Currently, more than two dozen infections can be prevented by vaccines, and new technologies and strategies, such as those involving the manipulation of genes and proteins, are likely to result in vaccines for many others. Nevertheless, the scientific challenges are many and daunting, as seen with the ongoing quest to find vaccines for malaria and AIDS.

Africa: 16,000 years ago...and today

On October 26, 1977, a hospital cook in Merka, Somalia, became a hero of mixed blessing when he became the last known person on the planet to be infected by smallpox—16,000 years after the virus first made its pathogenic leap in Africa from animal to man. Thus, when the World Health Assembly officially declared the global eradication of smallpox in 1980, it was yet one more remarkable milestone: Smallpox had become the first and only human disease to be eliminated from the planet.

Which makes the announcement that came 30 years later all the more curious. In 2007, the FDA approved a new vaccine against... You guessed it, smallpox.

Why a new vaccine against an eradicated disease? The cynical answer is that there will always remain one deadly threat that humanity cannot fully eradicate: itself. Because stockpiled smallpox virus continues to exist for research purposes, new and better vaccines will always be needed to protect us against those who would steal the virus and use it as a weapon against their own species.

And so the battle continues. Viruses will slip into the body on the aerosolized blast of a sneeze and wage their deadly attacks. White blood cells will wage their potent counterattacks with their freshly minted antibodies. And humans will fight each other with whatever sinister weapon they can find. But in the end, at least on one battlefield, vaccines will continue to provide effective ways to help the body mount its valiant—and often victorious—defense.