Combat-Ready Kitchen: How the U.S. Military Shapes the Way You Eat (2015)

Chapter 5

DISRUPTIVE INNOVATION: THE TIN CAN

For thousands of years, armies had been content to rely on folk knowledge to prepare the foods they took into battle. Beginning in the late eighteenth and continuing through the mid-nineteenth century, the Industrial Revolution, which introduced machines and expanded markets, turned that relationship on its head. In a role reversal that continues to this day, the military, relying on its handmaidens science and technology, began to muck about in the kitchen, inventing and helping to promulgate new ways to preserve, store, and transport food.

IF YOU THINK WE LIVE IN A TIME OF TURMOIL, you should have been around three centuries ago. Punch-drunk on ideas like freedom and equality, radicals and rabble-rousers overthrew monarchs right and left and hounded the common man into participating in representative democracy. Emboldened by a move away from theism, scientists dared to declare that the universe did not, in fact, revolve around us and, furthermore, was controlled by preposterous ideas such as gravity and the laws of motion. Pushed off the land by enclosures, which put shared grazing grounds into private ownership, and driven into cities by the lure of earning a pittance for piecework, the European population became increasingly urban. But amid all this transformative hubbub, warcraft—and rations—stagnated.

Peek in the rucksack of the French or American Revolutionary soldier and what would you find? Almost exactly what you’d have seen in that of the Roman legionnaire almost two thousand years earlier: flour, legumes, hardtack, and a chunk of bacon. This made perfect sense given that no new food-preservation method had been invented since man, at wit’s end with the excess corn, cabbages, and critters from his farming success, embraced drying, salting, smoking, and fermenting.

Sort of a why-fix-it-if-it-ain’t-broke approach.

And then it did break—during the French Revolutionary Wars, when the citoyens, with the zealotry of all new converts, decided to “persuade” the rest of Europe (modern-day Britain, Austria, Belgium, Germany, Italy, Luxembourg, the Netherlands, Portugal, Russia, Spain, Switzerland, plus Egypt thrown in for good measure) as to the validity of their views. A ginormous army of 1,500,000 men aged eighteen through twenty-five was raised—the first universal conscription in modern history—and all that testosterone needed fuel. General Napoleon Bonaparte, a baby-faced dictator-in-training, implemented a two-pronged feeding strategy while on campaign: a backbone of staples, schlepped in by boat or beast, and, for everything else, a standing order to go wild in the towns and countryside (fine in summer and fall, not so much in winter and spring). This made logistics speedier and more flexible than the traditional supply chain, but had a few—rather gaping—holes. One was that the troops might be MIA when a vital battle rolled around because they were off plundering the countryside. Another was that they might end up weak and ravenous because there was nary a foie gras nor a pain de campagne to be found.

In response to these field-feeding foul-ups, the logistics nightmare of arranging bœuf bourguignon for 1.5 million, and the yawping ghost of starvation left over from the Revolution, the French government issued a challenge in 1795, backed by the equivalent of one year’s salary. Were there any would-be food technologists around who could come up with a “composition of a work on the art of preserving, by the best possible means, every kind of alimentary substance”? This foresight has ever since been attributed to Napoleon, who, like all megalomaniacs, had a knack for suctioning up every achievement in the vicinity, even those that weren’t his. The fact is that in the early 1790s, the Little Corporal would have been far too preoccupied engineering his meteoric rise through the French military, sweeping the Paris streets of royalist riffraff, and courting the notorious French cougar and heartless harlot Joséphine to have spent time with a bunch of midlevel bureaucrats at the Department of Agriculture talking about food-preservation techniques. Of course, the story might have been different had he known its importance would far eclipse that of his other accomplishments, including the concept of a unified Europe and the Napoleonic Code (excellent if you happen to be on good terms with the judge, less so if your only hope was appealing to the nonexistent jury).

The government’s call for help was answered by a bad-boy celebrity chef turned candy man. By the time he was in his twenties, Nicolas Appert, an innkeeper’s son with no formal education, had clawed his way up the kitchen ranks to cook for high society—dukes, duchesses, princes, and princesses. In his thirties, already jaded from the glamorous life, he opened a confectionery shop, an experience that would serve him well—better perhaps than training in physics, chemistry, and biology—in his quest for applied-science glory. He observed the magical effect of low, steady heat on taste, texture, and—when the glass container was tightly sealed immediately afterward—spoilage. The bain-marie, a metal container set into another metal container full of boiling water, is indispensable equipment for producing dairy-based sauces, creams, and custards; melting chocolate; and making flavored syrups. It maintains a constant temperature because boiling water, as it’s at the exact point where liquid turns to vapor, can only boil faster, but not hotter. Appert would also have noted that fruit prepared in syrup, jams and jellies, or alcohol and stored in stoppered glass containers could last a long time without going bad. (These delicacies also relied on sugar as a preservative and an antimicrobial.) For more than ten years, the entrepreneur conducted painstaking experiments with vegetables, meats, dairy, soups, and stews in sealed bottles cooked in a water bath.

Of course, like any smart businessperson, Appert had his eye not only on the onetime prize but also on the possibility of a lucrative ongoing contract as a supplier to the French military. In 1803 he made his move, delivering his most delectable goods—soup, boiled beef in gravy, and peas and beans—for field-testing to the navy. Three months passed before he heard back; they’d passed with flying colors. Then, silence. A couple of years later, he delivered more samples. Again, they passed with flying colors, and again nothing came of it. (As anyone with government-contracting experience can attest, the modern-day process hasn’t sped up much.) Six years went by; Appert expanded his “spring, summer and fall in a bottle” business—so-called because people could eat “fresh” produce all year round—to a bustling forty employees; he delivered yet another round of samples. Finally, in 1809 he received a formal response: everything was satisfactory, except for the beef broth, which was deemed “weak.” He was invited in for the big presentation, and a month later Napoleon’s minister of the interior awarded him the 12,000 francs—on the condition that he relinquish ownership. Appert, poor sap, did.

The invention did have one flaw: the bottle. Glass was fine and dandy for middle-class families seeking to add out-of-season produce to the table, but impractical for naval or land expeditions (unless for mission-critical supplies such as spirits and wine). Luckily, at about the same time across the Channel, Peter Durand was inventing—and smartly patenting—the tin can. The first can factory, which opened in 1813, was not a high-volume operation; at full tilt, a skilled artisan crafted six to ten containers a day that were then filled with cooked food and simmered for up to six hours apiece. At this low output and (presumably) high price, the only early adopters were the British army and the Royal Navy, which ordered gargantuan cans of stew and soup for long journeys or expeditions. By the American Civil War, however, the ability to manufacture small cans cheaply and efficiently and the resulting reduction in cooking time turned the product from luxury good or emergency ration to everyday soldier fare, although the only tinned food bought in any quantity by the Union army was condensed milk for officers’ messes and army hospitals. (Enlisted men had to buy their own from sutlers, the peddlers who trailed behind the army with sundries and snacks.)

Today, the inventor of the fifth major food-preservation method is immortalized by the technique that bears his name: appertization. That’s right, appertization. What do you mean you’ve never heard of it? It’s the process of heating a liquid or solid food to a steady, high temperature to kill harmful microorganisms. But that’s pasteurization, you say? The sad reality is that fifty years after Appert’s discovery, along came Louis Pasteur, draped in degrees, who claimed all the credit. To be fair, Pasteur actually understood the underlying scientific principles—that many illnesses are caused by microorganisms and that these can be eliminated or at least drastically reduced by heat treatment—while Appert simply figured out that the technique worked. But still. You’d think that the man who’d dedicated his life to discovering the single idea that has most changed the face of food in the last two hundred years would have earned more than the occasional impassioned paean from food nerds like me.

A CENTURY AFTER APPERT DISCOVERED CANNING, its industrial use had grown by leaps and bounds, especially in Australia and the United Kingdom, where there were strong markets for condensed milk and tinned beef. In the United States, Gail Borden’s 1856 invention of canned condensed milk had caught on quickly, but the Chicago meatpackers, whose refrigerated carcass business was booming along with the railroads, saw no reason to diversify. It would take a mammoth order for canned beef from the U.S. Army, a very public crisis, and an even more public exoneration to pave the way for public acceptance of this newest way to preserve animal protein.

Between latitudes 23.4378° N and 23.4378° S, in the geographic region known as the tropics, the weather is very pleasant for three months a year, conveniently coinciding with northern postholiday ennui and the frigid temperatures that wither the desire for outdoor adventures, and steamier than the sixth circle of hell the rest of the time. It was here that the United States danced her 1898 debutante cotillion—partner: Spain—as an international military badass, in the process relieving him of several torrid little islands (first Cuba, then Puerto Rico, the Philippines, and Guam). It should have been a cinch, except for one thing: when planning the very first invasion of the Spanish-American War, the army had forgotten to take into account that Cuba is very, very hot.

The problem was the meat ration, which, for the first time during a war, was supplied chilled or in cans rather than on the hoof. Refrigerated beef carcasses had been sold commercially since the 1870s, so it was natural that the armed forces would adopt the simpler, cheaper way to prepare and ship red meat. But the inclusion of canned animal protein was a novelty. The three big Chicago meatpackers—Armour & Company, Swift & Company, and Morris & Company—that had dominated the industry since the Civil War hadn’t taken up “beef canning … on a large scale”1 until 1879, not coincidentally just after the army ordered that heat-sterilized meat be part of the travel ration. (At the same time, the navy unceremoniously incorporated it into sailor fare, to the tune of some five hundred thousand pounds a year.) Thus, when the next full-fledged military conflict, the Spanish-American War, erupted, there was a radical changeover—from traditional fresh or salt-preserved to the first modern processed food.

By early 1898 the meatpackers were on a bovine bender, buying up cattle for rations to send to Puerto Rico and Cuba. In addition to the thousands of tons of “refrigerated beef,” which were transported by special railway cars and ships but sometimes experienced what are known in the trade as “temperature abuses,” short or long periods of time out of the cold, they cooked up hundreds of thousands of pounds of canned beef and delivered wagonloads to Tampa, Florida. Where they sat on the sunny dock. And then were shipped in sweltering holds. And then were left on the blazing beach for days on end. When the soldiers finally received these rations, they were shockingly ungrateful. Some reactions:

“It made myself and my comrades gag.”

“The sight of it turned men’s stomachs.”

“Our company dog would not eat it.”

“The smell of this tinned meat was so vile it could not be opened without opening the car window.”

“When partaken of at evening mess it caused violent cramps in the stomach during the night.”2

“At the best it was tasteless; at the worst it was nauseating…. It at once became putrid and smelt so that we had to dispose of it for fear of its creating disease. I think we threw it overboard.”3 (This last comment was made by future president Theodore Roosevelt.)

When American troops sit at a chow hall table or open a can of government-issued rations after a long day of bayoneting, Gatling-gunning, and defilading on behalf of their fellow countrymen, they expect the kind of solid nourishment befitting a world-class power, not something that elicits the gag reflex. Although acknowledged belatedly—a complaint was registered after the war ended by Major General Nelson Miles in a tell-all to the Kansas City Star—the outrage at the camp in Santiago de Cuba sent shock waves through the military command hierarchy and ultimately reverberated through the halls of Congress. Matters were made much worse by the fact that many soldiers had already been weakened by typhoid fever in the stateside base camps and besieged by yellow fever once they’d landed. Some 2,485 enlisted men died of illness during the Spanish-American War—more than six times the 385 men who died in combat.

But that was only the first act of the Cuban gristle crisis. In late 1898, a federal investigation was initiated by President William McKinley; the Dodge Commission met 109 times, heard testimony from 495 witnesses, and issued an eight-volume report running thousands of pages. Laboratory testing was done, but even the ornery United States Department of Agriculture (USDA) chemist Harvey Wiley, a fervid antipreservatives proselytizer, could find no traces of the alleged boric acid, sulfites, salicylic acid, or benzoic acid (then common meat preservatives) and reluctantly affirmed that the canned meat was unadulterated. Careers were ruined: General Miles, the commanding officer who’d lobbed the first verbal grenade—“embalmed beef!”—was rebuked for not addressing his concerns through proper channels. General Charles P. Eagan, head of the Commissary Department, was court-martialed and convicted for his intemperate reply: “He lies in his throat, he lies in his heart, he lies in every hair of his head and every pore of his body, he lies willfully, deliberately, intentionally and maliciously.”4 And other careers were launched: Teddy Roosevelt, whose Cuban exploits made and broke the mold for manly men, rode the brouhaha all the way to the White House, first as vice president and then as president after McKinley was assassinated. In fact, the episode’s repercussions are felt to this day: in 1906 Roosevelt pushed through Congress the first Pure Food and Drug Act; government, academia, and the food industry began to conduct real research on how to kill the microorganisms that live in food; and the army completely overhauled its systems for procurement and feeding of troops in the field. The chastised Lieutenant General Miles, on the other hand, eventually retreated to the sidelines, his name forever sullied by his “ruthless, unjust attack on the Commissary General of the Army,” as it’s described in his Arlington National Cemetery biographical sketch.

But was he, in fact, wrong?

BY ALL ACCOUNTS, Lieutenant General Miles was not a nice man—vain, overweening, and prone to shooting his mouth off. His long military career was as often marked by internal skirmishes as it was by external ones. However, in the case of the stinky steaks, he may well have been speaking the truth—an unpleasant one that neither the military nor the meatpackers wanted to hear, and especially did not want bandied about in the press, as it would have disrupted the move to cheaper and more efficient ways to serve meat to soldiers and, ultimately, civilian consumers. The hearings were cast as a meticulous investigation into the incident followed by an impartial assignment of responsibility; in fact, both the process and the findings supported the interests of the military and industry. The commission overemphasized some of the accusations and sidestepped others. It made authoritative declarations of findings when none were warranted.

Let’s review.

First, what exactly were Miles’s charges against the U.S. Army Commissary Department and the federal government? Dr. W. H. Daly, a surgeon on his staff (Daly committed suicide a couple of years after the scandal), concerned for his patients, inspected—but didn’t test—several shipments of refrigerated beef, and found it to have “an odor similar to that of a dead human body after being injected with preservatives, and it tasted when first cooked like decomposed boric acid.”5 Based on this, the good doctor believed that the beef had been treated with chemicals, a charge championed by Miles, who declared it to be a “secret process of preserving beef.”6 (It’s notable that in the report Dr. Daly remarked on the taste of boric acid, an impression the army triumphantly dismissed because boric acid has no taste or odor.) The army duly tested other meat shipped to Puerto Rico and found it to be free of adulteration. On the other hand, only passing attention was paid to proving that spoilage had not occurred en route, although it was stated that “the testimony, with some exceptions, showed that the refrigerated beef issued was pure, sound, and wholesome.”7

With regard to the canned beef, the army made the same argument as to its integrity—a fact that Miles had not disputed—while avoiding a prolonged discussion of its deterioration, about which it conceded: “There is no doubt that when issued to soldiers in Cuba and Porto [sic] Rico, where it was exposed to the heat, and where they did not have the proper means of treating the cans, as directed on the labels, and could not properly cook it, the meat was unpalatable, especially to those suffering from malaria, or convalescent…. In a tropical climate, carried on the march, exposed to heat, the meat so changes in appearance as to become repulsive.”8 It was this point that General Miles had made, about both the refrigerated and the canned beef, and which, with its righteous indignation about the accusation of chemical preservation, the War Department did not refute but buried under the voluminous evidence it presented to prove that no tampering, intentional or not, had transpired.

Little did any of these actors realize that four hundred miles north in Cambridge, Massachusetts, key evidence that could have been used by the defense had already been discovered. In 1895 a lowly assistant to a Massachusetts Institute of Technology (MIT) professor, a young man named Samuel Prescott—who would later become a dean and founding president of the Institute of Food Technologists (IFT)—was tossed the thoroughly unprestigious assignment of helping the local William Underwood Company figure out why their canned clams kept exploding. Prescott and Underwood’s work on the bivalves, as well as other canned seafood and vegetables, was published in 1896 and yielded the first studies of an essential food-processing concept, thermal death times—the times and temperatures at which bacteria and their spores are killed. This finding, however, would not yet have reached the ears of the army Commissary Department or the Chicago meatpackers who, like the rest of the world, had only just accepted the theory that sickness was caused by germs, tiny organisms (in fact, thermal death times would not become a foundational precept of the canning industry for another quarter century). Since then, our understanding of and ability to control the microorganisms that spoil and poison food, not to mention the forces that cause its physical and chemical deterioration, have exploded. Looking back at the “embalmed beef” scandal with more than a century’s worth of microbiology, bacteriology, and food chemistry at our fingertips, we can determine with a lot more precision what exactly happened. But first, let’s take a brief foray into the ways food can go bad.
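(For the curious: the canning industry eventually formalized Prescott and Underwood’s thermal-death-time idea as first-order kill kinetics, usually expressed with “D-values,” the heating time at a fixed temperature that cuts a microbial population tenfold. Here is a minimal sketch in that modern notation, which postdates their work; the function name and numbers are purely illustrative.)

```python
def survivors(n0, minutes, d_value):
    """Population remaining after heating at a fixed temperature,
    assuming first-order kill: each D-value's worth of heating time
    reduces the count tenfold. (D-values are the canning industry's
    later formalization of thermal death times.)"""
    return n0 * 10 ** (-minutes / d_value)

# Illustrative numbers only: if spores have a D-value of 4 minutes
# at some retort temperature, 12 minutes of heating cuts a million
# of them down by a factor of a thousand.
print(survivors(1_000_000, 12, 4))
```

Commercial canning of low-acid foods today aims for a “12D” process against Clostridium botulinum spores, that is, heating long enough for twelve successive tenfold reductions.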

THE ESSENTIAL FACT OF OUR EDIBLES is that they are dead—lifeless plant (although fresh fruits and vegetables do continue to respire) and animal tissue already half embarked on the inevitable journey to decay. (Thus the disconcerting similarities between the laying in of food stocks and the mortuary sciences: pretty much every traditional way of storing sustenance—drying, salting, sugaring, smoking, and burying—has been practiced on the human corpse.) Some of the changes are the result of simple physics and chemistry, driven by randomness or gradients. Others are wrought by our invisible cohabiters, the fungi and bacteria that populate our insides and outsides, our homes and gardens, soil, water, and air. Food preservation’s most important task is to defeat these—anthropocentrically speaking—dark forces, which, for the first time, can invade plant and animal tissue and items made from them without having to contend with the formidable defenses of living cells. It must slow or halt the spoilers, which are all over anything expired and which can make a mess of things in anywhere from a few hours for fresh shellfish to years for dried grains and nuts. And it must destroy the poisoners, for which we are the ideal—or at least one—habitat and whose residency can cause anything from a grumbly tummy to an agonizing death.

Some of food’s problems are simple senescence, sometimes set in motion by electromagnetic waves emanating from the sun. Visible light, which occupies a relatively narrow range of wavelengths, is the sun’s most abundant kind of radiation, and drowns the earth. Conveniently, the energy level of its photons is just right for powering terrestrial chemistry; certain wavelengths of visible or lower-energy UV light make an electron jump to outer orbitals in the atoms of many of the ninety—or hundred—or so naturally occurring elements (scientists disagree on their exact number), making them more reactive. Thus, in food, sun or other light can cause some molecules to decompose, others to glom on to new ones, some to rearrange themselves, and still others to become charged.

An example of one of these changes is lipid oxidation, a complex set of chemical reactions in which two or more molecules do an electron shuffle, in the process creating a new set of molecules. (An oxidant is a molecule that gains electrons from another molecule; a reductant donates them.) When this happens to the fat in stored food, it goes through a chain reaction, producing free radicals (molecules with one or more looking-for-trouble unpaired electrons) that combine with neighboring molecules to cause off flavors, color changes, and nutritional losses. Anything with fat can be susceptible: vegetable oil, butter, meat, dairy, nuts, whole grains, and, of course, fried and snack foods.

Temperature has a similar effect on edibles, although its impact is more generalized and occurs much more often than that of light. Heat from the air enters the food through convection, in which the agitated warmer molecules adjacent to the food pass it some of their energy, in the process becoming cooler and heavier. These cooled molecules are then replaced with warmer ones, which do the same thing, until the temperature of the food and the air are at equilibrium. The food itself heats up through conduction, in which the agitated molecules pass some of their energy to adjacent ones, but without changing position. The molecules in the now-hotter food have more energy, setting off more chemical and enzymatic reactions; these roughly double for every 18°F (10°C) increase in the room-temperature range.
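(That doubling rule of thumb, a form of what chemists call the Q10 approximation, compounds exponentially. A minimal sketch; the function name and the example temperatures are illustrative, not from the book.)

```python
def rate_multiplier(delta_t_f, doubling_interval_f=18.0):
    """Relative speed-up of chemical and enzymatic reactions for a
    temperature rise of delta_t_f degrees Fahrenheit, assuming the
    rule of thumb that rates double every 18 F near room temperature."""
    return 2.0 ** (delta_t_f / doubling_interval_f)

# A 95 F dockside warehouse versus a 59 F cellar: 36 F warmer, so
# deterioration runs about four times as fast.
print(rate_multiplier(36.0))  # 4.0
```

Which is why a week on a sunny Tampa dock does what a month in cool storage would.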

One of the most important of these is the Maillard reaction, or nonenzymatic browning (as opposed to the enzyme-generated discoloration that happens naturally in fruits and vegetables), which occurs when part of an amino acid that has electrons to donate and part of a sugar that can accept electrons combine and then rearrange themselves to form a number of new, highly reactive compounds. These compounds produce a variety of different flavors, colors, and aromas, many of them desirable, for example, the savory taste of cooked meat and the golden crust of bread, and others not, such as darkened packaged pastries, musty boxes of milk, brown dehydrated fruits, and warmed-over-tasting canned meat.

Unlike most of us, food can legitimately blame the majority of its problems on someone else. The minute an organism dies, things go downhill fast. The initial set of changes is internal, and has to do with the continuing activity of enzymes, the uncontrolled release of intracellular fluid, and the halting or breakdown of cellular structures. All others have to do with the invasion of saprophytes, aka death eaters, which are eager for their turn using matter and energy in the food chain, and microparasites caught in between hosts. In edibles, there are two main categories of microbes, fungi (yeasts and molds) and bacteria, which further break down into two classes, spoilage organisms and pathogens.

Although humans can get ill from fungi, it is rarely from ingesting them—at least in the developed world. More often, it’s the airborne spores that cause an allergic reaction or out-of-control inflammatory responses (think sick buildings coated with black mold), or that gruesome condition known as mycosis, in which the little buggers sprout on the skin, in the airways, and even in the eyeballs of a Homo sapiens host—almost always one who is immunocompromised. In the less fortunate nations (that is, most of them), there’s a laundry list of fungi that cause disease through ingestion, almost always by growing on and producing toxins in staple cereal crops (wheat, rice, corn) in the field or during storage. In the United States, however, for the most part, the only mycotoxin of any great concern is aflatoxin, which is found in peanuts and corn (its levels are regulated by the federal government). Aflatoxin is an extremely potent carcinogen; Iraq even weaponized it in the 1990s, possibly for use against the Kurds, though there’s no evidence it was ever utilized.

As a rule of thumb, fungi rush in where bacteria fear to tread: low moisture—or rather a related concept called water activity—low temperatures, high acidity, and salty and sugary environments. They come in two varieties: yeasts, which are small, one-celled organisms that can quickly form large colonies, and molds, which are large multicellular strands that are technically a single organism. Some species of both have been our handmaidens in the biotechnological processing of food for thousands of years. The jolly yeasts lighten beer, wine, and bread, consuming sugars and producing ethanol and carbon dioxide. The molds are a more saturnine lot. When they’re working for us, they add piquant and earthy tastes to cheese—for example, the reverse eponymous Penicillium roqueforti and Penicillium camemberti—and umami to soy products. But when they’re not, they turn everything they touch ghoulish: colored green, blue, gray, and white and coated with a revolting fuzz. Most spoilage from fungi is in perishable foods, but their spores can be heat-resistant, so if conditions are right, they may occasionally germinate in preserved foods.

You’d think that olfactory red flags—eau de diaper, wet socks, dirty dishrag, and skunk, along with “unspecified bad odors”—would be a clear sign of food that can make you sick. But, in fact, most spoilage bacteria, the beasties that produce the stink, are innocuous. Because they evolved to consume dead plant or animal matter, they tolerate low temperatures (your refrigerator’s 35°F-38°F is fine) but languish when the mercury really spikes. In fact, if they could, they’d give us a wide berth. To storm Citadel You, they’d need to survive a dousing in stomach acid (pH 1-2, also good for tanning leather and sterilizing pools); elude the death squads of intestinal immune cells; and withstand an enervating 98.6°F, none of which they’re equipped to do. The stench is from their digestive process, which turns amino acids into amines, including the evocatively named cadaverine, putrescine, and spermidine. (As repulsive as they are, only one, histamine, has been linked to serious negative health effects for people who have allergies to it or who eat certain kinds of improperly stored fish.) In your refrigerator ecosystem, the dominant type is probably Pseudomonas, species of which are responsible for decorating meats with green slime, spoiling milk, and turning leftover moo goo gai pan fetid. Another class of grocery gremlins belongs to the lactic acid bacteria (LAB) family, which defiles meat, milk, and bakery products with acerbity, ooze, and stench. Other LAB branches produce fermented products such as cheese, salami, and sauerkraut—very niftily, when a healthy batch of LABs dominates the food surface, it protects against the growth of colonizers.

The term pathogen, disease-causing, which came into use only in the mid-1800s, is rather egocentric, as if the whole purpose of some bacteria was to ruin your day. The truth is most are just going about their business, which, as is the case for most living things, revolves tiresomely around consumption. That a colony is homesteading on that pile of cafeteria mashed potatoes or that coffee-shop banana cream pie, and happens, when you scarf down your portion, to be given an all-expenses-paid trip to your small intestine, may simply be a case of wrong place, wrong time—even for them. In fact, for some bacteria, we’re bad news from start to finish: their encounter with our considerable defenses (see above) leaves the place littered with small, dead bodies; with luck, a few hapless spores or a modest reservoir of poison escapes. Others, the noninvasive pathogens, are able to duck the gastrointestinal bodyguards, but would be just as happy to inhabit your garden or your compost heap as your food or your stomach (one of the hallmarks of bacteria is flexibility—they have multiple ways to metabolize nutrients and generate energy). And then there are a few, the invasive pathogens, who truly are mad for us.

Noninvasive pathogens are like those supercilious outdoorsy types who are always saying there’s no such thing as bad weather, just inappropriate clothing. They come prepared—for anything: extreme temperatures, nutrient scarcity, environmental stresses, and immune responses. They get their bearings quickly, and before you can set down your gear, they have mounted a base camp and are scouring the woods to rustle up dinner. Their habitats—or lifestyles, as microbiologists coyly call them—can include soil, dust, water, silage (composty stuff), vegetation, insects, animals, food-production facilities, food, and you. In certain environments, they produce toxins that can be unpleasant or even deadly to humans. But again, there’s no need to take it personally: toxins may just be a defense mechanism to disable competitors in the host or other environments. And if things get really dire, they’ve got a Plan B: little packets of genetic material like seeds that can easily outwait just about anything, from Noah’s flood to the Roman Empire—molecular biologists have found viable forty-million-year-old bacterial spores.

The noninvasive pathogens are no gourmets. Like children, seniors, and the vast majority of Americans, they like their comfort food—starches, gravies, filled pastries, meat loaf—which is why they thrive in food-service environments more given to substance than style: buffets, cafeterias, and other institutional settings. Because they are everywhere, it’s just a matter of waiting for the right circumstances—say, a nutritious batch of chicken à la king after several hours on an inadequately heated steam table—before they overrun the place. Not that you’ll know. Per standard pathogen modus operandi, invasions are done on the down low. A couple of the more common noninvasive pathogens are Bacillus cereus, the culprit behind so-called “fried rice syndrome”—it’s mad for carbs—and Clostridium perfringens, a protein hound who’s a regular guest at picnics, schools, and prisons and is one of the most ubiquitous organisms in the environment, equally happy in soil, decaying plant matter, the intestinal tracts of humans and other vertebrates, and insects. Both B. cereus and C. perfringens are spore-formers, so food may arrive already impregnated with its own unhappy ending—high-heat cooking even benefits C. perfringens by promoting germination and killing off competing organisms. A final common noninvasive pathogen, although not a spore-former, is Staphylococcus aureus, a resident of the nasal cavities, skin, and sores of infected humans that is also found in soil, water, and air.
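
The steam-table scenario above is, at bottom, simple exponential growth. Here is a minimal sketch, assuming a roughly ten-minute doubling time (about what C. perfringens can manage near its optimum temperature); the starting count, the holding times, and the `population` helper are illustrative inventions, not measured values:

```python
# Exponential growth on an under-heated steam table.
# Assumes a ~10-minute doubling time, near the optimum for C. perfringens;
# the starting count and holding times are illustrative, not measured data.
def population(n0: int, minutes: float, doubling_min: float = 10.0) -> int:
    """Cells after `minutes` of unchecked doubling, starting from `n0`."""
    return int(n0 * 2 ** (minutes / doubling_min))

for hours in (1, 2, 3):
    print(hours, "h:", population(100, hours * 60))
# 100 cells grow to roughly 26 million after three hours.
```

Since an infectious dose of C. perfringens is usually put in the millions of cells, it is typically the warm hours of holding, not the initial moment of contamination, that makes the meal dangerous.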

By contrast, the invasive pathogens are incorrigible homebodies, luxuriating in indoor temperatures and regular meals. For the so-called microparasites, we—and our farmyard friends, the cows, the pigs, and the chickens—are the cat’s meow. Their idea of a good time is to loll, warm and cozy (98.6°F is ideal, although they’re also willing to be fruitful and multiply at room temperature), on the tropical beach of our small intestines, letting the amino acids and sugars wash gently over them. While they’re waiting for their dream host to appear, they’re willing to hang out in your burger, your spinach, or your raw milk. But since this isn’t their optimal environment, surviving there may mean sacrificing some of their offspring or entering a suspended state known as viable but nonculturable (VBNC), in which they won’t grow unless resuscitated. Other strategies for survival in inhospitable secondary habitats include the formation of biofilms, which may explain outbreaks that have been traced to produce.

That’s not to say that invasive pathogens are pacific. To invade the promised land, they come as geared-up as a USAF special op. For starters, they’re stealthy (as are most pathogens), giving no warning—no noxious odors, no slimy textures, no unpleasant tastes—as to their presence in food. In the stomach, they often become temporarily acid-tolerant. In the small intestine, they bind to and disrupt immunological command centers. And, once established, colonies can turn downright mean, ousting other occupants from preferred perches and monkeying around with the body’s natural defenses, such as the protective mucus layer, by releasing toxins. These have a second life as a clever travel strategy. Some of the excreted substances or the infection itself makes you feel very, very sick, triggering a release of bodily effluvia (though, luckily for them, usually not sick enough to kill the host). Greetings, family and friends!

Most cases of food poisoning come from a barnyard trio that spend their lives in a snug circle, moving from host to host. Campylobacter jejuni populates the intestinal tract of most livestock, especially chickens, but can survive in raw milk, untreated water, and undercooked meat. At ambient temperature, however, it dies quickly, and in food it’s easily eliminated by heating, drying, freezing, acidic conditions, and disinfectants. Escherichia coli is the world champion of diarrhea, especially among U.S. travelers and children in less-developed nations, of whom 380,000 die annually from dehydration related to the illness. It lives in warm-blooded animals and birds, as well as dung-dirtied food and water. The most common variety is quite mild, but the far more toxic subgroup E. coli O157:H7, which first showed up in ground beef and beef products, can cause serious illness and, on rare occasion, death. Salmonella is the most common cause of bacterial food-borne illness; like the other invasive pathogens, its primary habitat is the guts of domestic and wild animals, both birds and mammals—that includes us—from which it infects eggs, poultry, beef, pork, processed meats, and dairy products. Salmonella is handily dispatched with heat, so it is not a problem in traditionally processed foods.

Three pathogens should always be treated with the utmost respect, so despite the fact that they belong to both the invasive and noninvasive groups, they get their own special category: cold-blooded killers. Their commonality is a deadly one: a refusal to limit themselves to the gastrointestinal system. (E. coli O157:H7 also belongs to this Most Wanted list; it secretes a toxin that after it has breached the intestinal wall enters the bloodstream and damages tiny blood vessels; most victims survive, but a few die of kidney failure.) Listeria monocytogenes is an oddball among the food-borne pathogens in that it is very cold-hardy, which means that stowing infected raw and soft cheeses, ice cream, sushi, deli meat, and hot dogs in the refrigerator does nothing to halt the proliferation of the bacteria. (In nature it is found in soil, water, vegetation, sewage, and silage.) Like the microparasites, it creates infections. However, once in the gut, the bacteria can cross the intestinal wall to enter the bloodstream, from which they slip into liver or spleen macrophages, white blood cells that engulf foreign matter, a sort of immunological wolf in sheep’s clothing. In healthy hosts, these are sussed out and zapped, but in immunocompromised ones, they can then penetrate other organs, including the central nervous system and the placenta of pregnant women, where they may cause brain problems or miscarriage. Another assassin comes from the deep blue sea. Cousins Vibrio parahaemolyticus, the more innocuous one, and Vibrio vulnificus are both denizens of raw and undercooked fish and shellfish. V. vulnificus, if it enters the blood, can cause septic shock and death. The microbe is fragile, requiring a saline environment, and it is easily destroyed by acid, freezing, cooking, and common disinfectants—this is little comfort for raw bar aficionados, however, unless you’re willing to turn your clams on the half shell into lemon-marinated ceviche.

There is one bacterium so fearsome it makes food engineers quake in their rubber-soled, fluid-impermeable shoes: Clostridium botulinum. This noninvasive pathogen is the most difficult to eradicate of all the “bad bugs.” It forms heat-resistant spores that can persist after processing, thrives in low-oxygen environments such as sealed cans, and produces a deadly neurotoxin with a high fatality rate. Clostridia are all around us in soil and on plants. In food, they usually ride in on an innocuous vegetable proudly “put up” by a home canner: mild asparagus, green beans, corn, or peppers. These low-acid foods offer ideal conditions for germination. (C. botulinum can’t germinate at a pH below 4.6. Accordingly, the Food and Drug Administration [FDA] and the USDA have different regulations for canned low-acid, acidified, and acid foods.) To preserve low-acid foods safely, industry uses a “botulinum cook,” 250°F for at least three minutes, a heat unachievable at home without a pressure cooker. If the endospore germinates and grows, the bacteria produce a toxin that, after it’s eaten, passes through the intestinal walls and into the bloodstream, where it travels to nerve endings and blocks the signals that trigger muscle contraction, leading to paralysis and, frequently, death.
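
The trade-off between the 250°F “botulinum cook” and longer cooks at lower temperatures can be sketched with the canning industry’s standard lethality (F0) model: each minute at temperature T counts as 10^((T − 250)/18) minutes at 250°F, using the conventional z-value of 18°F for C. botulinum spores. The `f0` helper below is an illustrative simplification that ignores come-up time and heat penetration into the can:

```python
# F0 lethality sketch for C. botulinum spores (classical canning model).
# Reference temperature 250 °F, z = 18 °F; ignores come-up time and the
# slow heat penetration to the center of a real can.
def f0(temp_f: float, minutes: float, z: float = 18.0) -> float:
    """Equivalent minutes at 250 °F delivered by `minutes` at `temp_f`."""
    return minutes * 10 ** ((temp_f - 250.0) / z)

print(f0(250, 3))              # the standard botulinum cook: F0 = 3.0
print(round(f0(225, 120), 1))  # two hours at 225 °F: F0 ~ 4.9
print(round(f0(215, 180), 1))  # three hours at 215 °F: F0 ~ 2.0
```

By this yardstick, a cook of several hours at 215°F–225°F can deliver lethality in the same range as the modern three-minute cook at 250°F, which is why long retort times were the nineteenth-century substitute for high-pressure heat.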

NOW THAT YOU HAVE A WHIRLWIND TOUR of modern food microbiology under your belt, let’s turn our attention back to the scandal of the fetid fillets. Let’s start with the refrigerated meat, presuming that there were some temperature violations during transport, at its final destination, or both. The beef was shipped in carcasses, which means that almost all microbial colonies would have been on the surface of the meat; this lets food-preservation enemy number one, botulism, off the hook, as it occurs only in anaerobic environments. The sides of beef could easily have been contaminated during slaughter with microparasites, which inhabit the intestines of warm-blooded animals and have a nasty habit of spilling out during dismemberment. But they are heat labile and not spore-formers, so the army’s presumably thorough cooking before serving would have destroyed them. The same goes for any fungi or spoilage bacteria that might have hitched a ride from the carcasses’ point of embarkation or during the journey, although the latter could very likely have left an unpleasant calling card: the biogenic amines that are the product of their metabolism. That the surgeon who started the whole mess claimed to smell boric acid and other adulterants suggests this, as amines are not inactivated by cooking and can cause nausea and diarrhea. If the refrigerated carcasses were causing serious illnesses, by process of elimination, these were likely imparted in the camp through handling of the meat before serving. In fact, even in the twenty-first-century United States, the vast majority of food-borne illnesses are bestowed by the human touch; a full 58 percent of cases are from noroviruses. (Viruses are not a problem in stored or preserved food, as very few live long outside a host.)

Now to the canned protein: was it the culprit? According to the Dodge Commission report, the beef had first been cooked, then canned in two- and six-pound tins and sterilized for two to three hours at 215°F–225°F. Given that the seals hadn’t ruptured—and there were no reports that they had—this is more than sufficient to kill all live bacteria and spores. (Most modern processors eliminate botulinum spores with higher heat for a much shorter time, but a cook of two to three hours at these temperatures delivers comparable lethality.) This serious overcooking would have also destroyed any fungi, spoilage bacteria, or other pathogens in the cans, as it would the taste of the meat, producing unpleasant nonenzymatic browning. Add to that some lipid oxidation brought on by storage in high heat, as well as assorted other unpalatable but nondangerous chemical deterioration, and—voilà: canned food you wouldn’t give your dog.

The “embalmed beef” scandal may have shaped American food safety law and practice, but the truth is that if any Spanish-American War soldiers died of gastrointestinal illnesses in Cuba and Puerto Rico, they most likely contracted them there—not from the rations, although these were in some cases spoiled and in others terribly decayed. Did General Miles really deserve to go to the grave shrouded in ignominy from the dustup? He spoke the simple truth of the senses, a truth that the science of the time could not explain, and unwittingly became the fall guy for the missteps of the military and the food industry in their mad rush to modernize American meat eating.