Combat-Ready Kitchen: How the U.S. Military Shapes the Way You Eat (2015)

Chapter 7

WHAT AMERICA RUNS ON

LUNCH BOX ITEM #1: ENERGY BARS

It’s Saturday morning, and cleats clatter up and down the stairs.

I’m still in bed sipping the cup of motor-oil-strength coffee my husband has placed on the bedside table.

“Did you comb your hair?” I shout through the closed door. “Brush your teeth? Put on sunscreen? Do you have your water bottle? What’d you eat for breakfast?”

“A granola bar,” yells Dalila from the bathroom where, octopus-like, she’s complying with my grooming requests. A car pulls up outside. More clattering.

“Have a great game!” She’s off.

Is there any quicker fix than an energy bar? These bars are the leitmotif of my daughters’ childhoods. I’m always removing half-eaten bars from backpacks, coat pockets, lunch boxes, and purses, amazed that no matter how long they’ve been there, they’re never stale or moldy. I find the shiny, foil-lined polypropylene wrappers—which I try to avoid reading; how many sugar variations can you fit in one product?—under beds, in corners, beside the sofas, and, occasionally, in the trash. The treats go with us everywhere: soccer practice, gymnastics class, dance class, nature hikes, the beach, and long car rides. In memory, their faintly sweet smell is almost indistinguishable from that of my children’s breaths—pre-dental decay, pre-bad habits—as they dozed off in their car seats or strollers, the snack clutched in their chubby fists. The bars seemed as innocent as they were.

TOSSED INTO GLOVE COMPARTMENTS and office desk drawers. Tucked into backpacks and gym totes. Stowed in handbags and briefcases. Buried in kitchen cabinets. And, of course, sold by the boxful at every mini-mart, gas station, and supermarket from Bangor to Juneau, San Diego to Orlando. It’s probably no exaggeration to say that the average American is never farther than twenty feet away from an energy bar (or, as they are variously called, granola bar, cereal bar, breakfast bar, nutrition bar, health bar, protein bar, sports bar, or snack bar). The small, flat rectangles of grains and vegetable or dairy protein, bound together with copious sugary syrup, are a fixture of modern-day eating: quick, portable, and (supposedly) nutritious, generating $5.7 billion in sales in 2011. We consume them as meals, in-between meals, and sometimes as dessert. But for all its ubiquity, the energy bar is a newcomer to our diet, making its first appearance in the 1970s and still a novelty by the mid-1990s, the freakish fare of will-deficient dieters and dangerously ardent athletes. From where did it come and why did it take us by storm?

The energy bar story begins a century ago, when the U.S. Army sought to turn chocolate, the world’s favorite sweet, into an emergency ration for tired soldiers on the move. Being the army, however, it wasn’t content just to serve what was already the perfect pick-me-up. It had to meddle and, in signature fashion, took on the two qualities that make chocolate chocolate: its delicious taste and its almost-body-temperature melting point. The fatty fruit of the Theobroma cacao tree, when ground, melts at between 93°F and 98°F, which means it tarries on the tongue, spreading and aerating over six hundred flavor compounds in a heady ooze. For three centuries after chocolate’s discovery in the New World, only the rich could afford the fermented, roasted, and milled beans, which they used to make a hot beverage by mixing with water or milk and another irresistible—but expensive—tropical transplant, sugar. By the mid-1800s, sugar production had spread far and wide, and its price had dropped, leaving in its wake terrible teeth, spreading middles and backsides, and the forced resettlement of millions of Africans to charming beachfront properties throughout the Caribbean and the Americas. But no matter how cheap its principal ingredient, there’s only so much hot cocoa a person can drink.

It took the Industrial Revolution, when machines were introduced for pressing powder from the nibs, to unlock the secret to producing solid chocolate. The process left behind prodigious quantities of an oily goop called cocoa butter, and manufacturers, always keen to find a market for industrial waste, hit upon the idea of mixing it back into the sugar-sweetened powder. This made a viscous liquid that could be formed into different shapes as it cooled. Candy! (The conche, another invention of the industrial age, crushed and agitated the blend, giving modern-day chocolate its delectable smoothness.) At first, hardened chocolate was used most often as a coating to create bonbons, which were downed, daintily but unceasingly, by the fairer sex festering in her domestic prison. But the treats, which were bought at the confectioner’s, were still expensive and, what with their association with ladies who lunch, considered sissy food by half the adult population.

Until Milton S. Hershey came along.

By the age of thirty, Hershey had founded a successful caramel company in Lancaster, Pennsylvania (the main ingredients were imported sugar and milk from the state’s one million hyperlactating cows, “hyperlactating” being the Department of Agriculture’s descriptor), but he was hungry for more. At the 1893 Chicago World’s Fair, he saw some German chocolate-making machinery and decided to give the newfangled process a whirl. Six years later, he sold off the caramel branch of his business and committed himself full-time to producing cocoa, bonbons, syrup, and—finally—bars made from a secret Swiss procedure he’d painstakingly replicated. By substituting a cheap local ingredient—milk—for part of the expensive, imported one—cacao—he was able to make a chocolate candy that was affordable for all (and, with its masculine rectangle, far less threatening to the male ego than the bosomy bonbons of yore). Of course, this technique essentially turned the namesake ingredient into a flavoring agent, because it now made up, and still does, only 11 percent of the whole, but who’s complaining when you’re talking about five-cent candy bars? (The nickel bar didn’t vary in price for almost seven decades; it just got smaller and smaller, until it vanished in a puff of smoke on November 4, 1969, the point at which it became impossible to turn a profit from it.) By the end of the first decade of the twentieth century, the Hershey bar was sold in every corner store, newsstand, and lunch counter across the country, and its inventor was so rich he built two company towns, one in Pennsylvania and the other in Cuba.

The U.S. military had taken notice not only of the world’s growing infatuation with the Mesoamerican food of the gods but also of the fact that German researchers heralded the sugar-laden bars as a better alternative to alcohol—prohibited in the American military since 1832—in stimulating men for long hikes and grueling manual labor. By the late 1890s the army was experimenting with a new emergency ration that included the mood-elevating sweet as well as a knockoff of the indigenous North American road fuel, pemmican, a mouthwatering mixture of pulverized meat and maize in animal fat. It contracted with two companies, the ephemeral American Compressed Food Company and Armour & Company (the latter brand still does business with Uncle Sam as Pinnacle Foods and Smithfield Foods and has had its greedy fingers in every war pie made since its founding in 1867). The army performed what passed for taste testing at the time: sending a bunch of guys to the woods with the new rations to see what stuck. Both prototypes were found lacking. According to the report, “Everybody suffered excessively from hunger, and in spite of the surreptitious begging of food from camping parties, the greater part of the men were reduced to a pitiable state of weakness.”1

Not to be deterred, the army vowed it would devise its own emergency ration, which it did, debuting in 1910 an all-in-one chocolate number that included egg albumin and nucleo-casein for protein and stability. Reactions were still less than enthusiastic—the item apparently provoked nausea and dizziness, and the secretary of war halted the project in 1913. Although the Department of Agriculture’s Bureau of Nutrition Investigations then came up with a slightly more palatable version, when preparations for World War I began, the army still hadn’t found a lightweight, energy-packed battle ration that could be carried in a pocket and eaten on the move.

The war to end all wars broke out, as so many conflicts have, in the Balkans, from where it spread quickly to infect all of Europe with a nasty case of trench warfare. Deep in their warren of dugouts, tunnels, and channels, when soldiers weren’t busy snoozing in the mud, waving off rats, and tossing grenades, they worked their way through piles of rations. First off, the Trench, with canned meat or fish and hard bread, as well as cigarettes and solidified alcohol for up to twenty-five troops. After that, the Reserve, which provided a day’s worth of canned meat, hard bread, sugar, and coffee for one man. When things got really desperate, they dreamed of chocolate … and had another Reserve. Having resoundingly rejected Armour’s can full of pemmican-lite and chocolate candy (three packets of each) almost two decades earlier, the Quartermaster Corps hadn’t placed its wartime order until June 6, 1918. It arrived too late. Five months later, on November 11, 1918, the day the Allies and Germany signed the armistice, the first one million emergency rations, with their payloads of half sugar, half chocolate made by the venerable New York chocolatier Maillard Chocolate Manufacturers, were still floating across the Atlantic Ocean. (The surplus was later sold at below cost to Boy Scouts, hunters, explorers, and others.)

Of course, gastronomic suffering wasn’t limited to the Western Front. At home, Americans were asked to make sacrifices, too, going meatless on Monday and wheatless on Wednesday, and limited to just eight ounces of sugar per week. Industry’s use of the sweetener was also rationed; only commercial fruit preservers, vegetable packers, milk condensers, jam manufacturers, ice-cream makers, and military contractors could buy as much as they wished. Confectioners, on the other hand—yet to convince the public that snacks were a valid fifth food group—were cut to just 50 percent of their previous year’s purchase and scolded for their hoarding ways. In a 1919 memo, George Zabriskie, president of the U.S. Sugar Equalization Board, which controlled distribution and set prices for the commodity, noted acidly: “Our observation has been that candy manufacturers have not only had their normal supply of sugar, but in many cases have anticipated their wants and been able to acquire sugar ahead of more essential industries.”

Much to his dismay, that didn’t include Milton Hershey, who by then owned an entire Monopoly board of assets, including numerous properties in the United States and Cuba, multiple residences for himself and executives, worker housing, mills, factories, schools, stores, parks, and even railroads. Hershey—horrors!—was unable to obtain sufficient sugar to keep vendors stocked with candy. This was not at all satisfactory, especially because competitors on the military payroll faced no such restrictions. (It was at this point that strapped chocolate makers began mixing in breakfast cereals, dried fruit and cookies, giving rise to such delicacies as Mars Mounds and the Nestlé Crunch bar.) Hershey vowed he would not again get stuck on the outside looking in during wartime.

His chance came when the army decided to have another go at the emergency ration—but instead of providing three separate components, it would combine protein, carbohydrate, and sweet in one. Mind you, the result couldn’t be too tasty, or the men wouldn’t bother waiting for a crisis to eat it. In 1937, after two years of internal research, Colonel Paul Logan of the U.S. Army Quartermaster Corps, the head of the Subsistence School, began looking for a company foolhardy enough to take on stripping the pleasure from one of humankind’s most seductive foods by reengineering it so that the candy wouldn’t melt in pockets or at tropical temperatures or be overly tempting to sweet-toothed soldiers. He didn’t need to look far; Hershey, now the nation’s largest chocolate factory, jumped at the opportunity—being part of the rations supply chain would mean its sugar-gluttonous manufacturing line would never be subject to wartime restrictions again.

Logan provided his patented formula—about one-third bitter chocolate, one-third sugar, one-sixth skim milk powder, one-fifteenth oat flour, and a few vanillin crystals—for making a barely edible chocolate bar, and after a few days of trial runs in the Hershey lab, it was approved for production. The dough was so thick it had to be pressed into the molds by hand. The finished bars were sealed in foil and then paper-wrapped in sets of three, for a total of 1,800 calories, enough to sustain a man for one day. (Later, when foil became scarce during World War II and the use of chemical weapons seemed imminent—mustard and chlorine gas had been used frequently in World War I—waterproof cellophane and wax-coated boxes were used.) The so-called Logan bar, or D ration, was, by all accounts, awful. Nonetheless, between 1941 and 1944, almost a quarter billion bars were shipped and stockpiled overseas,2 and Hershey was basking in its new role as premier chocolatier to the U.S. Army.

This Frankenchoc—part protein, part grain, and a helluva lot of C12H22O11—was the great-granddaddy of the modern energy bar, and has gone to war in soldiers’ pockets in every American military engagement up until the Gulf War. After World War II, there was a gradual split in the evolutionary tree, with the energy bar becoming a cereal-based meal-in-a-fist while nonmelting chocolate returned to its candy roots. But although billions have been produced, the waxy, cacao-based confection never gained any admirers. Hershey tweaked the D ration formula a couple of times. It began making the ever-so-slightly tastier Tropical Bar in 1943 for use in the hot and humid Pacific theater. This recipe was dusted off during the Korean War (1950-53) and the early years of Vietnam involvement (starting in 1955). Oat flour was eliminated in 1957, and the skim-milk powder replaced with nonfat milk solids (the difference between them is in the proportions of protein, lactose, and minerals). For the next several decades, Hershey complacently churned out this slightly, but obviously not much, improved candy. According to a Vietnam-era report, “Mother Bollman hands out bars of Hershey’s Tropical Chocolate for breakfast on the morning of the last day of the patrol… . For us, [they] are a last resort. They’ll provide some energy and a little bit of a needed sugar boost, but they’ll also require a swallow or two of our remaining water to wash away the taste.”3

But trouble was on the horizon. After decades of doldrums, progress was finally being made on producing a heat-tolerant chocolate. First Food-Tek, Inc., in New Jersey used polyhydric alcohols, emulsifiers that helped distribute the fat, and then the Battelle Memorial Institute, a longtime military contractor, in its Geneva laboratories, added water and a top-secret surfactant to increase the substance’s melting point without—according to them—changing its quality. In the late 1980s, researchers at the Natick Center came up with a new formulation based on this concept—tentatively dubbed the Congo Bar and eventually fielded as the Desert Bar—that could withstand up to 140°F, and asked Hershey to produce 144,000 bars for the 1990-91 Gulf War, and a second shipment of 750,000 units several months later. Press releases taunted rival Mars (makers of M&M’s) that the Desert Bar was “a candy bar that melts in your mouth, not in the sand.”4 Mars, which had its eye on expanding its market in the Middle East, responded by going after Hershey’s military business. When DOD put the next contract for the Desert Bar out to bid—some 6.9 million units—Mars won it. And like a decades-old marriage held together by inertia, the historic U.S. Army-Hershey relationship crumbled in an instant. (A resulting legal battle between the two chocolate giants was also won by Mars.) However, neither version of the Desert Bar was well liked, and neither made it to the commercial market.

The quest for heat-tolerant chocolate continues. In the early 2000s, Cadbury—now owned by Mondelez, Kraft’s new Latinized moniker—found that reconching further broke down the sugar molecules, increasing the overall melting point of the sucrose-in-lipid suspension. And several manufacturers, including Barry Callebaut, experimented with fats—either drastically reducing the cocoa butter or adding in fats that are solid at room temperature. It may not be today, or even tomorrow, but when the breakthrough finally comes it will deliver a prize far larger than thousands of fatigue pockets and camouflage rucksacks: the 3.8 billion chocolate-deprived souls who live in those areas of the world still shockingly deficient in cold chains (continuous refrigeration systems)—geographic footnotes such as Africa, Latin America, South and Central Asia, and the Middle East.5

WHILE MELTLESS CHOCOLATE AMBLED OFF in a different direction, the idea of a fortified food bar, especially a sweetened one, lingered on, playing a starring role in two of the army’s most important research programs of the 1950s, ’60s, and ’70s: freeze-drying and intermediate-moisture foods. When commercial versions finally began to emerge, they had tucked inside not only enough calories and nutrients to provide a jolt of cheap energy, but also several major breakthroughs in twentieth-century food science, courtesy of the Department of Defense.

Industrial freeze-drying wasn’t born in the field kitchen but in the medic’s tent. Historically, 90 percent of all war fatalities have taken place on the battlefield, most often from loss of blood. This all changed after World War I with the convergence of two discoveries. The first was the understanding that battlefield shock was not, as had been previously thought, a nervous system shutdown, but a circulatory slowdown caused by loss of blood volume. (Think of the difference in pressure between a trickle and a torrent of water from a garden hose.) This meant that doctors and nurses could skip whole-blood transfusions and just inject plasma, the clearish carrier fluid, to stave off the downward spiral to death. The second was a new contraption rigging together airtight chambers, vacuum pumps, and condensers that allowed large-scale freeze-drying in commercial labs. Now, not only could fallen men be treated for shock on the spot, but they could receive plasma that had been made into a powder, shipped thousands of miles, stored for months on end, and rehydrated as needed. A new age of battlefield medicine—indeed, of emergency medicine—was born. (It was brief. It turned out that freeze-drying also preserved plasma’s scruffy viral hitchhikers, hepatitis B and, later, HIV; the use of pooled whole plasma was halted in 1968.)

Until the invention of freeze-drying, the only way to remove fluid from organic matter had been to allow its water to change from liquid to gas, either through a leisurely air-drying or a frenzied cooking. Both of these methods permanently alter tissues from prolonged exposure to oxygen or by denaturing proteins. But with lyophilization, for the first time, water could be dissipated while leaving cell walls and other structures uninjured. It works through a simple two-step process. The material is brought to below the freezing point, which slows down its molecules as if they had a major case of Parkinson’s. Lacking the energy to escape, water molecules slip into fixed positions with their neighbors and ice crystals form in the material. Next, the pressure is drastically reduced. The vacuum forces the water molecules to suddenly relinquish their hold on their fellow H2Os and others—sort of like abruptly encountering no resistance in an arm-wrestling match—and burst out of the surface of the ice. Whoosh! Vaporized. In no time at all, the vacuum arrives at the core of the frozen matter, stripping it of almost all its water. What’s left behind is everything else, more or less intact. (Because water expands when it freezes, some slight damage is unavoidable.)
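
For readers who want the physics in miniature: sublimation of this kind proceeds only while the pressure around the frozen material stays below the vapor pressure the ice itself can exert at its temperature. The short Python sketch below checks that condition using a standard Magnus-type approximation for vapor pressure over ice; the function names and example numbers are illustrative assumptions, not figures from the freeze-drying programs described in this chapter.

```python
import math

def ice_vapor_pressure_pa(temp_c: float) -> float:
    """Approximate saturation vapor pressure over ice, in pascals, for temp_c <= 0 °C
    (a Magnus-type fit, accurate to within a few percent at ordinary freeze-drying temperatures)."""
    return 611.15 * math.exp(22.452 * temp_c / (272.55 + temp_c))

def ice_will_sublimate(shelf_temp_c: float, chamber_pressure_pa: float) -> bool:
    """Water escapes the frozen material as vapor only while the chamber is pumped
    below the ice's own vapor pressure at the shelf temperature."""
    return chamber_pressure_pa < ice_vapor_pressure_pa(shelf_temp_c)

# A product frozen to -30 °C can give off roughly 38 Pa of water vapor...
print(round(ice_vapor_pressure_pa(-30.0)))   # ~38
# ...so a chamber pumped down to 20 Pa keeps the water leaving (drying proceeds),
print(ice_will_sublimate(-30.0, 20.0))       # True
# while a poorly evacuated chamber at 200 Pa does not.
print(ice_will_sublimate(-30.0, 200.0))      # False
```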

Freeze-drying’s status as an unsung World War II hero—it indubitably rescued more fallen soldiers than all 464 of the Medal of Honor recipients combined—didn’t stop the army from eyeballing the new preservation tool for more everyday ends: food. Foreseeing a tide that has yet to ebb, military analysts predicted a need for ever-greater troop mobility. And what would be more conducive to that than rations from which most of the weight, which is water, had been removed? Even at the beginning, however, there were signs that this decision was foolhardy. “The experience of World War II and Korea showed that dehydrated products were of questionable acceptability, and it was well known that the commercial sector had all but rejected dehydration as a viable process, [but] the Army still encouraged its proliferation,” wrote Stephen Moody, director of the Natick Center’s Combat Feeding Directorate, in his master’s thesis.6 Throughout the 1950s, the Quartermaster Food and Container Institute sponsored research on freeze-drying, mostly of meat, with contractors such as the University of California, Armour, the American Meat Institute Foundation, Rutgers, Oregon State, and the Georgia Institute of Technology, and also did its own in-house studies.

Liquids, such as coffee extracts, were relatively simple to preserve with the technique because they had no structural elements that needed to be maintained. Volatile compounds take longer to turn to vapor than water and some are trapped in the viscous coffee during the freeze-drying process. When the resulting powders, such as the instant Folgers and Sanka that so captivated our grandparents or the café in café con leche, are added to hot water, the aroma components are released again, giving the brew a reasonably good taste. But when things got more complicated than coffee or tea, the results were just this side of repulsive. Along with the water, the process ended up removing more of the volatiles, which escaped through cracks in the now brittle material, leaving food that tasted flat. Far worse was the impact on mouthfeel. Meat fibers were toughened; vegetable cellulose collapsed, ruptured by the ice crystals; and rehydration was fleeting—after the first bite, the fluid drained, leaving behind something resembling a damp loofah.

But the army wasn’t discouraged; on the contrary, it was gung ho. In fact, by the late 1950s, armed forces subsistence experts had developed a delightfully futurist vision of on-the-go chow based on freeze-drying: there would be stacks of cute little bars, and soldiers would munch their way through three square meals plus snacks (or rehydrate them for a luxury dining experience)—breakfast cereal, bacon and eggs (scratch the eggs—stability problems), split pea soup, hash browns, fruit, carrots and peas, chicken and rice, fudge brownie. And to accommodate the human compulsion to customize, men could season to taste by tearing off a sheet from a booklet of laminated condiments: ketchup, barbecue sauce, sautéed onions, jams, peanut butter, soy sauce, maple syrup, and relish. But the military still didn’t have the industrial base necessary to execute this sweeping plan. So the Quartermaster Food and Container Institute convened a conference in September 1960 on its efforts to create light, long-shelf-life rations. More than four hundred participants from government, industry, and academia attended, and less than a year later a second, more technically oriented international gathering was held to “lead to a more attractive economic outlook for freeze-dried foods.”7 Even so, this pie-in-the-sky project might have continued to languish, but for the fortuitous appearance of a real need for sci-fi-style feeding: the race to the moon.

The U.S. space program was born of the Cold War—a scientific biceps-popping intended to intimidate the Soviets by putting a man in space. But by the time President Eisenhower founded the National Aeronautics and Space Administration (NASA) in 1958, the USSR had already beaten the United States to the cosmological punch. The year before, they’d plopped Sputnik, the first artificial satellite, into orbit, rattling DOD cages with (entirely prescient) fears of sky spying and cross-continental missile strikes. In 1961 our archenemy propelled a real live human being, twentysomething Yuri Gagarin, for a whirlwind, one-hour-and-forty-eight-minute spin around the world. Although the United States sent up its first astronaut for a fifteen-minute motor-revving display a month later, the score was an alarming two-zip.

NASA’s lengthy to-do list included figuring out what’s for dinner when you’re 238,900 miles from home. Not only did food have to be compact, lightweight—in the early days, every pound off the ground cost $100,000—and ready to eat, it had to be completely hygienic. (Food poisoning, with its attendant ejaculation of chunky fluids from digestive tract termini, was highly undesirable.) Thus, it makes sense that the organization chosen to develop nourishment for astronauts was the army, first through the Quartermaster Food and Container Institute in Chicago and later through its successor, the Natick Center. The challenges of preparing space food and rations were remarkably similar, although space travel had its unique menaces, such as clouds of astronaut-choking, console-clogging, zero-gravity crumbs. First out of the gate, the harebrained-sounding food-bar scheme, which NASA bought lock, stock, and barrel, although with a twist: bite-size cubes instead of bars to eliminate the potential hazard of deadly crumbs. The cubes debuted on a 1962 Project Mercury flight to less than enthusiastic reviews—“harsh” and “dry” were the exact words. The freeze-dried tidbits actually subtracted moisture during mastication, so they left space travelers with a major case of dry mouth.

Negative taste tests notwithstanding, Natick Labs was soon awash in astrodollars. Even in that most affectless of documents, the meeting minutes, it was hard to disguise their elation: “This overall area is being heavily funded. The sophisticated requirements of these food bars have never been considered before and necessitate considerable contractual and in-house work to advance the technology thereof.” By 1963 not only were the army’s own labs working around the clock (during most of the decade, it had ten full-time staff people dedicated to the project), but it was overseeing at least sixteen industry and academic contracts for related research, including ones with Pillsbury, Swift, Archer Daniels Midland (ADM), the University of Minnesota, and MIT. Most were for applied research, developing such things as binders to impart structure and strength, edible coatings to foil flaking and exclude moisture and oxygen, and moisture-mimicking additives to make the bars somewhat edible. This work resulted in many industry patents from the mid-1960s onward. The MIT contract, on the other hand, was for an important basic research question: why didn’t freeze-dried foods taste good? The man Natick chose to unravel this knotty problem was an up-and-coming young professor named Marcus Karel.

MARCUS KAREL IS CUT from a fine twentieth-century cloth that has all but gone the way of the gramophone. Humble, hardworking, and deeply humanitarian, as a teenage member of a semiclandestine Zionist group he helped Jews escape his home country of Poland after World War II. He then emigrated himself, eventually getting a job as an assistant in the MIT packaging lab and entering its food technology doctoral program. By the late 1950s, he had graduated to studying the permeability of different kinds of plastic films to water and flavor and also discovered his life’s work: understanding how water and oxygen affect chemical reactions in food. Karel successfully defended his thesis in 1960 and was offered a faculty position in the Department of Nutrition, Food Science, and Technology.

Vital to Karel’s work—and food science in general—was the concept of water activity, a new way to understand how water molecules behave in a substance. Water is the major component of almost all food. We living things are full of it—animals, about 70 percent; plants, 80-90 percent—and so, naturally, are edibles. The high proportion of water facilitates a dizzying number of chemical reactions and biological processes. After death, this tissue continues to be highly chemically reactive, although functioning, absent cellular respiration, trickles to a halt (except for enzymatic reactions, which depend only on the presence of their substrates to work). The lack of defenses, both barrier and immunological, turns the nutrient-rich organic matter into a virtual bacteria and fungi magnet. But although moisture is definitely associated with spoilage, the amount of water in a food doesn’t always predict whether it will go bad.

The conundrum was solved by William James Scott, an Australian bacteriologist at the Council for Scientific and Industrial Research, who, before World War II, had studied spoilage in chilled ox muscle for that country’s beef exporters, and during the war worked to ensure the safety of the food Australia supplied to the Allies. Starting in 1953, he did a series of experiments, adding different amounts of solutes to a nutritious substrate and then, after a set time period, recording the number of organisms for two of mankind’s bacterial baddies, Staphylococcus aureus and Salmonella. The solutes lowered the substrate’s vapor pressure ratio, a standard chemistry measurement. (The vapor pressure of pure water shows how much force water molecules exert on the surrounding air at a given temperature. The vapor pressure of a food at that same temperature is lower, because some of the water molecules are bound to the food material. The vapor pressure ratio, called the water activity, or aw, by food scientists, is the vapor pressure of the food material at a given temperature divided by the vapor pressure of pure water at that same temperature. The more strongly the water molecules are bound to the food material, the lower the water activity.) What Scott found was that there was a vapor pressure ratio below which the population growth of bacteria was virtually zero—0.85 for S. aureus and 0.90 for Salmonella and most others. (Yeasts and molds can survive down to 0.60.) In early 1957 Scott proposed a new theory, one in which microbial spoilage was related not to absolute water content but to the amount of water available—that is to say, not chemically bound to the food material—for microorganisms to perform vital life functions (ingestion, respiration, reproduction, excretion).
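
Scott’s ratio is simple enough to express in a few lines. Here is a minimal sketch in Python, using only the definition and the cutoffs given above (0.90 for Salmonella and most bacteria, 0.85 for S. aureus, 0.60 for yeasts and molds); the function names and the sample vapor pressures are illustrative, not taken from Scott’s experiments.

```python
def water_activity(food_vapor_pressure: float, pure_water_vapor_pressure: float) -> float:
    """a_w = vapor pressure of the food divided by the vapor pressure of pure water,
    both measured at the same temperature."""
    return food_vapor_pressure / pure_water_vapor_pressure

def likely_growers(aw: float) -> list[str]:
    """Which of the chapter's spoilage organisms can still multiply at this water activity."""
    growers = []
    if aw >= 0.90:
        growers.append("Salmonella and most other bacteria")
    if aw >= 0.85:
        growers.append("Staphylococcus aureus")
    if aw >= 0.60:
        growers.append("yeasts and molds")
    return growers

# Example: a food exerting 2.1 kPa of vapor pressure at 25 °C, where pure water exerts about 3.17 kPa.
aw = water_activity(2.1, 3.17)
print(round(aw, 2), likely_growers(aw))   # 0.66 ['yeasts and molds']
```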

Karel became one of the earliest proponents of Scott’s water activity theory. His Ph.D. student Ted Labuza recalls his final year as an undergraduate in MIT’s Department of Nutrition, Food Science, and Technology: “I had taken a course … with him where he introduced the concept of water activity and the application of kinetics to food storage stability. At that time there were no textbooks with this concept… . He had just then gotten a grant from the U.S. Army Natick Labs and the air force to work on stability of military and space foods.” Karel’s first space program work enumerated the things that could go awry with deeply desiccated foodstuffs.8 As it turns out, there were a lot.

First, enzymes, which are present in all animals, most plants, many microorganisms, and fresh products made of the same, weren’t inactivated by freeze-drying. Unless they were denatured by heat or acidity, these specialized proteins kept on catalyzing chemical reactions, in some cases creating a dark pigment that resulted in unappetizing-looking edibles. On the other hand, discoloration due to the breakdown and recombination of sugars and amino acids, known as nonenzymatic, or Maillard, browning, was minimal in freeze-drying. The surprise was lipids, which, when left high and dry, reacted with oxygen, creating a rancid taste.

“The Karel lab was a primary group combining kinetics, water activity, and packaging engineering, and Marcus was the orchestra master,” explains Labuza, who worked on several of Karel’s projects on deteriorative reactions in dehydrated food, “and the players, his ‘science children’ went on in food engineering as faculty somewhere compounding the impact of what Karel taught them.” In 1965 teacher and pupil attended the first-ever international conference on water activity in food, a life-changing event for Labuza. “The Natick Center was there; their representative was a guy named Harold Salwin. He’d done a study with cookies that stored them at different relative humidities and what he found was that at lower relative humidities, the shelf life was reduced because of the oxidation of lipids. I was quite interested in that and it became the basis of my Ph.D. thesis.”

The two MIT academics also scored a major funding coup, snagging NASA contracts to continue Karel’s work on flavor degradation in freeze-dried foods, as well as one “to design foods for the space program under contract to the U.S. Air Force, in a classified research program called Skylab. The idea was to come up with a bar of some sort and study the shelf life of it,” explains Labuza. After that, there was no stopping them. In the course of a few years, in addition to other technical reports for Natick, the air force, and NASA, the pair published articles together on related topics in the Journal of Food Science (1966), Cryobiology (1967), Journal of Agricultural and Food Chemistry (1968), Journal of the American Oil Chemists’ Society (1969, 1971), Food Technology (1970), and Modern Packaging (1971). In 1969 Labuza; Steven Tannenbaum, a colleague; and Karel made the crucial breakthrough that enabled food technologists to put Scott’s 1957 theory to use: a mathematical model that mapped water activity, temperature, and different deteriorative reactions.9 Water sorption isotherms, which are based on observational data of water activity for each food item under the varying conditions, finally allowed companies to accurately predict shelf life for their products. Says Labuza, “The importance of Natick was that they had the money. The work that they funded really set the basic principles.”
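
The chapter doesn’t reproduce the 1969 model itself, but the idea of a sorption isotherm is easy to illustrate with a later equation that food scientists routinely fit to exactly this kind of observational data, the GAB (Guggenheim-Anderson-de Boer) model. Treat the Python sketch below as a stand-in for the concept, not the Labuza-Tannenbaum-Karel formulation, and the fitted constants as hypothetical.

```python
def gab_moisture(aw: float, m0: float, c: float, k: float) -> float:
    """GAB sorption isotherm: equilibrium moisture content (g water per g dry solids)
    at water activity aw. m0 is the monolayer moisture value; c and k are constants
    fitted to measured data for a particular food."""
    x = k * aw
    return (m0 * c * x) / ((1.0 - x) * (1.0 - x + c * x))

# Hypothetical constants for a cereal-based bar (illustrative only).
M0, C, K = 0.05, 10.0, 0.85

# The isotherm: how much moisture the product holds at equilibrium at each water activity,
# which is what lets a manufacturer work backward from a target a_w to a shelf-life estimate.
for aw in (0.2, 0.4, 0.6, 0.8):
    print(f"a_w = {aw:.1f}  moisture ≈ {gab_moisture(aw, M0, C, K):.3f} g/g dry solids")
```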

COMPARED WITH THE LIMITED OPTIONS of the first space travelers, the Project Apollo (1968-72) astronauts enjoyed the equivalent of a groaning cruise ship buffet—everything from bacon and chicken sandwiches to potato salad and pineapple fruitcake. There was only one hitch: almost everything was still freeze-dried, either in rehydratable packets or ready-to-eat nuggets. Despite the vastly expanded offering, the reaction was about the same as it had been back in 1962: “Inflight nausea, anorexia and undesirable physiological responses experienced by some crewmen were believed to be partly attributable to the foods,” said the NASA biomedical report. (The rehydratable entrées were also tried out in Vietnam with even worse results—reconstituting your food with jungle water is never a good idea.) At this point, it would have been sensible for the army to be a mite worried. Almost two decades had passed, and they still hadn’t been able to make freeze-dried rations that tasted good. But still the army persevered, albeit conceding that “the use of plasticizers seems essential to this development.” Industry contracts were duly awarded.

But just in case, the Natick Center also hedged its bets with something moist that didn’t require as much expensive equipment as did freeze-drying: dog food. In the early 1960s, General Foods had applied Scott’s water activity principle to a relatively low-risk market segment—canines—and launched a new shelf-stable patty of extruded animal and vegetable proteins. The army wasn’t going to slip actual Gaines-burgers into the mess kit, but if they borrowed a technique here and there and applied it to people chow, who’d know? In 1965 the Committee on Animal Products of the Advisory Board on Military Personnel Supplies had already directed that “more attention be given to investigations on preservation of foods at ‘intermediate’ moisture levels.” A first industry contract, with General Foods, had been completed by 1968, and Natick, the air force, and NASA were working hand in hand, “since such close cooperation assists greatly in the coordination of the research.” The next NASA contract looked at the deterioration of intermediate-moisture foods (IMFs) and named Ted Labuza, an up-and-coming young MIT professor, as principal investigator.

Until IMFs came along, preservation was a Goldilocks story without the “just right” option. There was canned, which was fairly tasty, but heavy. And there was dried, which was generally not so tasty, but light. IMFs usually have a lower water content than normal foods, so they weigh a little less; however, more important, they have significantly reduced water activity, generally between 0.6 and 0.9 (by contrast, the aw of dried food is 0.2 or less), so bacteria simply can’t reproduce in the numbers necessary for spoilage or illness. That means that although they are soft and humid, IMFs can be stored at room temperature for a long time with just regular packaging materials. This development made the military—and its friends in the food industry—very happy indeed.

The master-apprentice relationship has a predictable arc. Awestruck adulation. Amiable peers. And, finally, a battle in which the disciple proves himself worthy by besting his mentor. In 1971, when Labuza was denied tenure and departed for the University of Minnesota, he took some of MIT’s prestigious NASA contracts—about which he’d undoubtedly preened and posed during hiring talks—with him. It wasn’t a clean victory; Labuza had tried to get NASA to invest immediately in the new IMF technology, but he was told that the contracts to produce freeze-dried rations had already been signed for two years.10 But the center of gravity for NASA-funded investigation of cutting-edge food-preservation techniques had shifted from the country’s most glamorous technical institute to a staid but steady public research college—and that was probably the moment when the sun began to set for MIT’s Department of Nutrition and Food Science.

That summer, Labuza had a major triumph: the Apollo 15 astronaut David Scott snacked in space on his intermediate-moisture apricot food bars, which had been manufactured by Pillsbury, by threading them through a circular port in his helmet.

Ground control: Are you eating a fruit bar?

Scott: I might have been eating a fruit bar. I really liked the fruit bars. Anytime that was break time was a good time to have the fruit bar and a drink of water.11

The University of Minnesota’s press release emphasizes the snack’s longevity. “This bar-like food, high in calories per unit of weight, lasts about 6 months without refrigeration.” Reflects Labuza on his contribution to the breakthrough, “Marcus Karel and I were key in the understanding of water activity from a thermodynamic [perspective], not just bound vs. free water… . Our labs and Duckworth’s [another food scientist] set the stage… . Most soft chewy bars are based on our work.” Even the conservative Advisory Council to the Natick Combat Feeding Program could see the commercial potential. According to its 1972 minutes, “It is important to note that the characteristics of these food products also make them of great value to a wide variety of civilian food requirements.”

Labuza’s star was in full ascendance during the 1970s. Over the course of the almost ten-year contract between NASA and the University of Minnesota, he hammered the kinks out of IMFs, mostly by applying the same approaches taken by Karel a decade before with freeze-dried foods. Toward the end of the decade, Labuza tackled humectants, one of the most important classes of ingredients for food preservation; these compounds, such as the polyhydric alcohols glycerol and sorbitol, both lower the water activity by binding water and also impart a sense of moistness and softness, making food more palatable. “Only sweet flavors ended up being made for the space flights,” explains Labuza. “It’s a lot more difficult to lower the water activity of foods without sugar. For example, the water activity in a piece of meat can be lowered by adding salts and sugars, but these change the flavor, often undesirably.”
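
A back-of-the-envelope way to see Labuza’s point about why small humectant molecules (and dissociating salts) pull water activity down harder, gram for gram, than table sugar is Raoult’s law for an ideal solution, under which a_w is roughly the mole fraction of water. Real foods deviate considerably from ideal behavior, so the Python sketch below is an order-of-magnitude illustration only, and the 20-grams-per-100-grams quantities are arbitrary.

```python
# Ideal (Raoult's law) estimate of water activity for 20 g of solute dissolved in 100 g of water:
# a_w ≈ moles of water / (moles of water + moles of dissolved particles).
MOLAR_MASS_G_PER_MOL = {"sucrose": 342.3, "sorbitol": 182.2, "glycerol": 92.1, "salt (NaCl)": 58.4}
PARTICLES_PER_FORMULA = {"salt (NaCl)": 2}   # NaCl dissociates into Na+ and Cl-

water_moles = 100.0 / 18.0   # 100 g of water

for solute, molar_mass in MOLAR_MASS_G_PER_MOL.items():
    particle_moles = (20.0 / molar_mass) * PARTICLES_PER_FORMULA.get(solute, 1)
    aw = water_moles / (water_moles + particle_moles)
    print(f"{solute:12s}  a_w ≈ {aw:.3f}")
# Sucrose barely moves the needle (~0.99); glycerol and sorbitol do more per gram;
# salt does the most, which is also why it changes the flavor so drastically.
```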

The food industry has frequently come under fire for its lavish use of these solutes, and, specifically, its extensive research on how to more skillfully manipulate the flavor enhancers to addict us to junk food. But there’s another equally important, if not more important, reason to lace our breakfast cereal, bread, lunch meat, chips, soups, heat-and-serve meals, and cookies with salt and sugar. Our food is geriatric, and these two common chemicals do an ace job at mummifying and bestowing false youth—bright colors, firm shapes, soft textures—on edibles way past their prime.

The ability of sugar and salt to preserve food has been understood instinctually, if not scientifically, since people began to break out the Egyptian bottarga (dried and salted fish roe) and Roman honey-preserved dormice at parties. Over millennia, salt and sugar evolved from precious—even miraculous—substances fit to pay soldiers, grace the tables of kings, or be meted out by apothecaries, to mundane items bought by the box and sack in any grocery store. The two compounds work similarly to reduce deteriorative chemical reactions and prevent microbial growth, although they have some striking differences. Salt is a mineral, an inorganic compound; electrically charged; and impervious to temperature changes. Sugars (there are various kinds) are from plants, are organic compounds (with a carbon skeleton), are not electrically charged, and become more soluble (more can be dissolved in water) the warmer it is. But in food, they do more or less the same thing, which is to sop up water like molecular Bounty; this dramatically lowers the concentration of free H2O molecules around any microbial spoilers or pathogens, which then have their insides sucked dry by the same cell-wall osmosis on which they depend for water and nutrients. Sugar and salt can also break the bonds holding together bacterial enzymes, which inactivates them, and even unravel their DNA.

But that’s just the beginning. If there are two more successful multitaskers, they have yet to be discovered. Both make major contributions to texture. Salt extracts water, creating denser solids that have a satisfying snap. Ordinary sugar (sucrose, from sugarcane or sugar beets) plays both sides, endowing either plasticity or crunch (for example, soft cookie vs. hard), depending on its type, water activity, and processing technique, and increases the viscosity of liquids, giving body and mouthfeel to soft drinks and syrups. Sugar is a consummate makeup artist, touching baked goods with gold and adding shimmer to sauces and glazes. The duo also enable secondary flavors, from sugar’s participation in the world-famous Maillard reaction to salt’s willingness to act as a shill for the artificial flavors in and dusted on chips, nuts, pretzels, popcorn, and extruded snacks. And both are key to activating or controlling important processing techniques such as leavening, marinating, pickling, and freezing. Is it any wonder that when the food industry needs a preservative, humectant, volumizer, bulking agent, dispersant, or color stabilizer, it most often chooses these two natural chemicals that taste good, are dirt cheap, and have a history of safe use that dates back to before the birth of Jesus, Mohammed, or any other major-religion-founding prophet?

Back in Minnesota, Ted Labuza helped his industry colleagues develop and patent IMFs for the consumer market. “The military had hired Pillsbury to make the bars,” he says. “Once they were made, the military released the information.” Former Journal of Food Science editor Daryl Lund explains why: “Information was disseminated nearly immediately because Natick had an interest in IMFs for long-term storage, combat rations, etc. The fact is that they also wanted that information out because they wanted people to develop food that would have this kind of shelf life.” Labuza gave generously of his time, consulting with Pillsbury (its 1970 Space Food Sticks may have been a little ahead of the curve; they bombed), Quaker Oats, General Mills, and other companies. From the mid-1970s to early 1980s, a barrage of energy bars hit the market, all from large companies, many of them frequent Natick partners, including General Foods, Carnation, Kellogg’s, Kraft, and Nabisco. Their target audience? “Mothers will be able to give their children sweet substitutes for candy that are highly balanced in protein, fat, sugar, and vitamins,” explains Labuza.

Meanwhile, Marcus Karel, Labuza’s former teacher, labored on in MIT’s Nutrition and Food Science Department, which, observing waning Defense Department and NASA interest, had beefed up its nutritional research and thrown itself into global public health and the crowded National Institutes of Health ring, competing against a multitude of medical researchers. But despite its change in focus, the program began to fade; it lost the spotlight among the university’s exemplary departments in annual reports, and the number of graduate students and undergraduate majors began to decline. Eventually MIT dismantled the whole operation, parceling out professors to other disciplines, retiring others, and, in an institutional sense, banishing food science to the kitchen to scour pots while inviting its former colleagues chemistry, biology, and engineering to feast in the dining room.

And what became of the army’s double-decade-long, multimillion-dollar investment in freeze-drying? The military quietly changed course and hoped no one would remember the minilibrary of dehydrated, compressed bars it envisioned each soldier would have buried in her rucksack. In the commercial market, the forlorn remnants of this once-bright technology are found in your favorite déclassé breakfast: instant coffee and cold cereal studded with nibs of dried strawberries, raspberries, or blueberries.

IF YOU COULD TRAVEL BACK IN TIME TO 1983, it’s doubtful you would have picked three underachieving or unemployed Frisco runners (a biophysics and medical physics Ph.D., a recently fired track and field coach, and a nutrition student) as the trio who would allay one of the nation’s most deep-seated anxieties, that of engaging in everyday activities without a stockpile of snacks. That year, Brian Maxwell, who six years earlier had blazed to third place in the Boston Marathon, hit the wall in a small race. The wall is the point around the twenty-mile mark when the little orange light starts to glow on the gas gauge: the body has used up all its available blood glucose, stored glycogen, and fats (fats are even less available during intense physical activity because muscles must tie up much of their oxygen to metabolize them). Doubled over in pain, he completed the course, but lost his position as a front-runner. A die-hard competitor, Maxwell vowed never again. He set about creating a lightweight, nutritionally balanced snack that would restore lost vitamins, minerals, and amino acids and provide the perfect punch to finish the race.

Over the next few years, Maxwell, Bill Vaughan, and Jennifer Biddulph, Maxwell’s future wife and a nutrition student, munched on enough not-quite-right energy bars to last several lifetimes. The process was fetchingly homespun: They concocted batch after batch in Maxwell’s kitchen, combining oat bran, corn syrup, maltodextrin (a dry sweet thickener), milk protein, and peanut and sesame butters. Then they would field-test—Maxwell running and occasionally munching; Vaughan trailing him on a bicycle. Following that, they would shelf test by swaddling their creations in Saran wrap and dumping them on Maxwell’s sunny dashboard. The results were not pretty. After a couple of weeks, the bars moldered, emitting an arresting barnyard perfume.

Vaughan, the science guy, came to the rescue. He had originally met Maxwell through a consulting gig at Protein Research, a company that develops and formulates nutritional supplements, and says of their relationship: “I was the chef and he was the cook.” But while he had studied nutrition as part of his Ph.D. program, Vaughan was no food technologist. To solve the case of the odiferous confections, he turned to the Bioscience and Natural Resources Library at Berkeley, where a knowledgeable librarian, Norma Kobzina, who has since died, helped him find resources on controlling water activity. “Berkeley probably didn’t subscribe to Food Science and Technology Abstracts [FSTA, an index that’s the be-all and end-all of food technology research], but it would have been available through Dialog, and my guess is that Norma would have done a Dialog search,” said Axel Borg, another career University of California librarian.

That search would have provided a road map on how to make moist and chewy food bars that could be stored at room temperature, part of the large body of scientific literature, most of it generated by academics, on which food companies rely to develop new products. If you do an FSTA search of water activity in moist foods between the years of 1970 and 1985, you get 101 results: articles, theses, conference proceedings, and patents. One author appears on more than one-seventh of these and on almost all the important ones—“The Effect of Water Activity on Reaction Kinetics of Food Deterioration,” “Effect of Temperature on the Moisture Sorption Isotherms and Water Activity Shift of Two Dehydrated Foods,” and “Prediction of Water Activity Lowering Ability of Food Humectants at High aw”—Ted Labuza. There’s nothing odd about that, of course. Every field has its experts; they get that way because their topics have intrinsic value, and the marketplace of ideas ensures that the cream rises to the top.

Vaughan now had the answer to his problem. He rejiggered the recipe, adding fructose to bind up more of the “free” water, bringing the water activity to below 0.85, which made the confection inhospitable to mold and the proteins in it less prone to enzymatic deterioration. He also dosed it with branched-chain amino acids, especially leucine, after reading an unpublished paper, “The Effects of Submaximal Exercise on Whole Body Leucine Metabolism,” that found that endurance sports increased the consumption of this vital amino acid throughout the body. This became the first formulation for the PowerBar, which went into production in 1986. Buoyed by Brian’s indefatigable appetite for racing—he was, in Vaughan’s less than complimentary term, “a grinder”—and instinctive grasp of grassroots promotion, such as handouts at the local 10K, the company’s sales had ballooned to $150 million by 2000, the year it was sold to Nestlé for $375 million. Today the energy bar category is so crowded it commands its own shelf at the supermarket: Clif, Balance, LUNA, Atkins, and Odwalla are eagerly joined by the oldie-but-goodie conglomerates—primarily ready-to-eat cereal manufacturers, such as Kellogg’s, Nature Valley (General Mills), and Nabisco.

It’s a heartwarming tale: a couple of penniless but inspired entrepreneurs reenacting our favorite bootstrapping business origin story—the one that proves our meritocratic capitalistic system works, by gum. Their product, vindicated by multidigit sales growth, the result of old-fashioned American ingenuity—with maybe a kindly librarian thrown in for good measure.

Vaughan is vehement that the PowerBar has no military heritage and, in fact, rather testily characterizes army research as “pissing down a dark well.” He continued to disavow any connection in response to several of my follow-up questions:

Did any of the scientific and technical resources he used have military origins?

“[The bioscience library] did not collect army/navy research documents. That would have been in the documents library, which was not a place to go to search for state-of-the-art nutritional information.”

Was he aware of the Natick Center?

“Not in the least.”

Did he know that the military had been researching food/energy bars since the early 1960s, including having had contracts with Pillsbury and General Foods to develop, among other things, cereal-based bars?

“Never heard of it.”

How about intermediate-moisture foods?

“Never heard of it.”*

Vaughan is not alone in his ignorance of the influence of military-funded or -orchestrated research on the very products he developed for the commercial market. Many food technologists don’t even know what the Natick Center (or its predecessor organizations) is, let alone have an awareness of how it has steered the science and technology behind their industry. To understand its importance requires serious detective work. Identifying first-generation impacts—papers, patents, products—is hard enough, as credit is most often claimed by collaborators. But after that, the path to understanding why a particular scientific or technological direction was taken is long, circuitous, and dim, and must be cobbled together from article footnotes, long-ago meetings, professional relationships, and CVs. Unsurprisingly, military provenance often vanishes.