Combat-Ready Kitchen: How the U.S. Military Shapes the Way You Eat (2015)

Chapter 9

A LOAF OF EXTENDED-LIFE BREAD, A HUNK OF PROCESSED CHEESE, AND THOU

LUNCH BOX ITEMS #3, #4, AND #5: SLICED BREAD, PROCESSED CHEESE, AND CHEESY CRACKERS

The only time I’ve truly enjoyed bread was during the four years I lived in Ecuador in the 1990s. My future husband and I bought baguettes daily at the panadería near our apartment. Every morning we took turns slicing them lengthwise, spreading the pieces (me, sloppily; him, carefully) with a locally made American-style peanut butter (my taste of home) and mermelada de guayaba (his), and serving it to the other in bed with café con leche. It left our mattress on the floor full of crumbs, but it tasted heavenly—sweet, gooey, and crunchy—and was so satisfying that neither of us would eat again until we met for almuerzo, hearty lunch, or at home after a long day of editing (me, in English; him, in Spanish).

But that was just an interlude. My primary relationship, like that of most Americans, has been with the spongy white or wheat slices stacked in a twist-tie-closed plastic bag. They’ve been there for as long as I can remember—or rather, can’t remember, for aside from my maternal grandmother’s almost blackened, buttery toast and the limp American cheese sandwiches my mother made for car trips, I can barely bring into focus the bread from my childhood. Those same whole wheat, twelve-grain, oatmeal, or white loaves are now the staple in our house, residing in a wood-lined, aluminum 1950s bread box. The contents of the packages never seem to stale—although they occasionally mold after a couple of weeks—just dwindle to a pair of sad-sack heels. In the almost two decades we’ve been raising children, I’ve never seen any family member grab a piece and just eat it, whether for pleasure or hunger. And I certainly can’t imagine a young couple, both alone and far from home, turning the square slices into silent declarations of love. Today our bread is most noticeable when absent. “Could you pick up a loaf when you’re at the market?”

WHETHER BREAD HAS BEEN A NET GAIN OR LOSS for humanity is open to debate; that it has been a boon for dentistry is not. Ever since sinking our teeth into the first loaf—pounded, macerated emmer seeds, a type of early wheat, mixed with water and baked—our oral health has gone downhill. Egyptians, who invented the foodstuff, had the worst teeth ever: ground to stumps by debris-laden, artisanally milled grains; pocked with cavities from bacterial infestation; and marked by deep abscesses, often to the bone, from gum disease. Nonetheless, the North Africans loved their carbs, and with more than forty different types, were never at a loss for their daily—or hourly—bread.

Like those of other ancient civilizations, early Egyptian baked goods were a testament to man’s desperation to make something palatable from the unremitting monotony of a grain-based diet. Still, after all that tiresome Little Red Hen activity (planting, tending, harvesting, threshing, grinding, mixing, and cooking), the result was flat and tough. It would take the accidental discovery of yeast to turn the foodstuff from frumpy to fabulous. Many food archaeologists believe that the uplifting impact of the microbes might have been discovered when a bowl of mash was left to sit overnight, but equally likely, given our propensity for partying, is that its origin was linked to the preparation of bread’s naughty fraternal twin, beer.

Thus was born the protobaguette. The leavened loaf, the original convenience food—no pots, no plates, and no utensils needed—debuted about six thousand years ago. Wheat flour, for which today we use red, white, and hard varieties rather than antiquity’s less pliable emmer, has unique properties stemming from its extraspecial protein, the much-maligned gluten. (Celiac disease affects an estimated 1 percent of the population. It may have been a human adaptation to maximize host nutrition from cereal foods while infested with worms.)1 Gluten macromolecules—one of whose components, glutenin, is among the largest protein molecules in nature—when mixed and kneaded, begin to snag on one another and form lengthy, cross-linked chains. The protein strands are studded with starch molecules, long, branched spirals with a hollow at the center, which easily absorb water, becoming a viscous gel. Gluten gives bread its sponginess; starch, its silkiness. But all this would be just a bland, leaden mass if it weren’t for effervescent yeast.

Yeast, like many of its microbial brethren, is a switch-hitter, changing metabolic pathways at the drop of a hat. There’s the efficient but cumbersome aerobic pathway, supplying a hefty payload of thirty-one energy-giving ATP molecules, which occurs when there’s plenty of oxygen and food, sugar or its sumo-weight cousin starch. Both the sugar and starch molecules are built of the same basic unit, a chain or ring of carbon festooned with oxygen and hydrogen, but sugar is the size equivalent of a single-family home, while starch can be anything from a large apartment complex to a good-size city. The by-products of this route are—bubble, bubble, bubble—carbon dioxide and water. And then there’s the less-efficient but quick ’n’ easy anaerobic pathway, delivering only two ATP, which occurs in oxygen-limited environments, again with an ample food source; its by-products are fizzy carbon dioxide and ethanol.
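For readers who like to see the ledger, here is a minimal sketch in Python of the trade-off just described. The ATP figures are the ones quoted above; the per-glucose by-product counts (six carbon dioxide and six water molecules for respiration, two carbon dioxide and two ethanol for fermentation) are standard textbook stoichiometry, not anything particular to bread yeast.

```python
# Back-of-the-envelope comparison of yeast's two metabolic pathways,
# using the ATP figures quoted in the text and textbook stoichiometry
# for the by-products of each glucose molecule consumed.
PATHWAYS = {
    # name: (ATP yield per glucose, by-products per glucose)
    "aerobic respiration": (31, {"CO2": 6, "H2O": 6}),
    "fermentation":        (2,  {"CO2": 2, "ethanol": 2}),
}

def energy_budget(pathway: str, glucose: int) -> None:
    """Print the ATP and by-products from sending `glucose` molecules down a pathway."""
    atp, products = PATHWAYS[pathway]
    yields = ", ".join(f"{glucose * n} {mol}" for mol, n in products.items())
    print(f"{pathway}: {glucose * atp} ATP, {yields}")

for route in PATHWAYS:
    energy_budget(route, glucose=100)
# aerobic respiration: 3100 ATP, 600 CO2, 600 H2O
# fermentation: 200 ATP, 200 CO2, 200 ethanol
```

Fifteen times the energy from the same mouthful of sugar: that is why the microbe respires when it can and ferments only when it must.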

The latter pathway, called fermentation, has proved so valuable to us that it’s been the motive for the capture and forced labor of a particularly sweet-toothed yeast species, Saccharomyces cerevisiae. A classic sycophant, S. cerevisiae is all smiles with its master (us) and a ruthless killer with its microbe peers, using aerobic metabolism to reproduce rapidly, crowding out competitors, and anaerobic metabolism to poison them with toxic levels (to other microorganisms, not to us—at least in moderation) of its waste, alcohol. In beer, both by-products remain in the liquid, while in bread, the hooch is removed, first by venting the dough with a punch and then through evaporation during cooking. Of course, this dominance strategy is double-edged. Once the alcohol level has reached a certain percentage, it kills the yeast itself.

Although tradesmen initially had no clue how fermentation worked—was it magic, spontaneous generation, or something in the air?—for centuries breweries and bakeries had a symbiotic relationship based on their essential ingredient, with the bread makers exchanging cash for used mash to employ as starter. Meanwhile, while not exactly closing in on the cause, beer makers and scientists got a lot better at separating the element that was causing all the hubbub. In the late 1600s, Antoni van Leeuwenhoek vastly improved the microscope, a magnifying glass used in his day job as a cloth merchant to inspect fabric, by figuring out how to make very small glass spheres that could then be ground into powerful lenses. (This paved the way for the development of the modern compound microscope.) With these, he was able to spy on the doings of bacteria and protozoa—stunning the world with the news that there were “animalcules” all around us—as well as to see tiny inert spheres (yeast) in beer. In 1838 Charles Cagniard de la Tour, a French baron and tireless dabbler, postulated that yeast might be a plant after observing reproduction by budding and carbon dioxide bubbles. But until Louis Pasteur—whom you can also thank for the principles of food sterilization and vaccination—showed conclusively in 1858 that fermentation was accomplished through these tiny balls, that they were alive, and that oxygen accelerated their growth but inhibited fermentation, yeast manufacturing didn’t really take off.

As is par for the course with us humans, it wasn’t for an entirely high-minded reason. An improved continuous whisky still, patented in 1830, had been invented by Aeneas Coffey (an Irishman, of course). Distillation works because alcohol has a lower boiling point than water. When you heat a liquid composed of both to 173°F, the alcohol boils out and can be captured as vapor. But a mixture of liquids behaves as a system, altering the boiling point for all, so this vapor contains quite a bit of water. In a continuous still, the vapor travels through a series of chambers at progressively lower temperatures, removing more water at every step. The new invention ran tubing through not one but two high columns, eliminating the need for a midprocess transfer of the liquid and upping the potency of the final product. Suddenly it was possible to make a tidy profit by dumping industrially produced yeast into a mash of cheap grain (or even tubers; hats off to the vodka belt of eastern Europe).
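The chamber-by-chamber logic of a column still can be sketched in a few lines of Python. The model below is deliberately crude: it assumes a constant relative volatility for ethanol over water (the value of 2.3 is chosen purely for illustration) and ignores the azeotrope that caps real-world distillation at about 96 percent alcohol. Still, it shows how each chamber hands a richer vapor to the next.

```python
# Toy model of staged enrichment in a continuous still. Assumes a
# constant relative volatility (ALPHA) for ethanol over water; real
# mixtures deviate from this and hit an azeotrope near 96% alcohol.
ALPHA = 2.3  # illustrative value, not a measured constant

def vapor_fraction(x_liquid: float) -> float:
    """Ethanol mole fraction in the vapor above a liquid of fraction x_liquid."""
    return ALPHA * x_liquid / (1 + (ALPHA - 1) * x_liquid)

x = 0.05  # a weak wash: 5 mol% ethanol
for stage in range(1, 7):
    x = vapor_fraction(x)  # the vapor from each chamber feeds the next
    print(f"stage {stage}: {100 * x:.1f} mol% ethanol")
# stage 1: 10.8 ... stage 6: 88.6 -- each chamber hands a richer
# vapor to the next, which is the whole point of Coffey's design.
```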

Within a few years, Jacques van Marken founded the first “yeast and methylated spirits” factory in Delft, the Netherlands, and the Fleischmann brothers set up an eponymous compressed-yeast-and-gin operation in Cincinnati, Ohio. These days, you can still find the cheery yellow-and-red packets in the baking aisle and the slightly seedy-looking bottle—the brand is not high on the price-point scale—at the liquor store. Both Fleischmann businesses prospered, and rivals such as Red Star, which began as Meadow Springs Distillery in 1882—almost always linking bread and booze—were founded. By the early twentieth century, most bakers relied on commercial starters obtained from yeast makers/distillers, which were now sold in damp, centrifuged cakes. The live yeast didn’t travel well, however, perishing within ten days even when refrigerated. If companies wanted to expand their market areas, they had to build local production centers. By the early 1940s, Fleischmann’s had seven plants in the continental United States, two in Canada, and three in Latin America. (Rotgut gin, of course, is perfectly fine at room temperature and needed no such pampering.)

The clunky regional yeast distribution system may have worked all right during peacetime, but it was woefully inadequate for World War I soldiers, who expected a daily sixteen-ounce ration of fresh white “American” bread—supplied by a special Quartermaster Corps bakery company—with their meals when stationed abroad. This problem turned into a full-fledged nightmare during World War II, with its unprecedented number of enlisted men and women to feed (not to mention the occasional “untimely appearance of shell fragments in the dough”).2 Shipping live compressed yeast to all corners of the globe was nigh impossible; local supplies were sometimes nonexistent or hard to come by—in Europe, for example, the Italians, Belgians, and Luxembourgers shared their stores, but the French, hoarding for their beloved baguettes, at first balked. It was time for the Subsistence Research Laboratory to work miracles by figuring out how to induce a state of suspended animation that could be easily lifted by bakers half a world away, months into the future, and in extreme climatic conditions.

The goal was to dry out the yeast but preserve all of its essential structures—cell wall, organelles, DNA/RNA, and enzymes—so that later it could be rehydrated and put to work. A few companies, notably Northwestern Yeast Company, did manufacture dried yeast, but it took almost a day to activate and didn’t have the six- to eight-month shelf life required by the army. Fleischmann’s, of course, was asked to work on the problem, as were other yeast companies, such as Red Star Yeast (the brand is now owned by Lesaffre Yeast Corporation and Archer Daniels Midland), and universities. Because there were no theories as to what might induce boundless hebetude in the microbes, the researchers followed the hallowed “cook and look” protocol, fooling around until something worked. “It will not be possible to describe the vast numbers of experiments that were tried,” noted the Quartermaster Corps in a publication on the topic a decade later.3 Different strains of the fungus were tested out. Variables were changed—more heat, less time; less time, more heat. Even the army’s sensation du jour, freeze-drying, was given a whirl; the only thing it succeeded at was an 80 percent kill rate. Eventually, they hit upon the answer: grow the yeast in a relatively nitrogen-poor environment, extrude it in “spaghetti” strips, and then expose it to a stream of warm, dry air for six to eight hours, which reduces its moisture content to 8 percent (that of the compressed cakes was 70 percent).
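The arithmetic of that drying step is worth a moment. A minimal mass-balance sketch in Python, assuming the moisture figures quoted above are on a wet basis (water as a share of total weight), shows why the army liked the result: two-thirds of every kilogram of compressed cake was water it no longer had to ship.

```python
# Mass balance for the drying step, assuming the moisture figures in
# the text (70% for compressed cake, 8% for dried yeast) are wet-basis,
# i.e., water as a share of total weight.
def dried_mass(wet_mass_kg: float, m_initial: float, m_final: float) -> float:
    """Mass left after drying from m_initial to m_final moisture (wet basis)."""
    solids = wet_mass_kg * (1 - m_initial)  # dry matter is conserved
    return solids / (1 - m_final)           # add back the residual water

cake = 1.0  # one kilogram of compressed yeast cake
remaining = dried_mass(cake, m_initial=0.70, m_final=0.08)
print(f"{remaining:.2f} kg dried yeast; {cake - remaining:.2f} kg water driven off")
# 0.33 kg dried yeast; 0.67 kg water driven off
```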

It would take another forty years before scientists would understand why this worked. The secret was a substance called trehalose, a sugar molecule—experiments at the time found yeast to contain up to 18 percent trehalose. It had been assumed that trehalose was another carbohydrate storage molecule, but in the late 1980s scientists began to elucidate a protective function. Under environmental stress, yeast (and many other organisms) increases trehalose production, especially around the cell membrane, where water molecules attach to the sugar, forming an insulating layer. This allows the cell to stay limber despite heat, cold, drying, and other insults. Active dry yeast stored in foil packets—their inhabitants’ long dormancy sheltered by trehalose—was supplied to garrison and field kitchens from 1944 until Victory Day in 1945, bringing the homey smell and taste of fresh-out-of-the-oven bread to millions of soldiers.

THAT “HOMEMADE” LOAF would soon all but disappear. Both world wars contributed to its demise. During the first, stubborn consumers were ordered to buy factory-made bread, because group preparation of food minimized use of fuel and other resources that could then be allocated to the war effort. In the second, food scarcity, vitamin enrichment, and low prices increased consumption of store-bought loaves by almost half. When the troops returned home in 1945 and 1946, rather than scale back now that their military buyers had disappeared, companies focused their marketing efforts on the consumer. Overburdened housewives—and what housewife isn’t?—were happy to oblige.

America’s appetite for bread—the whiter and fluffier, the better—was never heartier; in the 1950s, the food accounted for almost a third of people’s daily calories. Many parts of its manufacture were now mechanized and new bulk dough-making methods adopted, but bread, with its need for individual batches and long periods of rest, as well as its aversion to being either pumped or pressurized, was a culinary throwback—the antithesis of the modern factory and its implacable production line. Tinkerers began tinkering, and in the mid-1950s a new equipment design appeared, the Wallace & Tiernan Do-Maker, which eliminated the whole pesky fermentation business altogether, along with the need for the human touch.

Instead of mixing individual bowls of dough, a pool of perma-yeast—a slurry of yeast, yeast food (mineral salts and enzymes that help break down complex sugars into simple ones), and water—was created and continuously squirted into the flour. This blend was then briefly but violently mixed—from three to five minutes—shaped into loaves, left for a short while, and then cooked. Total time for the microbes to do their magic: fifty minutes of final “proofing” right before the loaf was placed on a conveyor belt and carried into the oven. Compare that to the four to six hours of rising that occurred with earlier factory bread-making methods, and to the twelve to sixteen hours in traditional bakeries.*

A few things were lost in this new process—namely flavor, aroma, and texture—all of which were attributable to a long slow rise, according to none other than the army. “The normal ingredients of bread … are all mild in flavor, as is a freshly mixed dough. The tremendous complex of enzymatic reactions during fermentation gives rise to the formation of many new substances sufficiently volatile to produce olfactory stimuli. The actual baking process, in the course of which crust is formed at a temperature which may reach 150°C, while the loaf interior approaches the boiling temperature of water, engenders many new reaction products which contribute greatly to flavor.”4

The textural issues in mechanically developed dough were even worse. First, the gluten molecules reacted poorly to the savage beating they received in “mixing,” breaking some of the bonds between strands, which contributed to a cakier texture and lower height. To mimic the complex network of proteins formed during traditional rising-kneading-rising, bakers switched to higher-protein wheat varieties, added gluten during milling, and sometimes even during dough preparation. (In fact, there’s a whole industry built on the production of “vital wheat gluten,” which is also used in vegetarian meat substitutes, pet food, and, increasingly, as a binder, filler, or protein fortification in other food products.) Second, the bread didn’t rise enough. This was due in part to the drastically shortened proofing period, which didn’t give naturally occurring enzymes as much time to break down the starch. It was also because of a historic shift in flour composition. In preindustrial farming, sheaves of wheat were left in the field to cure. Some of the grains began to germinate, which increases the presence of enzymes, in both the grain and the milled flour. To address these deficiencies, industrial bread makers added malted barley—malting is when you let seeds germinate before drying them—to increase the enzyme content of the flour.

Enzymes are monomaniacs, proteins on the prowl for a particular substance or substances; when they find it, they facilitate one specific chemical reaction—over and over and over again. This may not sound like much, but in fact they speed up regular organic chemistry by factors of millions, billions, or more. All living cells have enzymes, and organisms from different kingdoms can have the same (or a very similar) enzyme; for example, amylase, which breaks up starch molecules, is found in fungi, bacteria, plants, animals, and your mouth and pancreas. In traditional bread, there are two sources of amylase enzymes: the wheat and the yeast. Both snip up starch molecules into the short sugar chains that nourish the yeast that excretes the carbon dioxide that makes the bread rise.
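What “millions or billions of times faster” means in wall-clock terms is easy to underestimate, so here is a small Python sketch. The thousand-year half-life and the enhancement factors are round illustrative numbers, not measurements of amylase itself.

```python
# What "millions or billions of times faster" means in wall-clock terms.
# The thousand-year half-life and the enhancement factors are round
# illustrative numbers, not measurements of amylase itself.
SECONDS_PER_YEAR = 3.156e7

def catalyzed_halflife_s(uncatalyzed_years: float, enhancement: float) -> float:
    """Half-life in seconds once an enzyme speeds the reaction enhancement-fold."""
    return uncatalyzed_years * SECONDS_PER_YEAR / enhancement

for factor in (1e6, 1e9):
    print(f"{factor:.0e}-fold speedup: {catalyzed_halflife_s(1_000, factor):,.0f} s")
# 1e+06-fold speedup: 31,560 s (about nine hours)
# 1e+09-fold speedup: 32 s
```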

Amylase was the first enzyme ever to be isolated, from malted barley in 1833. This was no accident; the protein was vitally important to three large industries: as a component of malted wheat or barley flour, it was the principal way breweries, distilleries, and bakeries broke down starch into sugar, feeding the yeast that made their products. In Japan, sake was brewed using the same enzyme—but from a mold, Aspergillus oryzae. Thought to have originated in China two to three thousand years ago, koji, as the fungus is called, was used and sold commercially in Japan from the thirteenth century or so on. In 1894 Jokichi Takamine, an immigrant taking inspiration from the libations of his native country, received the first U.S. biotech patent for the industrial production of fungal alpha-amylase. His hope was that American distillers would take to the more powerful starch digester. They didn’t, but he did manage to license his process to Parke, Davis & Company, a Detroit pharmaceutical company, which merchandised it as a treatment for dyspepsia. (With these riches and those from a later venture involving adrenaline, Takamine purchased Washington, D.C.’s famous cherry trees.) Later improvements in enzyme production technology, in particular moving from time-and-space-consuming surface to submerged fermentation, and the increasingly widespread use of bacterial amylases (which tolerate neutral and alkaline pH) in textile and paper businesses to clean up residual starch, paved the way for a lateral hop to the bakery industry.

Army research into preserving bread began in earnest during World War II, when soldier complaints about hardtack and crackers that accompanied battle rations reached a crescendo, but the technical difficulties were so great that nothing was fielded. By the early 1950s, this endeavor was going full bore: of the forty grain and cereal research projects listed in the 1952-53 Survey of Food and Nutrition Research in the United States of America, eleven were related to the Quartermaster Corps’ goal of producing shelf-stable bread. (Another nineteen were devoted to the army’s other major area of interest, the development of baking mixes with dried flour, leavening, and other ingredients; these gave rise to commercial quick bread, muffin, and cake mixes.) A phalanx of researchers at the Quartermaster Food and Container Institute for the Armed Forces, other government laboratories, universities, and food companies around the country were enlisted to study bread flavor, browning, molding, and staling.

A novel idea to keep bread fresh was proposed by a small laboratory founded by the biochemist James S. Wallerstein, a frequent wartime Quartermaster Corps collaborator and the scion of a family that owned an early twentieth-century malt- and hops-processing business. “Widespread use of the recent important development of canned bread is limited to a large extent by the staling of the bread in the cans. Although such canned bread may remain fresh for some time, it firms up eventually and becomes stale in the can. I have found that bread, baked by my process in which the dough contains heat-stable amylolytic enzymes [bacterial amylases], can be canned and still not undergo this crumb staling or firming which heretofore has prevented the widespread use of this canning process.”5 In the early 1950s the Department of Grain Industry at Kansas State College (later University) was tapped to further experiment with how fungal and bacterial enzymes might be applied to baked goods to prolong freshness. (The Fleischmann Laboratories also did some work on bacterial amylases in baked goods and published a paper on the topic in 1953.) The Kansas State team included Max Milner, who had spent the war years at Pillsbury working on rations; John Johnson, a baking technologist; Byron Miller, a chemist and a World War II veteran; and several others. In 1955 Johnson published “Fungal Enzymes in Baking” in Baker’s Digest. A couple of years later, Johnson and a colleague produced a special report on one of their Quartermaster Corps contracts, “Determination of the Feasibility of Producing Non-Staling Bread-like Products,” in which they used both a fatty acid (these molecular chains occur naturally or can be synthesized; they are essential components of fats) and a bacterial amylase to produce a three-day-old bread with a “60% softer crumb,” although they doubted that very long-term storage—two to four weeks—would be feasible. All alpha-amylases not only increase the sugar available for yeast fermentation but also help to soften crumb and increase volume. Fungal amylases, however, like their source organism, prefer it cool and are inactivated by cooking; bacterial alpha-amylase tolerates heat and a portion will continue to function after baking, keeping the bread texture soft for days or weeks. Fifty years later, Kansas State University boasts of this discovery as a “major research [contribution] … to the field of grain science and grain processing.”

In 1953, B.S. Miller, J.A. Johnson and D.L. Palmer published a journal article in which they showed that bacterial alpha-amylase was a potent inhibitor of crumb firming in bread, although their source of enzyme also caused the bread crumb to be sticky. That work confirmed a claim by S.S. Jackel and coworkers at the 1952 AACC [American Association of Cereal Chemists] National Meeting that bacterial amylase could be used to retard the staling of bread. Today, so-called maltogenic alpha-amylase is used to inhibit bread firming for weeks, resulting in huge savings in the cost of delivering bread to the marketplace.6

As is always the case, before a scientific breakthrough could become a lucrative business, a number of technical issues needed to be resolved—in this instance, not new and better machines, but new and better laboratory processes. There were three important ones for industrial enzyme production. The first, which dates back to the late 1940s, was microfiltration, or ultrafiltration, membranes. Originally developed to test drinking water in an army-funded project at the California Institute of Technology to copy German technology, the filters were extended to biotech applications in the 1960s and 1970s. The second was cell immobilization, the most common technique of which is implanting the cell in a sticky substrate, a process invented in the early 1960s. The final innovation was genetic engineering; the first successful recombinant DNA and transgenic organisms were created in the early 1970s. By the 1980s the stars had aligned, and the biotechnology industry took off—and with it industrial enzyme production for foods and beverages. In 1990 the first genetically modified enzyme—source organism: Bacillus stearothermophilus from an Icelandic hot spring—to prolong the freshness and increase the softness of bakery products without their becoming gummy was launched.

Today almost all supermarket breads are conditioned with microbial enzymes, especially bacterial ones, which soften texture, increase volume, add color, and extend shelf life by one or two weeks. This was a godsend for industrial bakers. Since the 1960s consumers have been increasingly suspicious of chemical additives and governments stricter in their regulation of ingredients. Because enzymes are a “processing aid” that leaves virtually no residue, they are considered “clean label,” meaning they are not required to be listed on the block of print on the packaging. Besides all sorts of bakery items, enzymes are used to make sugar syrups, including the much-vilified high-fructose corn syrup, as well as to filter juices, clarify alcoholic beverages, speed ripening of and add flavor to cheese, create lactose-free milk, and tenderize meat. In the 1990s a market for these biotech ingredients emerged, dominated then and now by Danish giant Novozymes. The global enzyme business currently earns more than $5 billion annually, of which roughly a third is for food and beverage additives.

All these changes may be a recipe for ill health. The modern loaf has more gluten to withstand mechanical dough development, more yeast to compensate for the lack of rising time, and a less developed fermentation than the loaves of the past. Exogenous enzymes make up the difference. Recent spikes in autoimmune disorders such as Crohn’s and intestinal dysfunctions such as celiac disease (CD) appear linked to changes in bread. The authors of a global overview of Crohn’s pose this question: “What unites Canterbury in New Zealand, Nova Scotia and Manitoba in Canada, Amiens in France, Maastricht in the Netherlands, Stockholm in Sweden, and Minnesota in the US … ?” Here, perhaps, is a clue: sufferers have antibodies to Saccharomyces cerevisiae, baker’s yeast (although the microbes are inactivated by cooking). And here’s another: these are some of the places where people eat the most industrial bread, according to Euromonitor International, a market research company.

Likewise, there has been a worldwide jump in CD. Because this is a reaction to wheat, it’s not surprising that countries where wheat is a staple have high rates of the illness; what is surprising is that its occurrence has increased over time. The authors of a 2013 worldwide review of the condition note that “an ‘epidemic’ of CD was described in Sweden from 1985 to 1995, possibly related to a doubling of gluten content in baby food at that time.”7 Some of the natural enzymes in yeast break down flour proteins—in particular gliadin, the component of gluten associated with CD. But with their abbreviated rising times and added wheat protein, finished loaves undoubtedly now have more gluten—and thus gliadin—than ever before. Could the increase in prevalence of CD be related to the higher gluten content of industrial breads?

We eat it for breakfast, we eat it for lunch, very occasionally we eat it right out of the bag, but does this stuff—immature dough whipped up with air and inoculated with exogenous enzymes—really deserve the name bread at all, one of the holy trinity of fermented foods, along with beer/wine and cheese? To step into that particular breach, we have the French, with their indefatigable knack for proclamations, whose 1993 Décret Pain, among other edicts (no freezing, no additives), orders that real bread be composed of a dough “fermented using baker’s yeast [S. cerevisiae].” By this standard, all those neatly packaged loaves in the supermarket—the farmhouse white, the 100 percent whole wheat, the multigrain—which have risen only fifty minutes, are not bread. What are they? In the words of the army-funded contractors whose 1950s research on enzymes helped to create them, they are “non-staling bread-like products.”

THERE’S A REASON WHY BAKERS ARE BLEARY-EYED; their wares need to be replenished daily. The traditional loaf has a lopsided relationship with time. On the front end, the dough lolls about for hours or even a day. On the back end, it must be eaten quickly or ends up stale. For manufacturers, the trick to profitability is to shorten the first period and extend the second (not for your benefit but theirs—to reduce unsalable product). For the military, this became a necessity in 1991, the year the army switched from canned to plastic-pouch-encased rations.

Beginning in the Korean War and continuing through Vietnam, the Quartermaster Corps had finally produced a tinned bread—never well liked—to accompany the combat ration. The trick had been the rigid metal cylinder, which acted like a minioven, allowing the dough to expand during processing, and kept it from being crushed afterward. The new multilayer plastic and foil pouches offered no such protection, transmitting the pressure of the surrounding water during processing and crushing the delicate carbon dioxide-riddled gluten network—bread is, technically, a foam—into a dense, inedible mass. Cooking in the cans had also eliminated bacterial contamination, because they were sealed before cooling enough to support a beachhead invasion from new microorganisms. Suddenly, everything the army had laboriously developed in the years during and after World War II to create a palatable slice or two had to be unceremoniously scrapped.

Foreseeing this day, Natick had been working since the mid-1980s on an extended-shelf-life pouch bread, among other things, using a new technique called hurdle technology, which had been developed for the German army for foods that aren’t heat-sterilized. The approach combines multiple mild barriers to microbial growth, such as lowered water activity, raised acidity, and chemical reactivity, with chilling, gentle heat treatment, and other factors. While the bread Natick developed wasn’t going to fell recruits with botulism, it still did what old bread always did: the crust got soft, the crumb got hard, and that delicious olfactory advertisement, its yeasty, caramelized aroma, dissipated.

Understanding the physical and chemical changes bread undergoes as it ages is a problem that has bedeviled scientists for well over a century. From a 1940 article in Cereal Chemistry: “Present knowledge as to the nature of the staling process is inadequate, and further strictly controlled scientific researches are necessary to determine the nature of this process before looking for substances that will prevent it.” From a 1981 review in Cereal Chemistry: “The consensus among the various workers and studies still appears to be that changes in starch play the major role in bread firmness. Bread staling, however, is an extremely complex phenomenon and is difficult to define in straightforward terms.” From a 2003 overview in Comprehensive Reviews in Food Science and Food Safety: “The molecular basis of staling is examined… . The conclusion reached is that bread staling is a complex phenomenon in which multiple mechanisms operate… . The key hindrance to development of a preventive strategy for bread staling is the failure to understand the mechanism of the process.”

Amazingly little is known for sure. Many factors contribute to the staling of bread, including the redistribution of water between the gluten matrix and the starch. But most agree that it has to do primarily with changes to the two starch molecules, amylopectin (70-80 percent of wheat starch) and amylose (20-30 percent). In the uncooked starch granule, the two kinds of molecules form helices, with the amylopectin packed together in repeating patterns, while the amylose remains disorganized. During baking, these granules become swollen with water, the helix loosens, and some amylose leaches out. Immediately after baking, however, the amylose hijacks some of the nearby water molecules and crystallizes in a process called retrogradation. This process redistributes the moisture in the bread and is believed to be largely complete by the time it has cooled to room temperature. The amylopectin crystallizes much more slowly, over a period of days, but perhaps because it is present in a greater proportion, its retrogradation is more noticeable, making the bread seem dry, although in fact it may have the same total moisture content as before. But that’s about where scientific agreement on staling ends.

This uncertainty has given rise to the let’s-just-throw-stuff-at-it school of bread-shelf-life science. Emulsifiers, which affect the cooking and swelling properties of starch? Go for it. Surfactants, which retard water penetration and swelling? Sure, why not? Gums and hydrocolloids, which add stability, softness, and mouthfeel? Can’t hurt. And then, of course, there are the bacterial amylases—who knows exactly why they work, but they do, so in they go. This was the approach taken by the Natick Center in trying to tackle the issue of staling in its new pouch bread. The team, consisting of Daniel Berkowitz and Lauren Oleksyk, then a recent graduate of the Framingham State food-science program, began work on the problem in the mid-1980s. “That’s what I did every day with Dan for a couple years,” says Oleksyk. “Every day I’d come in; we’d meet and look at the formulas we’d done the day before. We’d see what worked and what didn’t. Every day, we were tweaking ingredients, making up new prototypes, storing them. We always looked at texture, color, aroma, taste.”

Eventually, they hit on a technique—combining an emulsifier and a hydrocolloid, a thickening gum made of long chains of molecules that evenly disperse in liquid—that seemed to work miracles. “What we added was not done anywhere else,” explains Oleksyk. “We knew that sucrose ester was an emulsifier in a lot of products. Well, we combined that with PVP, an ingredient which although it has GRAS [generally recognized as safe] status, isn’t usually used in commercial baking. [Polyvinylpyrrolidone is a synthetic polymer that passes through the body without being digested; its typical use is as a binder and coating for pills.] We didn’t know it would have the effect it would have… . We found things that we just didn’t expect: the softness, the volume. We knew it had a water-binding effect. But we weren’t expecting it to affect nonenzymatic browning as much as it did. And whiteness. I changed the percentages so many times, I filled an entire notebook with the different formulations, probably 150 or more. Since then, we took PVP out and used other things to get the shelf life, but that was the original patent.”

In the particulars, the Natick approach to preventing bread staling was hit or miss, but it was roughly based on a new theory about how to understand food—one borrowed from polymer science. This theory viewed food not, as in the past, solely as innumerable individual chemical reactions, crucially affected by temperature, time, water, and oxygen, but also as a unified system with its own characteristics and reactions. As individual molecules have phase transitions, the temperatures at which solids change to liquids and liquids to gas, collections of disorganized but interacting molecules—called amorphous solids in chemistry terminology—have glass transitions, the temperatures at which they change from brittle to rubbery to liquid. Glass is one such system (hence the name); its components are molten at about 2,400°F; if cooled a couple of thousand degrees rather quickly, they become increasingly viscous, but the molecules never lock into place. Food and food components are others. This similarity was first noticed by candy engineers in the mid-1960s, but the concept wasn’t applied to other edibles with any regularity until the 1980s.

Although various university scientists worked on the idea, the most prolific—and vociferous—advocates were a duo at Nabisco, Louise Slade and Harry Levine. Their early careers had been at General Foods, a century-old conglomerate—General Foods was merged with Kraft in 1989 and their food businesses combined in 1995—where Slade had worked on frozen dough, and Levine on frozen desserts. Both of these foods exhibit the same changes as polymers as the temperature drops, becoming more viscous, then hardening and eventually “collapsing.” (In the other kind of solids, crystalline solids, molecules are locked into regular repeating patterns; because they’re already in their lowest energy state, they’re not affected much by chilling.) In an amorphous solid, lower temperatures reduce the energy available to hold the disparate and disorganized molecules together; eventually these bonds break to find less demanding arrangements. Perhaps it was this observation that led the pair to polymer science and one of their first published papers together, “A Food Polymer Science Approach to the Practice of Cryostabilization Technology.” They’ve since racked up several hundred and are the leading authorities in the field.

In Slade and Levine’s model, as stored food experienced time- and temperature-related physical and chemical deterioration, its glass transition temperature could change. This means that if the food item had been crisp, its glass transition temperature might drop from above to below room temperature, and it could become limp or soggy. If it had been moist, its glass transition temperature might fall still more, allowing increased mobility of its components, resulting in crystallizations. If we apply this idea to aging bread, the retrograding starch molecules capture some of the water, so less is available overall, raising the glass transition temperature and creating a hard, stale crumb. Slade and Levine also determined the glass transition temperatures of the system in the presence of a range of additives, which could then be used as needed, depending on the desired characteristics of the food. It was this idea that Berkowitz and Oleksyk put to work in their formulation for shelf-stable bread.
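To make the idea concrete, here is a minimal sketch in Python using the Gordon–Taylor equation, a standard relation in food polymer science for estimating the glass transition temperature of a mixture. The parameter values are illustrative stand-ins chosen for this example, not figures from Slade and Levine’s work, but the qualitative behavior, in which locking away a little water pushes the crumb from rubbery to glassy at room temperature, is exactly the staling story told above.

```python
# Gordon-Taylor estimate of how a crumb's glass transition temperature
# (Tg) climbs as retrograding starch locks water away. All parameter
# values are illustrative stand-ins, not Slade and Levine's numbers.
TG_STARCH = 500.0  # K, dry starch (literature values vary widely)
TG_WATER = 136.0   # K, amorphous water
K = 5.0            # Gordon-Taylor constant for the pair (assumed)

def glass_transition(w_water: float) -> float:
    """Mixture Tg in kelvin for a starch matrix with w_water mass fraction of water."""
    w_solid = 1.0 - w_water
    return (w_solid * TG_STARCH + K * w_water * TG_WATER) / (w_solid + K * w_water)

for w in (0.25, 0.20, 0.15):
    print(f"{w:.0%} free water -> Tg = {glass_transition(w) - 273.15:.0f} C")
# 25% free water -> Tg = -1 C (rubbery: a soft, fresh crumb)
# 20% free water -> Tg = 25 C (right at room temperature)
# 15% free water -> Tg = 56 C (glassy: the hard crumb of a stale loaf)
```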

However, it was still more of a concept than an application, which left too much guesswork in the lab. The Natick Center asked Pavinee Chinachoti, a University of Massachusetts food chemist, to do studies that might explain the underlying staling mechanism, allow them to measure starch gelatinization and retrogradation, and predict the effect on shelf life of different combinations of ingredients. Chinachoti worked with Natick for more than a decade, eventually editing one of the two textbooks on the topic, Bread Staling. As part of this work, Chinachoti and her Natick counterpart, Linnea Hallberg, often described the remarkable results Natick had had with its nonstaling MRE bread, as well as a whole family of related products, including the shelf-stable sandwich.

Smelling potential profits, Nabisco asked for a piece of the prize. In 1996 the company invited Natick to enter into a Cooperative Research and Development Agreement (CRADA). The blue-chip conglomerate, whose existing business was mostly crackers and cookies, with a couple of intermediate-moisture items such as granola bars, would get to go up close and personal with Natick’s invention; Natick might get some pointers on how to make its extended-life bakery items tastier, or perhaps even gain a manufacturer of the product. The group was a large one on both sides, and from Nabisco included both Slade and Levine, as well as a microbiologist, Martin Cole, now chief of the Australian national science agency’s Division of Animal, Food and Health Sciences. Says Cole, “At the heart of [the shelf-stable sandwich] was an understanding of the starch retrogradation… . Part of the CRADA was to understand the mechanism behind the Natick work, and then come up with other commercially more viable methods to prevent crystallization of starch.” Which was a polite way of saying that the army’s choice of antistaling additives—PVP and sucrose esters, later xanthan and guar gum, and still later 6 percent glycerol solution, according to Slade—might alienate customers. The Nabisco scientists came up with an alternative set of additives and went into the pilot kitchen.

It may have needed some adjusting, but the idea of the shelf-stable bread, especially with a filling, was a “platform”—food company speak for a basic recipe from which dozens of variations could be created—with legs. “The approach and the data that they had were very useful in terms of designing new products,” explains Cole. “Essentially what it did was allowed us to knock out an operating space and saved us a lot of work to understand where the edges of this thing were. Based on that we were able to put some products across a framework, different combinations of things, different moistures… . The work that Natick had done gave two things. It gave us insight into the mechanistic aspect of staling. Also, it helped position the right elements and other combinations of things that might prevent staling.” Nabisco’s prototypes included a whole range of savory and sweet bakery items that tasted “pretty good” and could remain on shelves for months at a time.

They never made it to the market. In 2000, as part of the continuing tobacco industry liability shell game, Nabisco sold all its food brands to tobacco giant Philip Morris, which merged them with those of Kraft (now Mondelez) and General Foods, which it had acquired earlier. Then Nabisco, which had previously merged with and then spun out R.J. Reynolds and now had no real business, sold itself, along with all its cigarette-related liability, to R.J. Reynolds. Lost in the shuffle, along with the corporate responsibility to pay any new personal injury claims, was Nabisco’s research on extended-shelf-life bread products. Kraft, as new corporate overlords are wont to do, put in place its own research goals and repopulated the Nabisco labs with its own staff. But good ideas, once let loose, don’t die; they just keep knocking on doors until someone lets them in. Today, treating food as a polymer and manipulating its glass transition temperature through the judicious selection of additives is an accepted part of product design, used in everything from encapsulated ingredients, frozen foods, baby foods, pasta, energy bars, snack foods, bakery items, candy, and powders to extruded cereal products (that pretty much describes 75 percent of the American diet right there)—and even breakfast cereals with add-ins.

CHEESE PURISTS THE WORLD over exalt their mummified milk. Their silken Goudas and savory Emmentalers. Their fetid fetas and squeaky queso frescos. Their moldy Roqueforts and runny Camemberts. These disks of rotted dairy are the pinnacle of thousands of years of experimentation that began when a herdsman carrying a ruminant’s stomach brimming with milk found that by journey’s end, he had a bag full of curds and whey.

Modern cheese making is a little more complicated, but the same principles apply. Fresh milk is allowed to ferment, with either wild or cultured bacteria, typically one of the friendly LABs (lactic acid bacteria). Then, when they have raised the acidity enough, rennet—enzymes from calves’ stomachs (these have now been replaced with laboratory-produced enzymes)—is added. This coagulates the caseins, which make up about 80 percent of the total milk protein, so that they form a gel. Then there’s a lot of manipulation—cutting, stirring, and heating—that removes fluid, or whey, leaving behind solid curds. The curds are put into molds, salted or brined, and pressed, which expels more whey and turns the cheese into a solid mass. Mold may be added, either at the beginning or later in the process. Then, depending on the variety, the cheeses are matured for anywhere from two weeks to two years, allowing enzymes, both those from microbes and those from the rennet, to turn fats and proteins into tasty new substances.

Cheese is one of the bedrocks on which the Western diet is founded—a long-term storage method for excess milk, especially when cool storerooms and caves were available. But the food didn’t fare so well during summer or in hot climates. With heat, animal fat softens or even liquefies, oozing out and creating an oily and unappealing mess. In the early twentieth century, dairymen on either side of the Atlantic—the Swiss pair Walter Gerber and Fritz Stettler in 1911 and James Kraft in 1916—hit on and patented a solution to the seasonal sweats: emulsifying salts. These chemicals disperse water-phobic caseins by exchanging sodium for calcium; this permits the now smaller particles to be diffused and suspended in liquid. Melting traditional cheeses and mixing them with the emulsifying salts resulted in a cheese-like product that withstood high temperatures and protracted storage. Even better, this new food could be made and sold very cheaply, because it could be produced, at least in part, from the rinds and irregular bits left over from cutting wheels of cheese into bricks. Melting the ingredients also pasteurized them, inactivating the live bacteria and enzymes and contributing to a longer shelf life.

The army placed its first order for processed cheese—which, at the beginning, came in only one flavor: white—during World War I, buying twenty-five million quarter-pound tins from Kraft. This single act probably established Kraft’s century-long (and still going strong) food industry hegemony. By the time World War II rolled around, the military was a raving cheeseaholic, consuming the dairy product by itself, on sandwiches, or as sauces for vegetables, potatoes, and pasta. In 1944 alone, the Quartermaster Corps bought more than one hundred million pounds from Kraft’s parent company, National Dairy Products Corporation (which finally took the Kraft name itself in 1969), as well as five hundred thousand pounds of cheese spread (bacon bits optional) to accompany the K and some of the C rations. During the war, the company’s sales almost doubled. But it still wasn’t enough. The military was hungry for new ways to store, ship, and eat cheese.

At the beginning of the war, the army had embarked on a dehydration-and-compression spree—by removing heavy water and reducing its volume, more food could be packed into a single shipment, always an advantage when there are millions of mouths to feed. All foodstuffs except meat were run through the drying chambers and squashed into bricks—fruits and vegetables, flour, potatoes, eggs, and cheese. As would become its historic pattern, the military funded or supported a variety of efforts, some of which were destined to die a quiet death and others that would garner glory, becoming wartime staples and the basis for future consumer products. Cheese dehydration research was conducted by the Quartermaster Corps’ Subsistence Research Laboratory, through the USDA laboratories, at various universities, including the University of California at Davis, and by industry, notably Kraft. Unless a food has a strong and flexible internal structure—think cellulose, the long chains of sugar molecules that give plant cells their rigidity—it crumbles when it dries out, something food technologists call fines. One can imagine the first experiment in drying and pressing a proud block of Wisconsin cheddar: cheese dust. This ruled out eating reconstituted cheese out of hand in slices or chunks. But for cooking, the granular form would be an advantage.

The first real cheese powder was developed in 1943 by George Sanders, a USDA dairy scientist. (Even before the war began, USDA’s research facilities had been enlisted to work toward military goals, exhorted by Secretary of Agriculture Henry Wallace “to consider their possible contributions to national needs as the defense program approaches the stage of ‘maximum effort.’”8 This relationship continues to this day; the USDA has collaborated with the Quartermaster Corps and later the Natick Center on topics as varied as chemical testing, fungi collection and classification, potatoes, dairy, and, from 1980 on, operation of the army’s radiation food sterilization program.) Until then, it had been “considered impossible to dehydrate natural, fat-containing cheese,”9 because the heat melted the fat, which then separated out. Sanders’s innovation was to divide the process into two steps. In the first, the cheese, shredded or grated, was dried at a low temperature; this hardened the surface proteins of the particles, forming a protective barrier around the lipids. Once sufficient water had been evaporated, the cheese was ground and dehydrated at a higher temperature. The final step was to form it into what the patent describes as cakes. A 1943 war bond ad unveiled the product to the public with a picture of a bare-chested soldier feeding a second soldier bundled up in a parka with a cheese cake on a pointy stick:

For jungle or ski troops—a new kind of cheese! … But they should taste the same—and taste good—wherever they’re eaten. That has meant many headaches for the Army Quartermaster Corps and the food processors who supply them… . For emergency use in arctic and tropics, National Dairy laboratories developed a dehydrated, compressed cheese that keeps well anywhere and takes less shipping weight and space.

In the summer of 1945, Little Boy and Fat Man were detonated in Japan, ending the war and leaving the Quartermaster Corps with warehouses full of food as well as an elaborate manufacturing and distribution system still churning out goods for millions of troops. This would take years to redirect or dismantle. Fearful of the effect of the sudden withdrawal of its huge wartime contracts, the government propped up the dairy business first by buying its excess product and then, in some cases, by selling it back to the same producers at lower prices. (The Commodity Credit Corporation, created during the Great Depression and still in existence, would later distribute these surpluses to welfare recipients and the elderly—the storied “government cheese.”) A temporary federal agency, the Surplus Property Administration, sold off at bargain-basement prices the food the Quartermaster Corps had amassed.

Who doesn’t love something they get for free or at a third of the original cost?10 But what could one do with football fields full of potato flakes, a cave stuffed with dried eggs (the army’s strange storage location for one hundred million pounds of the stuff), or a mountain of dehydrated cheese? Well, there was one group always interested in lowering the cost of finicky fresh ingredients: the grocery manufacturers, businesses such as Swift, Quaker Oats, General Foods, General Mills, Libby’s, Borden, McCormick, Colgate-Palmolive, Gerber, Scott Paper, Kellogg’s, Pillsbury, and Kraft. (The strength of the companies that produced the packaged goods that lined the nation’s nascent supermarkets, many with deep military ties, only grew over the next century, as did that of their trade group, the Grocery Manufacturers Association, today the food industry’s most powerful lobbying organization.) Perhaps instead of real cheese, the food corporations could mix in the cheap powder to add flavor. Not only would they save outright on the cost of ingredients, they’d pay a lot less to ship and store them—after all, that was the army’s primary purpose in developing dehydrated cheese in the first place. These ration conversions inspired a flood of fledgling products, particularly in the new and growing categories of convenience and snack foods.

In 1948 the Frito Company (it merged with H. W. Lay & Company in 1961 to become Frito-Lay, Inc.) debuted the country’s first cheesy snack food, made with the same Wisconsin cheddar the army used for its dehydrated products. Frito Company founder Charles Doolin had been a military supplier, even building a facility in San Diego, where there is a naval base, to service his contracts. According to his daughter Kaleta Doolin, “During the war, tins of chips were sent overseas to be served in mess halls and sold in PXs. This venture helped put the company over the top as a nationwide business.”11 Afterward, new plants were opened in Dallas, Los Angeles, and Salt Lake City, where soon cornmeal and water were being extruded, puffed, fried in oil, and coated with finger-licking, orange dehydrated cheese. Cheetos! Other companies quickly followed suit, producing savory curls, doodles, and puffs.

Today, the cheese powder category has expanded to include a variety of natural cheeses; extended cheese with additional ingredients such as stabilizers, milk solids, and emulsifying salts; and concentrated enzyme-modified cheese, which is used primarily as a flavoring agent. Dehydrated cheese permeates our lives, enlivening everything from beloved basics such as boxed mac ’n’ cheese to the school of Goldfish that swim through our childhoods and the addictive white or orange dust with which Frito-Lay and other food producers continue to coat their many snack foods. The military’s World War II invention, intended to help nourish warriors in battle, became one of the seminal snack and industrial food ingredients of the twentieth century and is still going strong in the new millennium.