The Pleasure Instinct: Why We Crave Adventure, Chocolate, Pheromones, and Music - Gene Wallenstein (2008)
Part II. The Pleasures of the Sensory World
Chapter 6. For the Love of Chocolate
As life’s pleasures go, food is second only to sex.
Except for salami and eggs. Now that’s better than
sex, but only if the salami is thickly sliced.
—Comedian Alan King
Music with dinner is an insult both to the cook and the violinist.
—G. K. Chesterton
If you ever make it to the Amazon, ask your guide to show you what is likely the most revered tree throughout all of South and Central America. Theobroma cacao, or “food of the gods,” was named by Linnaeus, the great eighteenth-century cataloger of nature. The designation makes clear his admiration for the almost indescribably savory taste of its fruit and seeds as well as the role of the tree in world history. Before taking a bite, few people stop to think—and who can blame them?—about the influence of chocolate on the social, political, and economic evolution of those cultures that came in contact with cacao beans as they spread from equatorial America to Europe, and then Asia.
After a short trek into the jungle, your guide will stop in front of an odd-looking tree, probably no taller than ten meters or so. If it is a mature tree—older than three years—it will have large patches of pink or blue cauliflorous growth on its bark, but your eyes will skip right past this feature and focus on the strange football-size pods that dangle expectantly from its trunk. The outer covering of the tree’s fruit is a tough hide of corrugated green and yellow, which when broken reveals a soft, whitish pulp. The taste of the pulp will catch you off guard. Most people anticipate the tangy sweet flavor of fruit with their first bite, only to be surprised by the subtle, bittersweet taste of chocolate. Enveloped inside the pulp are dark, purple-colored seeds—about thirty to forty per pod—that after being dried and processed can be recognized by epicureans around the world as “chocolate beans.”
Few foods inspire such passion in people as chocolate. This love affair goes far beyond the typical fondness for sweets: after all, we’re not likely to head out into a snowy night, panic-stricken after finding that we are out of lemon-crème pie or bubble gum. There is something special about chocolate that drives us to extraordinary lengths. Chocoholics find nothing strange in spending a small fortune for even a sampler box of champagne truffles. No, a simple sweet addiction is not the same as a chocolate addiction—indeed, many connoisseurs prefer the darkest, most bitter variety.
The history of the chocolate bean is a story steeped in a desire that transcends cultural distinctions. The roots of chocolate go back some twenty-six hundred years to the great Olmec and Mayan civilizations that flourished throughout southern Mexico, Belize, Guatemala, and Honduras. Spouted, teapot-shaped vessels have been excavated from towns such as Colha in northern Belize, and found to contain residue of ancient chocolate. The Mayan drink was very different from the watery, sugar-laden version of hot chocolate that dominates modern society. The journals of Spanish conquistadors are filled with descriptions of middle Mayan culture that include the preparation of dried cacao beans ground into a powder and mixed with water, honey, chili pepper, and sometimes maize. The liquid would then be heated and repeatedly poured from one vessel to another to produce a thick head of rich chocolate foam that was the most coveted part of the drink.
A reverence for chocolate was also present in Aztec culture throughout the region. The great Aztec emperor Moctezuma reportedly drank up to fifty flagons of chocolate per day, believing it to have restorative and even aphrodisiac powers. Within this culture, the cacao bean became the primary form of currency, and folklore has it that when the Spanish conquistadors stormed Moctezuma’s temple, they found beans in place of gold.
After conquering the Aztecs, Hernando Cortés returned to Spain and brought King Carlos treasures of cacao beans and a recipe for making xocoatl, which was sweetened with sugar by members of his court. Today this recipe lives on, and the domesticated cacao tree grows on farmlands near the equator in a number of regions including the Caribbean, Africa, southeastern Asia, and in several South Pacific islands such as Samoa and New Guinea.
While modern processing and distribution technology has made chocolate a more common item on the food landscape, its seductive properties have long remained a mystery to science. Only in the past few years have neuroscientists and biochemists begun to get a handle on why we find chocolate so pleasurable.
Chocolate contains more than 350 known compounds, several of which activate three important brain systems that contribute to the experience of pleasure. The first ingredient that gives chocolate its wide fan base is plain old sugar, an underappreciated compound these days. Considering our modern tendency to detest all that is carbohydrate and the epidemic-like rates of diabetes in many subpopulations, it is easy to understand why health-conscious people see sugar as something to avoid. But in reasonable doses, sugars have a profound and positive impact on our physiology, most notably in the form of a calming effect. Placing a small amount of liquid sweetened with either glucose or sucrose on the tongue of a crying newborn has an immediate calming effect that can last for several minutes. Sugars, in their varying chemical structures from lactose to sucrose, have been shown to activate the brain’s opioid system, a set of circuitry that plays a prominent role in regulating the body’s stress response.
In addition to sucrose, chocolate contains small amounts of theobromine (a mild stimulant) and phenylethylamine, a substance that is chemically similar to amphetamine. Once in the brain, each of these ingredients has an effect on the dopamine and noradrenergic neurotransmitter systems, which are implicated in attention and general arousal. These compounds are thought to provide the “boost” we all experience after eating chocolate.
But chocolate gives us more than a mere boost; most people crave the sense of euphoria that lingers long after the treat is gone. A trio of recently identified chemicals in chocolate seems to be at the heart of this feeling of well-being that is familiar to all chocoholics. Anandamide is a chemical messenger in the brain that binds to the same nerve cell receptors that are activated by tetrahydrocannabinol (THC)—that’s right, the active compound in marijuana. Anandamide, it turns out, is released in small quantities during times of stress and provides a calming and analgesic effect; however, it is quickly broken down by naturally produced enzymes, so there is never very much of the substance in the brain under normal circumstances. The buzz one gets from marijuana is another story altogether—in this case a deluge of THC enters the brain, overwhelming the ability of the enzymes to break it down, so it has a prolonged and more intense effect than the naturally occurring version. The “THC buzz” is essentially an exaggeration or amplification of normal cannabinoid brain system functioning.
The chocolate buzz occurs through a slightly different mechanism. Small amounts of anandamide are present in chocolate (darker chocolates have larger quantities), but not enough to raise activation of the brain’s cannabinoid system above normal. The key to unraveling this mystery came when two additional anandamide-like compounds were identified in chocolate and found to be present in fairly large quantities. While these related compounds don’t activate THC receptors directly, they increase the effect of naturally occurring anandamide by blocking the enzymes that usually break it down. This means that even small amounts of anandamide, whether naturally occurring or ingested while eating chocolate, will stay in the brain for a prolonged period of time, since it is not metabolized as quickly as normal. The result is that blissed-out feeling of calm that we experience after downing a hot chocolate or going through a few Droste pastilles.
It is easy to see why the depressed and stressed among us self-medicate with chocolate. It quenches the pleasure instinct by activating three key brain transmitter systems that are involved in reward, although they have evolved as adaptations to very different environmental circumstances.
The sucrose in chocolate is just a “souped-up” version of fructose—a form of sugar that is naturally present in most fruits that were widely available to early hominid hunter-gatherers. Sugars are a critical component of life because they provide metabolic energy in the form of ATP that powers the many biochemical reactions within every cell of our body. For the average hunter-gatherer, fruits were a very good nutritional choice, since ounce for ounce they offer a rich source of energy with virtually no exposure to dangerous horns, teeth, or claws. The only problem is in identifying fruits as a desirable substance to eat.
Early in our hominid evolution, the brain opioid system became very important for controlling our eating behavior, mainly in functioning to make sure that certain foods seemed more palatable than others. During this point in the evolutionary history of humans, opioid system activation and the pleasurable sensations that result became associated with the consumption of foods that have relatively high concentrations of sugar. This association arose strictly through alterations in brain wiring—some hominids evolved changes in their opioid system that made its indirect activation possible through receptors that were sensitive to the presence of sugar. In a very real sense, the opioid system, which until then probably played a role mainly in sexual reproduction, was co-opted by selection factors that made it very cost-effective (energywise) for hominids to be able to find and want to eat sugar-rich fruits. Hominids with a tendency to experience pleasure when eating something sweetened by natural sugar had a clear survival advantage over their peers who were not “afflicted” with this important mutation to opioid system wiring. Similar mutations may have occurred within the dopamine reward system and the cannabinoid system, making the taste of chocolate a “triple threat.”
Sugar and Health
Modern societies have very different survival pressures, and hence selection factors, than those of early hominids. Not only do we have access to all the fruits we want, we have also perfected the packaging and delivery of refined sugar such as sucrose in a staggering variety of forms that include processed foods and candies. Our opioid systems are awash in a sea of sweet-tasting stimulants, and this has serious consequences for societal health.
Study after study has shown that in humans, the most palatable foods release the highest levels of beta-endorphins into our bloodstream. Endorphins are the brain’s natural opioids that are typically released during stress. When this system is rendered inactive by administering an opioid antagonist (a drug that binds to opioid receptors but does not activate them) such as naltrexone or naloxone, subjects report that foods taste less palatable, and food consumption is often substantially reduced. Thus there is very clear evidence that the opioid system is involved in the hedonic experience of food.
In the past decade research has shown that obese people often have a different opioid system response compared with nonobese individuals. At least two independent studies have found that obese subjects produce up to three times as much beta-endorphin in their blood plasma after consuming a palatable meal when compared to their skinnier counterparts. One interpretation of this finding is that some individuals may be predisposed to obesity because they have hyperactivated opioid systems and literally experience more intense pleasure in response to opioid-system-activating foods than others. It is currently unknown, however, whether this change in opioid system functioning is a cause or a result of obesity.
The opioid system also plays an important role in attachment behaviors. In mouse pups, the response to being separated from their mother consists of ultrasonic vocalizations accompanied by a brief period of hyperactivity until the two are reunited. Mice that have their opioid system blocked by chemical agents or through genetic manipulations fail to display the same plaintive calls as normal mice. They do, however, protest to other events such as sudden changes in temperature or the introduction of an adult male. Hence, attachment behaviors depend on opioid system activation.
The fact that attachment behaviors seem to involve the opioid system is interesting when one considers that breast milk is rich in lactose, a sugar that serves to stimulate the activation of this system. Newborns and infants are innately attracted to the smell (see chapter 5) and taste of breast milk, and they have an uncanny ability to identify milk from their own mother. By the end of the first week of life, newborns prefer the taste of their mother’s breast milk over cow’s milk. There are many compounds in mother’s milk that may account for this preference. Besides being sweetened with the sugar lactose, it is rich in essential fatty acids that we will see are sought out by virtually all humans, from newborns to adults. Additionally, mother’s milk contains a number of important immune factors and whole immune cells (one reason that synthesizing human breast milk has not been possible for manufacturers of baby formula) that may be critical in helping the infant identify and develop a preference for its own mother’s milk over milk from an unrelated lactating mother.
Hence, newborns that are allowed to breast-feed will naturally self-stimulate their own opioid system, which itself may be a necessary component for the development of normal attachment. The timing of this sequence of behaviors is important, since the ingestion of lactose occurs at the same time as other forms of stimulation that are known to activate the opioid system, such as the sensation of being touched and the familiar smell of Mom. All of these stimuli activate the opioid system at a highly opportune time for the development of maternal-offspring attachment—during feeding. A biological mechanism such as this, which uses the experience of pleasure to prod newborns toward behaviors that at once maximize both attachment and the intake of nutrition, is likely to have tremendous survival value.
Getting Wired for Taste
Modern science has identified five basic food tastes: sweet, sour, bitter, salty, and the latest entry, umami, which is triggered by glutamate, most familiar in the form of the flavor enhancer monosodium glutamate (MSG). The first step on the path toward taste perception begins with the humble taste bud. Under an electron microscope, a taste bud has a shrublike appearance—think rhododendron—that is shaped by forty or so elongated epithelial cells. Each epithelial cell has receptors that preferentially respond to the presence of compounds affiliated with one of the five taste groups. As I eat my lunch of stir-fried vegetables, the natural sugars in the carrots and snow peas pass by the approximately five thousand taste buds that line the perimeter of my tongue and activate the groups of cells that are most sensitive to sweet-tasting food; salt-sensitive cells become activated in response to the sodium and potassium in the veggies and the sauce added for seasoning; still other cells are excited by the MSG.
The collective ensemble of activated cells sends this taste information on to the next stage of processing in the medulla and other nearby brain-stem structures that control the automatic behaviors involved in feeding such as sucking, salivation, and swallowing. From the brain stem, this signal makes its way to the thalamus, and finally branches out to cortical gustatory areas where the conscious perception of taste occurs and to various limbic nuclei where taste information can be integrated with memory, emotions, and motivation centers that regulate our desire to eat.
From the basic physiology and anatomy of the two chemical senses, we know that taste and smell are processed in very different ways. Epithelial cells in the olfactory mucosa respond to thousands of different types of odorants, while those that make up the gustatory system seem to have evolved a selective sensitivity to just five dominant taste classes. Beyond this difference, however, there are remarkable similarities between the evolution of these systems in the human species and their development in the individual.
By the beginning of the second trimester of pregnancy—just about the time when Melissa was regaining her desire to eat and morning sickness was making a thankful exit—Kai’s taste buds were beginning to mature. It is probably no coincidence that his first real sucking and swallowing behaviors also started at about this time, since the continued development of his taste buds, and most importantly their anatomical connection into functional taste circuitry in the brain stem, depend on stimulation. The brain-stem sites mature very early as well and will provide Kai with a complete set of reflexive movement patterns for getting the nutrition he needs—everything from sucking and swallowing behavior to changes in facial expression in response to sweet versus bitter tastes. But it is unlikely that fetuses can consciously perceive tastes at this point in their development, since cortical taste sites are not yet mature. Anencephalic newborns, who lack most of the cerebral cortex, are nevertheless capable of the same behaviors, including tongue protrusions to reject bitter-tasting liquids and salivation in response to sweets, even though detailed investigations show that these infants have no genuine awareness of such tastes.
Comparisons across a wide range of mammalian species have shown that the taste circuitry projecting from the epithelial cells to brain-stem sites is highly conserved across very different animals, and hence is likely to have evolved long before the hominid lineage. Just as the conscious perception of taste and its integration with the brain systems that regulate pleasure are likely to be relatively newer adaptations built on this older brain-stem circuitry, so it is for the developing fetus, who fails to show signs of real taste preferences until about the third trimester, when brain-stem connections to cortical and limbic regions are complete.
At this point in his development, Kai is experiencing all sorts of tastes—sweets, sours, bitters, you name it—all of which are incorporated into the amniotic fluid through Mom’s diet. Even before my son is born, he has a sweet tooth. Although he tends to move most in the late evening hours, his fetal gymnastics can be brought on at any time of the day if his mother indulges in a bowl of Häagen-Dazs’s Dulce de Leche. And this is not unusual. Before the days of ultrasound, X-ray contrast agents were commonly used to assess fetal health in the final trimester. Studies performed during this period show that fetuses increase their swallowing behavior and movements if a sweet solution such as saccharin is injected into the amniotic fluid, while they decrease their swallowing if a bitter or noxious-tasting substance is injected. These results are consistent with the idea that taste perception and preferences emerge during this developmental period.
Well before Kai has any exposure to the outside world, he is already establishing taste preferences that will shape a lifetime of eating habits. Evidence from both animal and human research indicates that taste variety is remarkably important during this stage of development. For instance, rats born to mothers whose salt intake was curtailed during the final stages of gestation lack the ability to perceive the substance after birth. Likewise, rats born to mothers who consume diets rich in particular tastes, such as apple juice or alcohol, show an enhanced preference for that taste after birth when compared to rats born to mothers with a normal diet. Both of these forms of experience-expectant learning also occur in humans. Finally, it is important to note that a very general relationship appears to exist between the experience of taste variety in the womb and the acceptance of novel foods after birth. Newborn rats and humans exposed to an increased variety of tastes in utero typically show less fear of novel foods when compared to newborns from mothers who had a more restricted diet.
Much like we saw with smell, although newborns have an innate preference for specific tastes such as sweets and certain fats, they also exhibit an impressive potential for developing novel taste preferences based on what was experienced in the womb. Moreover, these experiments demonstrate that the development of normal taste perception depends critically on experiencing a wide variety of tastes while in the womb, since limited exposure to a taste class (for example, salts) can result in a reduced ability to detect and perceive these tastes after birth.
Survival of the Fattest
So far we’ve seen that humans find the consumption of sweets innately pleasurable, and that the evolution of this tendency can be traced to the evolutionary pressure to identify and desire high-energy foods (such as fruits and mother’s milk) that are rich in natural sugars and relatively plentiful and safe to consume. But what about fats? Why do humans have such an insatiable appetite for fatty foods?
Although many of the ancient Greeks, including Aristotle, considered fat a basic taste class, it has only been in the past few years that food scientists and psychologists are willing to accept the idea that fat has a specific taste. Previously, most scientists believed that fat only acted as a food texture or flavor carrier. But this has changed with the discovery that simply putting a fatty food such as cream cheese into your mouth raises blood serum levels of triacylglycerol (TAG), an indicator of blood fat loading, even if the food is never swallowed. Richard Mattes, a food scientist at Purdue University, and his students followed up on this original study by showing that blocking the subjects’ ability to smell the cream cheese has no effect on the outcome, suggesting that it is the taste component of a fat that produces this change in blood TAG levels.
These findings are probably no surprise to researchers such as physiologist Adam Drewnowski, who in the early 1980s showed that subjects’ rating of the pleasantness of a food is directly related to the relative proportions of sucrose and fat in the samples tested. We all love foods that are laden with sugar, but there is a limit beyond which we find a food to be too sweet. Hedonic preference ratings first rise and then typically decline with increasing sucrose concentration in these experimental studies. This is not, however, what happens with fatty foods. Surprisingly, hedonic preference ratings typically continue to rise with increases in dietary fat content. It is possible, then, that our innate fondness for fats is even more intense than for sweets. And this makes perfect sense from both evolutionary and developmental perspectives.
Let’s start with evolution. In the past 2.5 million years, the hominid lineage leading to humans has evolved significantly larger brains relative to body size when compared to other primates. Understanding the reason for this dramatic expansion has been a long-standing question for those concerned with human evolution. Many theories argue that brain expansion followed the development of some key cognitive or behavioral milestone such as the emergence of bipedalism or language or social group formation or toolmaking, and so on. The list is long and varied, but the question remains: Did these new functional capacities result from or cause the dramatic increase in hominid brain size?
Michael Crawford of the Institute for Brain Chemistry and Human Nutrition in London has argued that hominid brain expansion is the direct result of dietary shifts that accompanied the migration of Homo sapiens from the open savannas to freshwater and saltwater shoreline regions, predominantly in the East African Rift Valley some 250,000 years ago. Human babies have combined brain and body fat that accounts for a whopping 22 to 28 percent of their total body weight, a proportion not seen in any other terrestrial animal. Fats are an indispensable component for building brains. The very foundation of life—the cell membrane—is made from a double layer of lipids that protects and shields the internal organelles of the cell while at the same time permitting the perfect amount of elasticity so the cell can respond to physical changes in the extracellular environment. In human babies, a high level of dietary fat is critical for normal brain development because it provides energy for growth in the form of fatty acids found in triglycerides; contains important chemical precursors to ketone bodies that regulate brain lipid synthesis; and provides a store of long-chain polyunsaturated fatty acids, most notably docosahexaenoic acid (DHA) and arachidonic acid (AA), which are essential for the formation of retinas and of the synaptic junctions where brain cells communicate.
Both DHA and AA are present in abundance in human milk but noticeably absent from cow’s milk. Recognizing the importance of these fatty acids for human brain growth and development, many formula manufacturers have begun supplementing their existing recipes with DHA and AA. Human body fat contains more DHA and AA at birth than at any other time during life, and in the newborn approximately 75 percent of total energy expenditure goes to brain growth. Hence the fatty acids DHA and AA are important for brain development both because they serve as an energy supply to fuel cell growth and proliferation, and because their molecular structure makes them a unique building block for synapses.
Michael Crawford and his colleagues have suggested that since the natural supply of DHA and AA in human newborns is only enough to last the first three months of life or so, the continued supply of these fatty acids must come through the child’s diet. This means that the availability of foods that are natural sources of DHA and AA is a rate-limiting factor on human brain development, one that would naturally have restricted the expansion of hominid brain size throughout evolutionary history. So where do you find rich veins of DHA and AA? Both are part of the larger omega family of fats whose synthesis requires the presence of two essential fatty acids that are not manufactured by the body and consequently must be obtained through the diet. Alpha-linolenic acid (ALA) is the foundation of the omega-3 family of fatty acids that your body uses to make DHA, and linoleic acid (LA) is the foundation of the omega-6 family that is used to make AA. Both substances emerged in response to evolutionary pressures in plants to efficiently store and access energy reserves. Phytoplankton, algae, and green leaves synthesize ALA in their chloroplasts, while flowering, seed-bearing plants store lipids in the form of seed oils loaded with LA.
Crawford’s group has argued that the evolution of the hominid brain to the human form we know today would have been impossible unless early Homo sapiens incorporated large amounts of both ALA and LA into their daily diet. They suggest that the most dramatic increase in hominid brain expansion co-occurred with the migration of Homo sapiens to shoreline environments and lacustrine estuaries, where dietary ALA and LA were plentiful.
Whether or not Crawford’s hypothesis is correct, two things are absolutely clear: growing brains require significant amounts of both ALA and LA, and these critical ingredients must be obtained through diet. ALA and LA deficiency in animals and humans results in altered structure and function of brain cell membranes and can lead to severe cerebral abnormalities. These anatomical changes have been linked to a number of disorders. For instance, both ALA and LA are involved in the prevention of some aspects of cardiovascular disease (including cerebral vascularization), and reduced levels of these fatty acids have recently been implicated as a cause of stroke, visual deficits, and several neuropsychiatric disorders, including depression, presenile dementia, and most notably Alzheimer’s disease. Another study showed that ALA deficiency decreases the perception of pleasure by directly altering the efficacy of sensory organs and by creating abnormal changes in frontal cortex anatomy.
Taken together, these findings provide compelling evidence that many selection factors were operating to ensure that early Homo sapiens with a taste for foods containing ALA or LA would have a survival advantage over their peers who lacked this preference. Nature has solved this problem by giving us not just a “sweet tooth,” but also an appetite for fats. Given these findings and the results of Drewnowski’s experiments in the early 1980s, it is possible that we may find eating fats even more pleasurable than sweets.
It is known that humans and other animals can discriminate among different dietary fats and show a preference for corn oil, which can even be used as a positive reinforcer in conditioning experiments. Corn oil has three major fatty acid components: linoleic acid (52 percent), oleic acid (31 percent), and palmitic acid (13 percent). Recent experiments in rats have shown that LA has an important effect on the physiological responses of the epithelial taste cells that make up taste buds. It appears that when LA binds to these cells, it increases the strength of the electrical signal that they normally send to the brain stem in response to a food source. For instance, if LA and sucrose are consumed together, the combined signal sent from the taste bud that announces the arrival of food is stronger than would be the case with sucrose alone. This physiological response has a marked impact on food intake regulation. In a series of behavioral experiments, psychologist David Pittman and his students at Wofford College found that in rats, LA acts to increase the intensity of sweet, salty, and sour tastes such that the natural preference for or avoidance of each is enhanced. As predicted from the physiological findings, animals preferred the taste of a solution containing LA and sucrose together over a solution with sucrose alone. Likewise, when Pittman’s rats were given a mixture of LA with salt or citric acid, they consumed less than when the salt or citric acid solution was offered alone.
Linoleic acid is present in a variety of natural vegetable oils, and since it has a direct effect on the physiological responsiveness of epithelial taste cells, it is likely to be one of several compounds that give fats their pleasurable taste. The fact that LA can serve as a positive reinforcing stimulus in conditioning tasks tells us that humans and animals are motivated to consume foods that contain it. Hence, the pleasure we find in eating fats may serve to ensure that enough essential fatty acids are included in our typical diet to promote and maintain normal brain growth and development. At the same time, this pleasure-mediated mechanism provides yet another example of how modern food manufacturing technology, by proliferating the availability of refined sugars and fats, has essentially removed the selection factors that originally led to these important adaptations. As a result, ours is a society vulnerable to a number of disorders, such as obesity and diabetes, that emerge when the pleasure we receive from eating certain foods is indulged well beyond the natural limits imposed by the environmental circumstances of our hominid ancestors.