HEY, BUD, CAN YOU DO ME A FAVA

Survival of the Sickest: A Medical Maverick Discovers Why We Need Disease - Sharon Moalem (2007)


A distinguished-looking man, debonair to his core in a way that the bright orange prison coveralls cannot obscure, stands in his jail cell looking out at an attractive brunette who has presumed—presumed!—to question him. She’s testing him—and he’s having none of it. “A census taker once tried to test me. I ate his liver with some fava beans and a nice Chianti,” says Hannibal Lecter.

If the doctor they called the cannibal had been an epidemiologist instead of a psychiatrist, he might have killed his victim with those fava beans—not just served his liver with them.

Before we started calling them fava beans, after the Italian word for them, we called them broad beans—and the range of legend that surrounds them is certainly broad. The Greek scholar Pythagoras supposedly warned a flock of future philosophers, “Avoid fava beans.” Of course, since fava beans were used as ballots at the time—white for yes and black for no—he may have just been giving his students advice that all good philosophers should still ponder today—“Avoid politics.”

In fact, the legends surrounding Pythagoras’s warning are almost as varied as the legends around the bean itself. A different theory holds that Pythagoras’s concern was something much less grave than possible poison and much less theoretical than possible politics—according to Diogenes, Pythagoras was just worried his students would eat too many beans and, well, pass too much gas. Two thousand years ago Diogenes supposedly said:

One should abstain from fava beans, since they are full of wind and take part in the soul, and if one abstains from them one’s stomach will be less noisy and one’s dreams will be less oppressive and calmer.

A cult called the Orphics believed that the fava plant contained the souls of the dead: according to them, “Eating fava beans and gnawing on the heads of one’s parents are one and the same.” Aristotle alone had five different theories about Pythagoras’s broad beans, saying that Pythagoras warned against them

either because they have the shape of testicles, or because they resemble the gates of hell, for they alone have no hinges, or again because they spoil, or because they resemble the nature of the universe, or because of oligarchy, for they are used for drawing lots.

It’s no wonder all those ancient Greeks were philosophers—they clearly had a lot of time on their hands. But they weren’t the only people to notice the mysterious reaction many people have to fava beans. In the twentieth century, a schoolteacher in Sardinia, an island off the coast of Italy, is said to have noticed a seasonal lethargy that settled on her students every spring and lasted for weeks. Supposedly recalling Pythagoras’s warning, she connected her students’ nodding heads to flowering fava plants. Superstitions against eating uncooked fava beans were common throughout the Middle East. In Italy, fava beans are traditionally planted on All Souls’ Day, and cakes shaped like a fava bean pod are called fave dei morti—“beans of the dead.”

As you’ve probably come to suspect, where there’s folklore smoke, there’s medical fire—in the case of the fava bean, a whole lot of it.

Favism, as modern medical science has so aptly labeled it, is an inherited enzyme deficiency carried by 400 million people. It’s the most common enzyme deficiency in the world. In extreme cases, people who have favism and eat fava beans (or take certain drugs) experience rapid, severe anemia that can lead to death.

SCIENTISTS FIRST CAUGHT wind of the truth behind some people’s deadly reaction to fava beans during the Korean War. Because malaria was common in parts of Korea, American soldiers who served there were prescribed antimalarial drugs, including one called primaquine. Doctors soon discovered that about 10 percent of African American soldiers developed anemia while taking primaquine, and some soldiers, especially those of Mediterranean descent, experienced an even more severe side effect called hemolytic anemia—their red blood cells were literally bursting.

In 1956, three years after the ceasefire that ended the Korean War, medical researchers isolated the cause of the soldiers’ reaction to the antimalarial drugs—they lacked sufficient amounts of an enzyme called glucose-6-phosphate dehydrogenase, or G6PD for short. G6PD is thought to be present in every cell in the body. It’s especially important in red blood cells, where it protects cellular integrity, mopping up chemical elements that would otherwise destroy the cell.

You’ve probably heard about free radicals in the news and may have a general sense that they’re not so good for you. The easiest way to understand free radicals is to remember that Mother Nature likes pairs—she’s something of a chemical matchmaker. Free radicals are essentially molecules or atoms with unpaired electrons—and unpaired electrons look to pair up. Unfortunately, as far as your body is concerned, those electrons look for love in all the wrong places. As the unpaired electrons seek to pair with electrons in other molecules, they cause chemical reactions. Those reactions can disrupt cellular chemistry and lead to the cell’s early death. That’s one of the reasons free radicals are thought to be a major cause of aging.

G6PD is like a bouncer in the red blood cell bar: when it’s on the job, it throws out the free radicals so they can’t start trouble. But when you don’t have enough G6PD, any chemical that produces free radicals can wreak havoc on your red blood cells. That’s what happened with the soldiers who experienced adverse reactions to primaquine—one of the ways primaquine is thought to stop the spread of malaria is by stressing your red blood cells and making them a generally unpleasant place for malaria-causing parasites. But if you don’t have enough G6PD to maintain cellular integrity, when the primaquine puts stress on your red blood cells, some of the cells can’t take it—the free radicals cause the cell membranes to burst, destroying them. And that loss of red blood cells spells anemia—specifically, hemolytic anemia, which is anemia that is caused by the early breakdown of red blood cells. The person undergoing the hemolytic crisis will experience severe weakness and fatigue; there may be signs of jaundice. Untreated, hemolytic anemia can lead to kidney failure, heart failure, and death.

THOSE ANCIENT GREEKS were onto something—for some people, fava beans are killers. They contain two sugar-related compounds called vicine and convicine. Vicine and convicine produce free radicals, especially hydrogen peroxide. When people who have favism eat fava beans, they undergo a reaction similar to the one that occurs after taking primaquine. If the hydrogen peroxide isn’t cleared out with the help of G6PD, it starts to attack your red blood cells, ultimately breaking them down. When that happens, the rest of the cell leaks out, resulting in hemolytic anemia, with potentially deadly effect.

The gene that is responsible for G6PD protein production—or deficiency—goes by the same name, G6PD. This gene is carried on the X chromosome. As you probably remember from science class, the X chromosome is one of the two sex chromosomes; the other is Y. Humans with two X chromosomes—XX—are female; humans with an X and a Y chromosome—XY—are male. Because the gene for G6PD deficiency is carried on the X chromosome, the condition is much more common in men. When a man has the mutation on his one X chromosome, all his cells take direction from that mutation. For a woman to have serious G6PD deficiency, she has to have the mutation on both X chromosomes. If she has it on only one chromosome, some of her red blood cells will have a normal gene and some won’t, and she should produce sufficient G6PD to avoid favism.

There are two normal versions of the G6PD gene, one called Gd B and the other Gd A+. There are more than 100 known mutations of this gene, but they fit into two major categories, one that arose in Africa, called Gd A-, and one that arose around the Mediterranean, called Gd Med. These mutations cause serious problems only when free radicals start overwhelming your red blood cells and there isn’t enough G6PD to clean them up. Problems can be triggered in people with favism by some infections and some medications—like primaquine—that release free radicals into the bloodstream. But as we’ve discussed, the most common trigger is eating fava beans—which is why it’s called favism, of course.

Humans have been cultivating fava beans for thousands of years. The oldest seeds found so far were discovered in an archaeological dig near Nazareth. They’re thought to be around 8,500 years old, dating back to about 6500 B.C. From Nazareth, in what is now the northern part of Israel, fava beans are thought to have spread throughout the Middle East and then north, around the eastern Mediterranean, into Turkey, across the Greek plains, and on into southern Italy, Sicily, and Sardinia.

If you marked up a map to show where favism is most common and then overlaid it with the areas where fava bean cultivation is most common, guess what? At this point, you may not be all that surprised at what I’m about to tell you—favism genes and fava bean farms? Same places, same people. Favism is most common—and most deadly—in North Africa and Southern Europe, all around the Mediterranean. Those happen to be exactly the places where fava beans have historically been cultivated and consumed.

Here we go again—somehow millions of humans have evolved with a genetic mutation that is only likely to cause problems when they eat something that is most common to the diet in their part of the world?

Well, if we’ve figured out anything so far, it’s that evolution doesn’t favor genetic traits that will make us sick unless those traits are more likely to help us before they hurt us. And a trait that is shared by more than 400 million people is definitely an evolutionary favorite. So there has to be some benefit to G6PD deficiency, right?


BEFORE WE DIG further into the connection between favism and fava beans, let’s take a look at the broader connection between evolution in the animal kingdom and evolution in the plant kingdom. We’ll start with breakfast. You see that strawberry in your cereal? The vine it came from wants you to eat it!

Plants that produce edible fruit evolved that way for their own benefit. Animals pick fruit and eat it. The fruit contains seeds. Animals walk or lope or swing or fly away and eventually they deposit those seeds somewhere else, giving the plant a chance to spread and reproduce. The apple doesn’t fall far from the tree—unless an animal eats it and takes it for a ride. It’s a gastronomic hitchhike, and it works well for everybody. In fact, that’s why ripe fruit is easy to pick and often falls off while unripe fruit is harder to harvest—the plant doesn’t want you to take off with the fruit until the seeds within it have finished developing. No free lunch in Mother Nature’s outdoor cafe.

On the other hand, as much as plants want animals to eat their fruit, they don’t want animals to get much closer than that—when creatures start to nibble on their leaves or gnaw at their roots, things can get tricky. So plants have to be able to defend themselves. Just because they’re generally immobile doesn’t mean they’re pushovers.

Thorns are plants’ most obvious defense mechanism, but they’re by no means the only one, or the most powerful—these guys have a whole arsenal. Plants are by far the biggest manufacturers of chemical weapons on earth. Everybody knows about the beneficial effects we receive from basic plant chemistry: plants convert sunlight and water into sugar using carbon dioxide absorbed from the atmosphere, in turn producing oxygen, which we get to breathe. But that’s just the starting point. Plant chemistry has the power to make a significant impact on the environment, influencing everything from the weather to the number of local predators.

Clover, sweet potato, and soy all belong to a group of plants that contain a class of chemicals called phytoestrogens. Sounds familiar, right? It should. Phytoestrogens mimic the effect of animal sex hormones such as estrogen. When animals eat too much of a plant that contains phytoestrogens, the overload of estrogenlike compounds wreaks havoc on their reproductive capability.

There was a sheep-breeding crisis in Western Australia during the 1940s. Otherwise healthy sheep weren’t getting pregnant or were losing their young before giving birth. Everyone was stumped until some bright agricultural specialists discovered the little culprit—European clover. This type of clover produces a potent phytoestrogen called formononetin as a natural defense against grazing predators. And, yes, if you’re a plant, a sheep is a predator! Accustomed to the humidity of Europe, the imported clover plants were struggling to cope with the drier Australian climate. When clover has a bad year—not enough rain or sunshine, or too much rain or sunshine—it protects itself by limiting the size of the next generation of predators. It increases production of formononetin and prevents the birth of baby grazers by sterilizing their would-be parents.

The next time you’re looking for some convenient birth control, you don’t have to snack on a field of clover, of course. But if you take many forms of the famous “Pill,” you’re not doing something all that different. The gifted chemist Carl Djerassi based his development of the Pill on just this kind of botanical birth control. He wasn’t using clover, though; he was using sweet potatoes—the Mexican yam, to be exact. He started with diosgenin, a phytoestrogen produced by the yam, and from that base he synthesized the key ingredient of the first marketable contraceptive pill in 1951.

Yams aren’t the only source of phytoestrogens in the human diet. Soy is rich in a phytoestrogen called genistein. It’s worth noting that today many processed foods, including commercial baby formulas, use soy because it’s an inexpensive source of nutrition. There’s a growing concern among a small number of scientists that we don’t have a handle on the potential long-term effects of what seems to be an ever-greater level of phytoestrogens and soy in our diet.

PLANTS ARE GOOD at birth control—but they’re great at poison. Most of the toxins they produce aren’t directed at humans, of course; they don’t really have to worry about us too much. The real problem that plants have is all those committed vegetarians grazing and buzzing and flying around, relying solely on them for food. But that doesn’t mean we don’t have to be careful, because plant toxins can cause lots of problems for us too. And chances are you’ve eaten your fair share in the last week.

Ever have tapioca pudding? Tapioca is made from the cassava plant. Cassava is a large, thick-skinned tuber that looks kind of like a long white sweet potato with a coconut’s skin. It’s a major part of the diet in many tropical countries. Yet it contains a precursor to deadly cyanide. Of course, when it’s cooked and processed correctly, it can be harmless—so don’t go biting down on the next raw cassava plant you see. Not surprisingly, cassava is especially high in cyanide compounds during drought—exactly when it needs additional protection against predators to make it through the growing season.

Consider another example, the Indian vetch, which is cultivated in Asia and Africa. Its chemical weapon of choice is a powerful neurotoxin that can cause paralysis. The neurotoxin is so powerful that the vetch can often survive when all other crops die out because of drought or infestation. For that reason, poor farmers in some parts of the world cultivate it as an insurance crop—insurance against starvation in the event of a famine. And sure enough, the incidence of disease related to this organic poison climbs after a famine in those areas where the vetch is grown. Not surprisingly, some people choose to risk the vetch’s poison rather than starve to death.

The nightshades are a large group of plants, some edible, some poisonous. All nightshades contain significant amounts of alkaloids, chemical compounds that can be toxic to insects and other herbivores and affect humans in ways ranging from helpful to hallucinogenic. Some people speculate that “witches” included some types of nightshade in their “magic” ointments and potions—and then hallucinated that they were flying!

One of the most common members of the nightshade family, which includes potatoes, tomatoes, and eggplant, is the jimsonweed, which got its name from Jamestown, Virginia. About a hundred years before the Revolutionary War there was a short-lived revolt called Bacon’s Rebellion. It was defeated pretty quickly, but not without some hiccups along the way. When British soldiers were sent to Jamestown to put down the rebellion, they were secretly (or accidentally) drugged with jimsonweed in their salad. In 1705 Robert Beverley described the result in The History and Present State of Virginia:

Some of them ate plentifully of it, the effect of which was a very pleasant comedy, for they turned natural fools upon it for several days: one would blow up a feather in the air; another would dart straws at it with much fury; and another, stark naked, was sitting up in a corner like a monkey, grinning and making mows at them; a fourth would fondly kiss and paw his companions, and sneer in their faces with a countenance more antic than any in a Dutch droll…. A thousand such simple tricks they played, and after eleven days returned themselves again, not remembering anything that had passed.

Jimsonweed is a tall green weed with big leaves that is common throughout America. People eat it accidentally every year, usually because it’s mixed in with other plants in their garden.

Plant chemicals can paralyze, sterilize, or make us crazy. They can also affect us in much milder ways, like interfering with digestion or burning our lips. Wheat, beans, and potatoes all have amylase inhibitors, a class of chemicals that interfere with the absorption of carbohydrates. Protease inhibitors, found in chickpeas and some grains, interfere with protein absorption. Many of these botanical defense systems can be disabled by cooking or soaking. The Old World tradition of soaking beans and legumes overnight does exactly that—it neutralizes most of the chemicals that mess with our metabolism.

If you’ve ever bitten down on a raw habanero pepper, you probably felt like you were being poisoned. And you were. That burning sensation is caused by a chemical called capsaicin. Mammals are sensitive to it because it tickles the nerve fibers that sense pain and heat, but birds aren’t—and that goes to show just how clever old Mother Nature can be when she’s doing the evolution dance. Mice and other rodents that would otherwise be drawn to the fruits of chili plants avoid them because they can’t take the heat. That’s good for the chili, because the digestive systems of mammals destroy its small seeds, a process that pretty much takes the point out of the gastronomic hitchhike. Birds, on the other hand, don’t destroy chili seeds when they eat chili peppers—and they aren’t affected by capsaicin. So mammals leave the peppers for the birds, and the birds take the seeds to the air, spreading them along the way.

Capsaicin is a sticky poison—it adheres to mucous membranes, which is why your eyes burned if you ever rubbed them after handling peppers. It’s also why the heat from a hot pepper sticks around so long—and why water does nothing to cool the burn. Capsaicin is hydrophobic, so it won’t dissolve in water. You’re much better off drinking milk (but this is one time to pass on the skim!) or eating something else with fat in it—capsaicin is fat-soluble, so the fat helps to peel it away from your mucous membranes and cool you down.

Capsaicin doesn’t just cause a burning sensation—it can actually cause selective degeneration of some types of neurons. In large quantities, hot peppers can be very harmful. Scientists are still debating the connection, but people in places like Sri Lanka where hot peppers are almost a staple, as well as other ethnic groups who eat lots of hot peppers, tend to have much higher rates of stomach cancer.

From an evolutionary perspective, it’s not surprising that plants have evolved mechanisms to ensure that their predators think twice before making them their next meal. What’s more surprising is why we continue to cultivate and consume thousands of plants that are toxic to us. The average human eats somewhere between 5,000 and 10,000 natural toxins every year. Researchers estimate that nearly 20 percent of cancer-related deaths are caused by natural ingredients in our diet. So if many plants we cultivate are toxic, why didn’t we evolve mechanisms to manage those toxins or just stop cultivating them?

Well, we have.

Sort of.

HOW MANY TIMES have you had a craving for something sweet? Or something salty? How about something bitter? Can’t you just see yourself saying, “Man, all I really want is something really bitter for dinner.” Doesn’t happen, right?

There are four basic tastes in Western tradition—sweet, salty, sour, and bitter. (There’s a fifth in other parts of the world that is gaining traction in the West, both culturally and scientifically—it’s called umami, and it’s the savory flavor you find in aged and fermented foods, like miso, parmesan cheese, or aged steaks.) Most tastes are pleasing, and the evolutionary reason for them is simple—they attract us to foods that contain the nutrients, salt, and sugar that we need.

Bitterness is different—bitterness turns us off. Which, as it turns out, is probably the point. A study published in 2005 by researchers working collaboratively at University College London, Duke University Medical Center, and the German Institute of Human Nutrition concluded that we evolved the ability to taste bitterness in order to detect toxins in plants and avoid eating them. (Deterring predators is why the plants produce the toxins in the first place—and it has led to the term many plant biologists use to describe them: antifeedants.) By reconstructing the genetic history of one of the genes responsible for the growth of bitter taste receptors in our tongues, scientists have traced the evolution of this ability to Africa, sometime between 100,000 and 1,000,000 years ago. Not all humans have the ability to taste bitterness—and not all are as sensitive to it as others—but given how widespread the ability is across the globe, it’s pretty clear that tasting bitterness gave humans a significant survival advantage.

About one-quarter of humanity is even more highly attuned to taste. They’re called supertasters—because they are. Chemists discovered them almost by accident while studying reactions to a chemical called propylthiouracil. Some people can’t taste it at all. Some people find it to be mildly bitter. And some people—supertasters—find even the smallest taste to be repulsive. Supertasters find more bitterness in grapefruit, coffee, and tea. They may be as much as twice as sensitive to sweetness and are much more likely to feel the fire at a hint of chili.

Interestingly, the same collaborative paper that linked bitterness to the detection of plant poisons noted that it may not be such an advantage today. Not every version of the compounds that taste bitter is poisonous; in fact, as I mentioned in the description of nightshade, some of these compounds are beneficial. The scopolamine in jimsonweed that causes temporary madness is a bitter-tasting alkaloid—but so are some of the compounds in broccoli that have anticancer properties. So today, especially in developed countries where the need for a natural alarm bell against plant toxins has pretty much faded away, it may be a disadvantage to have a strong reaction to bitterness. Now, instead of steering you away from poisons, it’s steering you away from food that’s good for you.

WITH A QUARTER-MILLION plants to choose from and a keen sense of taste, why haven’t we cultivated plants that aren’t poisonous and bred the toxins out of plants that are? Well, we’ve tried—but like everything else in the evolutionary kingdom, it’s complicated. And there are consequences.

Remember, plants’ chemical weapons aren’t aimed at us for the most part; they’re directed more at insects, bacteria, fungi, and, in some cases, mammals that are dedicated herbivores. So if we impose unilateral disarmament on a plant, it’s like giving the keys to the candy store to a busload of schoolkids—pretty soon there’s nothing left for anyone else to eat. The plant’s predators just finish it off.

Of course, sometimes plant breeders have gone the other way and bred in too much natural resistance, turning an otherwise edible food into an almost deadly poison. All potatoes contain solanine; those that are a little green in color contain the most. Solanine is what protects potatoes from potato late blight (imagine a deadly case of athlete’s foot and you’ll get the idea of what blight means to a potato). It’s a fat-soluble toxin that can cause hallucinations, paralysis, jaundice, and death. Too many solanine-rich french fries and you’re french fried. Sometimes, of course, blight can overwhelm the protection solanine provides. The fungus was responsible for the devastating Irish potato famine in the mid-nineteenth century that led to mass starvation, death, and emigration from Ireland.

In the United States during the 1960s, plant breeders worked to develop a blight-resistant potato, in order to increase the efficiency of potato crops. They called their special spud the Lenape. The first person who ate a Lenape potato didn’t feel very special, though—it contained so much solanine, it was nearly deadly. You won’t be surprised to hear that they pulled those Lenapes from the market like—hot potatoes.

Celery is a similar case, one that sheds light on the sometimes double-edged nature of organic agriculture. Celery defends itself by producing psoralen, a toxin that can damage DNA and tissue and that causes extreme sensitivity to sunlight in humans. The funny thing about psoralen is that it becomes active only when it’s exposed to sunlight. Some insects avoid this poison by keeping their meal in the dark—they roll themselves up in a leaf, protected from the sun, and then spend the day chewing their way out.

Garden-variety celery doesn’t pose a problem to most people, unless you visit the tanning salon after a bowl of celery soup. Psoralen generally poses more of a problem for those who handle large amounts of celery over a long period of time—many celery pickers have developed skin problems, for example.

Now, the thing about celery is that it’s especially good at kicking psoralen production into high gear when it feels under attack. Bruised stalks of celery can have 100 times the amount of psoralen of untouched stalks. Farmers who use synthetic pesticides, while creating a whole host of other problems, are essentially protecting plants from attack. Organic farmers don’t use synthetic pesticides. So that means organic celery farmers are leaving their growing stalks vulnerable to attack by insects and fungi—and when those stalks are inevitably munched on, they respond by producing massive amounts of psoralen. By keeping poison off the plant, the organic celery farmer is all but guaranteeing a biological process that will end with lots of poison in the plant.

Life: it’s such a compromise.

NOW THAT WE have a better understanding of the effect that plant evolution has had on humans, let’s take another look at the connection between fava beans and favism.

What do we know so far? We know that eating fava beans releases free radicals into the bloodstream. We know that people who have favism, with a deficiency in the G6PD enzyme, lack the ability to mop up those free radicals, which causes their red blood cells to break down and results in anemia. We know that a map of fava bean cultivators and a map of likely favism carriers would highlight the same portions of the globe. And we know that any genetic mutation as common as favism—more than 400 million people—must have given its carriers some advantage over something even more deadly.

So what’s a threat to human survival that is common in Africa and around the Mediterranean and has a connection to red blood cells? Four out of five dentists may recommend Trident—but ten out of ten infectious-disease experts will give you the same answer if you ask them to solve that riddle: the answer is malaria.

Malaria is an infectious disease that strikes as many as 500 million people every year, killing more than 1 million of them. More than half of the world’s population lives in areas where malaria is common. If you’re infected, you can experience a terrible cycle of fevers and chills, along with joint pain, vomiting, and anemia. Ultimately, it can lead to coma and death, especially in children and pregnant women.

For centuries, starting with Hippocrates’ treatise On Airs, Waters, and Places, doctors believed many diseases were caused by unhealthy vapors emanating from still water—lakes, marshes, and swamps. They called these vapors or mists miasma. Malaria, which is Old Italian for “bad air,” was one of many diseases they thought miasma caused. The association with hot, wet swamps turned out to be correct—but because of the mosquitoes that thrive in those areas, not the vapors they emit. Malaria is actually caused by parasitic protozoa (microscopic organisms that share some traits with animals) that are deposited in the human bloodstream through the bite of female mosquitoes (males don’t bite). There are a few different species that cause malaria, the most dangerous of which is Plasmodium falciparum.

The theory that miasma causes malaria was wrong, but it led to the development of at least one modern comfort many people would sweat to lose. According to James Burke, the author of the Connections series, a Florida doctor named John Gorrie thought he had malaria licked in 1850, with the help of a new invention. Dr. Gorrie correctly noticed that malaria was significantly more common in warmer climates. And even in cooler places, people seemed to get sick only in warmer months. So he figured if he could find a way to eliminate all the warm “bad air,” he could protect people from malaria.

Dr. Gorrie’s malaria-fighting contraption pumped cool air into the malaria hospital ward. Today, a version of his invention probably pumps cool air into your home—you call it an air conditioner. And while the air conditioner didn’t improve the prognosis of any of Dr. Gorrie’s malaria-infected patients, it has had an impact on the disease. Air-conditioning allows people who live in malarial parts of the world to stay inside with their doors closed and windows shut, which helps to protect them from infected mosquitoes.

There are still hundreds of millions of malarial infections every year—and while malaria is one of the ten leading causes of death in the world, not everybody who gets infected dies. Even more to the point, perhaps, not everyone who gets bitten by malaria-carrying mosquitoes gets infected. So what’s helping the malaria survivors?

J. B. S. HALDANE was one of the first people to understand the idea that different environments impose different evolutionary pressures, producing distinct genetic traits that in certain populations cause disease. More than fifty years ago, he suggested that certain groups—specifically people with a genetic tendency for sickle-cell anemia or thalassemia, another inherited blood disorder—had better natural resistance to malaria.

Today many researchers believe that a genetic trait far more prevalent than sickle-cell anemia or thalassemia may also provide protection against malaria—G6PD deficiency. In two large case-control studies, researchers found that children with the African variant of the G6PD mutation had twice the resistance to P. falciparum, the most severe type of malaria, that children without the mutation had. Laboratory experiments confirmed this—given a choice between “normal” red blood cells and G6PD-deficient red blood cells, the malaria-causing parasites preferred the normal cells time after time.

Why? P. falciparum is actually a delicate little creature. It only really thrives in nice clean red blood cells. The red blood cells of someone with G6PD deficiency are not just less hospitable to malaria, they are also taken out of circulation sooner than those of people without the mutation, and that disrupts the parasite’s life cycle. This explains why populations exposed to malaria would select for favism. What it doesn’t explain is why those populations would also cultivate fava beans. What’s the point in living through a mosquito’s breakfast if your own lunch can kill you?

The answer is probably straightforward—redundancy. Malaria is so widespread and so deadly that vulnerable populations needed every possible defense in order to survive and reproduce. By releasing free radicals and raising the level of oxidants, fava bean consumption makes the blood cells of non-G6PD deficient people a less hospitable place for malarial parasites. With all the free radicals, some red blood cells tend to break down. And when someone with a mild or partial deficiency in G6PD eats fava beans, the parasite is in deep trouble.

As far as partial deficiency is concerned, remember that the genetic mutation that causes favism is carried only on the X chromosome, and remember that females have two X chromosomes. That means that (in populations where the mutation is common) many women have a red blood supply that is partially normal and partially G6PD deficient. That gives them additional protection against malaria, but doesn’t make them vulnerable to an extreme reaction to fava beans. And considering that pregnant women are very vulnerable to malaria, it’s a good thing that many women can have their favism and eat it too.

HUMANS HAVE BEEN relying on herbal remedies since, well, probably before there were humans. Archaeologists have found evidence suggesting that Neanderthals may have used plants for healing 60,000 years ago. The ancient Greeks used opium milk, which is the fluid that oozes out of the opium poppy when it’s slashed, as a painkiller—today we derive morphine, one of the most powerful painkillers available, from the same place.

The first really effective antimalarial medicine came from the bark of the cinchona tree. George Cleghorn, a Scottish army surgeon, is one of those credited with discovering the antimalarial properties of cinchona bark in the eighteenth century, but it still took another century before French chemists isolated the specific beneficial compound—quinine—and made a medicinal tonic from it. The tonic tasted awful, though, so legend has it that British soldiers mixed their gin rations with their tonic treatments and presto, a classic was born. Tonic water still contains quinine today, but unfortunately, if you’re going to travel somewhere that malaria is prevalent, you still need a prescription for your antimalarial drug; just about every strain of malaria has become somewhat resistant to quinine. Good thing we have those helpful fava beans.

Eat your vegetables. Your vegetables can kill you.

Mother Nature is sending mixed messages again. The truth—as you’ve no doubt gathered—is complicated. Many plant toxins can be good for us. The trick is understanding how they work, how we work, and how it all works together.

Those phytoestrogens that can cause sterility? It looks like genistein, the phytoestrogen in soy, might help to stop or slow the growth of prostate cancer cells. Some researchers think the same compound may ease the effects of menopause, which could explain why Asian women report far fewer problems with mid-life changes.

Capsaicin, the hot in hot peppers, stimulates the release of endorphins, which induce feelings of pleasure and reduce feelings of stress. Capsaicin also increases your metabolic rate—some think by as much as 25 percent. What’s more, there is a growing body of evidence that capsaicin may be helpful in alleviating pain caused by everything from arthritis and shingles to postoperative discomfort.

The list goes on. The psoralen in celery can cause skin damage—but it’s also a real help for people with psoriasis. Allicin, which comes from garlic, prevents platelets in your blood from sticking together and becoming clots, which makes it a potentially powerful weapon against heart disease. The aspirin a day that keeps the doctor away? It started out as a chemical in the bark of willow trees to keep the insects away. Today it’s a virtual drug of all trades—a blood-thinning, fever-reducing pain reliever. Taxol? The powerful anticancer drug is another tree bark derivative—in this case from the bark of the Pacific yew.

Around 60 percent—or more—of the world’s population still relies directly on plants for medicine. Probably isn’t such a bad idea for us to drop in every once in a while, take a look at what they’re cooking, and wonder why.