Survival of the Sickest: A Medical Maverick Discovers Why We Need Disease - Sharon Moalem (2007)
Chapter 3. THE CHOLESTEROL ALSO RISES
Everybody knows that humanity’s relationship with the sun is multifaceted. As we all learned in elementary school, almost the entire global ecology of our planet depends on sufficient sunlight—beginning with the production of oxygen by plants through photosynthesis, without which we wouldn’t have food to eat or air to breathe. And as we all have learned more and more over the last couple of decades, too much sun can be a bad thing on a global level and an individual one, throwing our environment into chaos by causing drought or causing deadly skin cancer.
But most people don’t know that the sun is just as important on an individual, biochemical level—and the relationship is just as two-sided. Natural sunlight simultaneously helps your body to create vitamin D and destroys your body’s reserves of folic acid—both of which are essential to your health. To manage this can’t-live-with-you-can’t-live-without-you relationship, different populations have evolved a combination of adaptations that, together, help to protect folic acid and ensure sufficient vitamin D production.
VITAMIN D IS a critical component of human biochemistry, especially to ensure the growth of healthy bones in children and the maintenance of healthy bones in adults. It ensures that our blood has sufficient levels of calcium and phosphorus. New research is discovering that it’s also crucial to the proper function of the heart, the nervous system, the clotting process, and the immune system.
Without enough vitamin D, adults are prone to osteoporosis and children are prone to a disease called rickets that results in improper bone growth and deformity. Vitamin D deficiencies have also been shown to play a role in the development of dozens of diseases—everything from many different cancers to diabetes, heart disease, arthritis, psoriasis, and mental illness. Once the link between vitamin D and rickets was established early in the twentieth century, American milk was fortified with vitamin D, all but eliminating the disease in America.
We don’t have to rely on fortified milk for vitamin D, however. Unlike most vitamins, vitamin D can be made by the body itself. (Generally speaking, a vitamin is an organic compound that an animal needs to survive but can usually obtain only from outside the body.) We make vitamin D by converting something else that, like the sun, has been getting a bad rap lately, but is 100 percent necessary for survival—cholesterol.
Cholesterol is required to make and maintain cell membranes. It helps the brain to send messages and the immune system to protect us against cancer and other diseases. It’s a key building block in the production of estrogen and testosterone and other hormones. And it is the essential component in our manufacture of vitamin D through a chemical process that is similar to photosynthesis in its dependence on the sun.
When we are exposed to the right kind of sunlight, our skin converts cholesterol to vitamin D. The sunlight necessary for this process is ultraviolet B, or UVB, which typically is strongest when the sun is more or less directly overhead—for a few hours every day beginning around noon. In parts of the world that are farther from the equator, very little UVB reaches the earth during winter months. Fortunately, the body is so efficient at making vitamin D that, as long as people get sufficient sun exposure and have enough cholesterol, we can usually accumulate enough vitamin D reserves to get us through the darker months.
By the way, the next time you get your cholesterol checked, make a note of the season. Because sunlight converts cholesterol to vitamin D, cholesterol levels can be higher in winter months, when we continue to make and eat cholesterol but there’s less sunlight available to convert it.
It’s interesting to note that, just as it blocks the ultraviolet rays that give us a suntan, sunblock also blocks the ultraviolet rays we need to make vitamin D. Australia recently embarked on an anti–skin cancer campaign it called “Slip-Slop-Slap.” The campaign was especially effective at producing unintended results—Australian sun exposure went down, and Australian vitamin D deficiencies went up.
On the flip side, researchers have discovered that tanning can actually help people who have vitamin D deficiencies. Crohn’s disease is a disorder that includes significant inflammation of the small intestine. Among other things, the inflammation impairs the absorption of nutrients, including vitamin D. Most people who have Crohn’s have a vitamin D deficiency. Some doctors are now prescribing UVB tanning beds three times a week for six months to get their patients’ vitamin D back up to healthy levels!
Folic acid or folate, depending on its form, is just as important to human life. Folate gets its name from the Latin word for “leaf” because one of the best sources for folate is leafy greens like spinach and cabbage. Folate is an integral part of the cell growth system, helping the body to replicate DNA when cells divide. This, of course, is critical when humans are growing the fastest, especially during pregnancy. When a pregnant woman has too little folic acid, the fetus is at significantly higher risk for serious birth defects, including spina bifida, a deformation of the spinal cord that often causes paralysis. And as we said, ultraviolet light destroys folic acid in the body. In the mid-1990s an Argentinian pediatrician reported that three healthy women all gave birth to children who had neural tube defects after using indoor tanning beds during their pregnancies. Coincidence? Probably not.
Pregnancy isn’t the only time folate is important, of course. A lack of folate is also directly linked to anemia, because folate helps to produce red blood cells.
THE SKIN, AS you’ve probably heard, is the largest organ of the human body. It’s an organ in every sense of the word, responsible for important functions related to the immune system, the nervous system, the circulatory system, and metabolism. The skin protects the body’s stores of folate, and it’s in the skin that a crucial step in the manufacturing of vitamin D takes place.
As you might have guessed, the wide range of human skin color is related to the amount of sun a population has been exposed to over a long period. But darker skin isn’t just an adaptation to protect against sunburn—it’s an adaptation to protect against the loss of folic acid. The darker your skin, the less ultraviolet light you absorb.
Skin color is determined by the amount and type of melanin, a specialized pigment that absorbs light, produced by our bodies. Melanin comes in two forms—red or yellow pheomelanin, or brown or black eumelanin—and is manufactured by cells called melanocytes. Everybody on earth has around the same number of melanocytes—differences in skin color depend, first, on how productive these little melanin factories are and, second, on what type of melanin they make. The melanocytes of most Africans, for example, produce many times the amount of melanin that the melanocytes of Northern Europeans produce—and most of it is eumelanin, the brown or black version.
Melanin also determines hair and eye color. More melanin means darker hair and darker eyes. The milk white skin of an albino is caused by an enzyme deficiency that results in the production of little or no melanin. When you see the pink or red eyes that albinos usually have, you’re actually seeing the blood vessels in the retina at the back of the eye, made visible by the lack of pigment in the iris.
As everybody knows, skin color changes, to some extent, in response to sun exposure. The trigger for that response is the pituitary gland. Under natural circumstances, almost as soon as you are exposed to the sun, your pituitary gland produces hormones that act as boosters for your melanocytes, and your melanocytes start producing melanin on overdrive. Unfortunately, it’s very easy to disrupt that process. The pituitary gland gets its information from the optic nerve—when the optic nerve senses sunlight, it signals the pituitary gland to kick-start the melanocytes. Guess what happens when you’re wearing sunglasses? Much less sunlight reaches the optic nerve, much less warning is sent to the pituitary gland, much less melanocyte-stimulating hormone is released, much less melanin is produced—and much more sunburn results. If you’re reading this on the beach with your Ray-Bans on, do your skin a favor—take them off.
Tanning helps people cope with seasonal differences in sunlight in their ancestral climate; it’s not enough protection for a Scandinavian at the equator. Someone like that—with very little natural ability to tan and regular, unprotected exposure to tropical sun—is vulnerable to severe burning, premature aging, and skin cancer, as well as folic acid deficiency and all its associated problems. And the consequences can be deadly. More than 60,000 Americans are diagnosed with melanoma—an especially aggressive type of skin cancer—every year. European Americans are ten to forty times as likely to get melanoma as African Americans.
AS HUMANITY WAS evolving, we probably had pretty light skin too, underneath a similar coat of coarse, dark hair. As we lost hair, the increased exposure of our skin to ultraviolet rays from the strong African sun threatened the stores of folate we need to produce healthy babies. And that created an evolutionary preference for darker skin, full of light-absorbing, folate-protecting melanin.
As some population groups moved northward, where sunlight was less frequent and less strong, that dark skin—“designed” to block UVB absorption—worked too well. Now, instead of protecting against the loss of folate, it was preventing the creation of vitamin D. And so the need to maximize the use of available sunlight in order to create sufficient vitamin D created a new evolutionary pressure, this time for lighter skin. Recent scientific sleuthing reported in the prestigious journal Science goes so far as to say that white-skinned people are actually black-skinned mutants who lost the ability to produce significant amounts of eumelanin.
Redheads, with their characteristic milky white skin and freckles, may be a further mutation along the same lines. In order to survive in places with infrequent and weak sunlight, such as in parts of the U.K., they may have evolved in a way that almost completely knocked out their body’s ability to produce eumelanin, the brown or black pigment.
In 2000, an anthropologist named Nina G. Jablonski and a geographic computer specialist named George Chaplin combined their scientific disciplines (after already combining their lives in marriage) to chart the connection between skin color and sunlight. The results were as clear as the sky on a cloudless day—there was a near-constant correlation between skin color and sunlight exposure in populations that had remained in the same area for 500 years or more. They even produced an equation to express the relationship between a given population’s skin color and its annual exposure to ultraviolet rays. (If you’re feeling adventurous, the equation is W = 70-AUV/10. W represents relative whiteness and AUV represents annual ultraviolet exposure. The 70 is based on research that indicates that the whitest possible skin—the result of a population that received zero exposure to UV—would reflect about 70 percent of the light directed at it.)
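Jablonski and Chaplin's equation is simple enough to try out directly. The sketch below is only an illustration of the relationship described above; the sample AUV values are hypothetical, chosen to show the trend, not taken from the actual study.

```python
# Jablonski and Chaplin's relationship, as given in the text:
#   W = 70 - AUV / 10
# where W is relative whiteness (percent of light reflected) and AUV is
# annual ultraviolet exposure. The 70 represents the whitest possible
# skin, predicted for a population with zero UV exposure.

def relative_whiteness(auv):
    """Predicted skin reflectance (percent) for a given annual UV exposure."""
    return 70 - auv / 10

# Hypothetical AUV values, low to high: reflectance falls as exposure rises.
for auv in (0, 150, 300):
    print(auv, relative_whiteness(auv))
```

Zero exposure predicts the ~70 percent reflectance ceiling; progressively sunnier climates predict progressively darker skin, which is exactly the near-constant correlation the researchers found in long-settled populations.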
Interestingly, their research also proposes that we carry sufficient genes within our gene pool to ensure that, within 1,000 years of a population’s migration from one climate to another, its descendants would have skin color dark enough to protect folate or light enough to maximize vitamin D production.
There is one notable exception to Jablonski and Chaplin’s equation—and it’s the exception that proves the rule. The Inuit—the indigenous people of the subarctic—are dark-skinned, despite the limited sunlight of their home. If you think something fishy’s going on here, you’re right. But the reason they don’t need to evolve the lighter skin necessary to ensure sufficient vitamin D production is refreshingly simple. Their diet is full of fatty fish—which just happens to be one of the only foods in nature that is chock-full of vitamin D. They eat vitamin D for breakfast, lunch, and dinner, so they don’t need to make it. If you ever had a grandmother from the Old World try to force cod liver oil down your throat, she was onto something for the same reason—since it’s full of vitamin D, cod liver oil was one of the best ways to prevent rickets, especially before milk was routinely fortified with it.
IF YOU’RE WONDERING how people who have dark skin make enough vitamin D despite the fact that their skin blocks all those ultraviolet rays, you’re asking the right questions. Remember, ultraviolet rays that penetrate the skin destroy folate—and ultraviolet rays that penetrate the skin are necessary to create vitamin D. Dark skin evolved to protect folate, but it didn’t evolve with a switch—you can’t turn it off when you need to whip up a batch of vitamin D. So that would seem to create a new problem for people with dark skin—even if they lived in a sunny climate—because even though they received plenty of exposure to ultraviolet rays, the skin color that protected their supply of folate would prevent them from stocking up on vitamin D.
It’s a good thing evolution’s such a clever sort, because it took that into account—it kept room for a little guy called apolipoprotein E4 (ApoE4) in the gene pool of dark-skinned population groups. And guess what ApoE4 does? It ensures that the amount of cholesterol flowing through your blood is cranked up. With more cholesterol available for conversion, dark-skinned people can maximize the use of whatever sunlight penetrates their skin.
Much farther to the north, without a similar adaptation, the light-skinned people of Europe would face a similar problem. There, instead of plenty of sunlight that was largely blocked by dark skin, they had to deal with too little sunlight to make enough vitamin D even with the benefit of their light skin. And sure enough, ApoE4 is also common throughout Northern Europe. The farther north you go up the continent, the more you’ll find it. As it does in Africans, the ApoE4 gene keeps cholesterol levels cranked up, allowing its carriers to compensate for limited ultraviolet exposure by maximizing the cholesterol available for conversion to vitamin D.
Of course, in characteristic evolutionary fashion, ApoE4 comes with a trade-off. The ApoE4 gene and all the extra cholesterol that accompanies it put people at greater risk for heart disease and stroke. In Caucasians, it even carries a higher risk for development of Alzheimer’s disease.
And as you’ve seen with iron loading and diabetes—one generation’s evolutionary solution is another generation’s evolutionary problem, especially when people no longer live in the environment that their bodies adapted to through evolution. (If you want a funny-sounding example of an environmental defense turned environmental hazard, you need look no further than your nose. ACHOO syndrome—its full name is autosomal dominant compelling helio-ophthalmic outburst syndrome—is the name of a “disorder” that causes uncontrolled sneezing when someone is exposed to bright light, usually sunlight, after being in the dark. Well, way back when our ancestors spent more time in caves, this reflex helped them to clear out any molds or microbes that might have lodged in their noses or upper respiratory tract. Today, of course, when someone is driving through a dark tunnel and emerges into the bright sun and gets a sneezing fit, ACHOO isn’t helpful or funny at all—it can be downright dangerous.) But before we examine more instances of the effect a new environment has on old adaptations, let’s take a look at another example of different population groups taking divergent evolutionary paths—this time, not just for environmental reasons, but for cultural reasons too.
IF YOU’RE OF Asian descent and have ever had an alcoholic beverage, there’s a fifty-fifty chance your heart rate shot up, your temperature climbed, and your face turned bright red. If you’re not Asian but you’ve ever been in a bar frequented by people with an Asian background, chances are you’ve seen this reaction. It’s called Asian flush or, more formally, alcohol flush response. It happens to as many as half of all people of Asian descent, but it’s uncommon in just about every other population group. So what’s the story?
When you consume alcohol, your body detoxifies it and then extracts calories from it. It’s a complex process that involves many different enzymes and multiple organs, although most of the process takes place in the liver. First, an enzyme called alcohol dehydrogenase converts the alcohol into another chemical called acetaldehyde; another enzyme—cleverly called acetaldehyde dehydrogenase—converts the acetaldehyde into acetate. And a third enzyme converts that into fat, carbon dioxide, and water. (The calories synthesized from alcohol are generally stored as fat—beer bellies really do come from beer.)
Many Asians have a genetic variation (labeled ALDH2*2) that causes them to produce a less powerful form of acetaldehyde dehydrogenase—one that isn’t as effective in converting acetaldehyde, that first by-product of alcohol, into acetate. Acetaldehyde is thirty times as toxic as alcohol; even very small amounts can produce nasty reactions. And one of those reactions is the flushing response. That’s not all it does, of course. After even one drink by people who have the ALDH2*2 variation, the acetaldehyde buildup causes them to appear drunk; blood rushes to their face, chest, and neck; dizziness and extreme nausea set in—and the drinker is on the road to a nasty hangover. Of course, there’s a side benefit to all this—people who have ALDH2*2 are highly resistant to alcoholism. It’s just too unpleasant for them to drink!
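The two-step pathway above can be sketched as a toy simulation. Everything about this model is hypothetical—the rates, the units, the step count—but it shows the core idea: when the second enzyme works at a fraction of its normal speed, its substrate, acetaldehyde, pools in the body.

```python
# A toy, discrete-time sketch of the pathway described in the text:
#   alcohol --(alcohol dehydrogenase)--> acetaldehyde
#   acetaldehyde --(acetaldehyde dehydrogenase, ALDH)--> acetate
# All rates and quantities are hypothetical illustration, not physiology.

def peak_acetaldehyde(alcohol=10.0, adh_rate=0.5, aldh_rate=0.5, steps=100):
    """Return the highest acetaldehyde level reached during the simulation."""
    acetaldehyde, peak = 0.0, 0.0
    for _ in range(steps):
        converted = min(alcohol, adh_rate)      # step 1: alcohol -> acetaldehyde
        cleared = min(acetaldehyde, aldh_rate)  # step 2: acetaldehyde -> acetate
        alcohol -= converted
        acetaldehyde += converted - cleared
        peak = max(peak, acetaldehyde)
    return peak

normal = peak_acetaldehyde(aldh_rate=0.5)   # fully functional ALDH
variant = peak_acetaldehyde(aldh_rate=0.1)  # weakened ALDH, as with ALDH2*2
print(normal, variant)
```

With a full-strength second enzyme, acetaldehyde is cleared as fast as it is made; cut that clearance rate and the toxic intermediate accumulates many times higher, which is the mechanism behind the flush, the nausea, and the built-in deterrent to heavy drinking.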
In fact, the resistance to alcoholism is so strong in people with ALDH2*2 that doctors often prescribe alcoholics a drug called disulfiram, which essentially mirrors the ALDH2*2 effect. Disulfiram (Antabuse) interferes with the body’s own supply of the acetaldehyde dehydrogenase enzyme, so anyone who drinks alcohol while taking it ends up with something that looks an awful lot like Asian flush and feels truly awful to boot.
So why is the ALDH2*2 variation so common among Asians and virtually nonexistent among Europeans? It’s all about clean water. As humans began to settle in cities and towns, they got their first taste of the sanitation and waste management problems that still plague cities today—but without even the possibility of modern plumbing. This made clean water a real challenge, and some theories suggest that different civilizations came up with different solutions. In Europe, they used fermentation—and the resulting alcohol killed microbes, even when, as was often the case, it was mixed with water. On the other side of the world, people purified their water by boiling it and making tea. As a result, there was evolutionary pressure in Europe to have the ability to drink, break down, and detoxify alcohol, while the pressure in Asia was a lot less.
Alcohol isn’t the only beverage that requires some specific genetic mutation to enjoy, by the way. If you’re reading this while sipping a latte or slurping an ice cream cone, you’re a mutant. The great majority of the world’s adults cannot eat or drink milk without experiencing a very unpleasant digestive reaction; once they no longer feed on breast milk, their bodies stop producing the enzyme that we need to digest lactose, the main sugar compound in milk. But if you can drink milk without the characteristic bloating, cramping, and diarrhea that signify lactose intolerance, you’re a lucky mutant. You probably are descended from farmers who drank animal milk; somewhere along the line, a mutation sprang up that allowed people to keep producing the lactose-processing enzyme called lactase as adults, and that mutation spread throughout farming populations until it landed in your genome.
PEOPLE OF AFRICAN descent have darker skin and are much more likely to have a gene that causes them to produce greater amounts of cholesterol. People of Northern European descent have pale skin and are much more likely to have iron loading and a predisposition for Type 1 diabetes. People of Asian descent are much more likely to be unable to process alcohol efficiently. Are those racial differences?
It’s not a question that can be easily answered. First of all, there’s no real agreement as to what race means. On the genetic level, it’s pretty clear that skin color isn’t reliable. We’ve already discussed how the skin color of a transplanted population would change to match the level of ultraviolet exposure in its new environment. Recent genetic studies bear this out—in terms of common genetics, some dark-skinned North Africans are probably closer to light-skinned Southern Europeans than they are to other Africans with whom they share skin color.
On the other hand, many Jews seem to share a distinct genetic heritage despite the fact that they may be fair, blond, and blue-eyed or dark, black-haired, and brown-eyed. This has been borne out by recent research as well. Jews divide themselves into three groups to preserve certain religious traditions. The groups are based on biblical descent—the Cohanim are members of the priestly line that traces its roots to Moses’ brother Aaron, the original high priest. Levites are descendants of the tribe of Levi, the traditional princes of the temple. Today, descendants of the other tribes are simply called Israelites.
A group of researchers recently compared the DNA of a large group of Cohanim to the DNA of a large group of Israelites. The researchers were stunned to discover that—despite being spread across the world—the genetic markers of the Cohanim were so specific that they were all almost certainly descended from just a few male individuals. They came from Africa, from Asia, from Europe—and though their appearance ran the gamut from light-skinned and blue-eyed to dark-skinned and brown-eyed, most of them shared very similar Y chromosome markers. This controversial data even allowed the researchers to estimate when the originators of the Cohanim genes were alive. According to the researchers, that would have been 3,180 years ago, between the exodus from Egypt and the destruction of the First Temple in Jerusalem—or exactly when Aaron walked the earth.
NATURE GENETICS, A prominent journal, recently editorialized that “population clusters identified by genotype analysis seem to be more informative than those identified by skin color or self-declaration of race”—that makes a lot of sense. Instead of worrying about whether or not there are distinct “races,” let’s concentrate on what we do know and use that to advance medical science. What we do know is that distinct populations do share distinct genetic heritages, which are almost certainly the result of different evolutionary pressures our various ancestors experienced as they settled and resettled across the globe.
The current mainstream consensus is that modern humans evolved in Africa around 250,000 years ago. According to that theory, they migrated from Africa northward toward what is now the Middle East. Then some went right, populating India, the Asian coast, and ultimately, the Pacific Islands. Other groups headed left, settling across Central Europe. Still others continued north, spreading across Central Asia or venturing farther, by boat or by ice bridge, over the top of the world and then down into North and South America. All of that migration probably took place within the last 100,000 years. Of course, we don’t know for sure yet. It’s also possible that humans evolved in multiple places, and that different groups of prehumans and Neanderthals even interbred.
Whatever the truth is, it’s clear that, as humanity evolved, different groups of humans encountered widely different circumstances—from infectious tropical diseases to sudden ice ages to pandemic plagues. The evolutionary pressure that accompanied all these challenges was probably intense enough to account for the differences we see between populations today. We’ve discussed a few examples, but the range is broad. Skull shape, for example, may have evolved as a mechanism to facilitate storage and release of heat depending on a population’s climate.
Dense hair on the forearms and legs—the parts of the body usually exposed even with moderate dress—may have been a defense against malaria carried by mosquitoes. The densest hair is generally found in the same places where malaria is most common—the eastern Mediterranean basin, southern Italy, Greece, and Turkey. The exception is Africa, where the heat was an evolutionary counterweight to thick body hair; there, people are instead prone to sickle-cell anemia, which, as we’ll discuss, offers some protection from malaria.
It’s also important to remember that, in migratory terms, humanity has been on an express train for the last 500 years. The result, of course, is a blurring of genetic distinctions as people from different parts of the world meet and mate. Populations have always tended to combine genetic material (aka making babies) with nearby populations, but that genetic intermixing is taking place on a global scale today. In fact, genetic testing is revealing that the human population as a whole is already far more mixed than most people assume. Take Dr. Henry Louis Gates, for example, the distinguished scholar who is the chair of African and African American Studies at Harvard. Dr. Gates is black, but he and his family have long believed that they had at least one distant ancestor who wasn’t black, most likely a former slave owner thought to have been involved with his great-great-grandmother. And then some genetic testing revealed that Dr. Gates had no relationship to the slave owner—but fully 50 percent of his genetic heritage was European. Half of his ancestors were white.
Finally, we have to keep in mind that, in the right circumstances, heavy evolutionary pressure can breed a trait into—or out of—a population’s gene pool in just a generation or two.
When you combine the possibility of relatively fast changes in a given gene pool with the rapid migration of the last 500 years, you can understand that population subsets with distinct genetic traits can emerge pretty quickly. A controversial theory looks to a shameful period in our history to explain the high rate of high blood pressure among African Americans.
High blood pressure, or hypertension, is a particularly insidious disease—it’s responsible for as much as 25 percent of end-stage kidney failure, but it usually has no noticeable symptoms; that’s why it’s often called the “silent killer.” It is almost twice as common among African Americans as it is in the rest of the American population. Doctors first noticed the elevated incidence of high blood pressure in African Americans in the 1930s and assumed that all blacks shared a propensity for it. They were wrong. Blacks living in Africa do not have the same rate of hypertension as people of African descent in America. What’s the explanation?
You’ve probably heard that salt can raise your blood pressure. Research has demonstrated that this is especially true for African Americans; their blood pressure is very reactive to salt. Now, salt also got a bad rap for a while, especially when it was first linked to high blood pressure, but it’s a critical component of your body chemistry. It regulates fluid balance and nerve cell function. You can’t survive without it. But when people who are especially reactive to it eat a diet high in salt, it can contribute to high blood pressure.
When Africans were taken to America against their will by slave traders, they were transported under horrible conditions—they usually weren’t fed or even given sufficient amounts of water. The death rate was very high. It’s possible that those with a natural propensity to retain high levels of salt had a better chance to survive—the extra salt helped them to maintain enough water to avoid fatal dehydration. If that’s true, you can see how the slave trade might have produced a very unnatural selection for an increased ability to retain salt in many African Americans. When you couple that ability with a modern diet high in salt, it results in increased rates of hypertension.
FROM A MEDICAL perspective, it’s clear that specific diseases are more prevalent in specific population groups in a way that is significant and deserves continued, serious exploration. On a proportional basis, African Americans have almost twice as many fatal heart attacks as European and South Asian Americans; their rate of cancer is 10 percent higher. European Americans are more likely to die of cancer and heart disease than Latino, Asian, or Native Americans. American Latinos are more likely to die of diabetes, liver disease, and infectious disease than non-Latinos. And Native Americans have higher rates of tuberculosis, pneumonia, and influenza. It seems like new examples crop up every month in the scientific literature. The most recent study discovered that African Americans who smoke a pack of cigarettes a day are far more likely to develop lung cancer than whites with the exact same habit.
Now, these statistics don’t necessarily tell the whole story. For starters, they don’t always control for other differences in these groups that have nothing to do with genetics and evolution. Differences in diet and nutrition, environment, personal habits, and access to health care will all have an effect on these studies. But that doesn’t mean we should ignore the large trends we see among different population groups—to the contrary, the more we understand how our evolution has shaped our genetic makeup, the more we can understand how to live a healthy life today. Let’s look at a few examples.
We’ve discussed two parallel adaptations to manage the sun’s dueling effects on body chemistry—the evolution of dark skin to protect our stores of folate and the evolution of a genetic trigger for increased cholesterol to maximize production of vitamin D. Both of those adaptations are common in people of African descent and are effective—in the bright, strong sun of equatorial Africa.
But what happens when people with those adaptations move to New England, where the sun is much less plentiful and far less strong? Without enough sunlight to penetrate their dark skin and convert the additional cholesterol, they’re doubly vulnerable—not enough vitamin D and too much cholesterol.
Sure enough, rickets—the disease caused by a vitamin D deficiency that causes poor bone growth in children—was very common in African American populations until we started routinely fortifying milk with vitamin D in the last century. And there appear to be connections among sunlight, vitamin D, and prostate cancer in African Americans as well. There is growing evidence that vitamin D inhibits the growth of cancerous cells in the prostate and in other areas, including the colon, too. Epidemiologists, who specialize in unlocking the mystery of where, why, and in whom disease occurs, have found that the risk of prostate cancer for black men in America climbs from south to north. When it comes to prostate cancer in black men, the risk is considerably lower in sunny Florida. But as you move north, the rate of prostate cancer in black men climbs until it peaks in the often cloud-covered heights of the Northeast. There is a growing belief among some researchers that a lack of vitamin D may also be one of the reasons we get sick more often in the winter than in the summer months.
The combination of excess cholesterol and lack of exposure to sufficient sunlight may well be part of the reason that African Americans have such a high rate of heart disease. The ApoE4 gene keeps the blood full of cholesterol even though there’s not enough sunlight in a northern climate to convert it to vitamin D. As cholesterol builds up, it attaches to the walls of your arteries—eventually, it can build up so much that it results in a blockage that causes a heart attack or a stroke.
The pharmaceutical industry has begun to take the genetic differences of populations into account. This study of how genetic variation can affect pharmaceutical treatment is called pharmacogenetics, and it’s already producing results. There’s a general consensus that some of the usual therapies for hypertension, for example, don’t work as well for African Americans. The U.S. Food and Drug Administration (FDA) recently approved a controversial drug called BiDil for “self-identified” black patients who have heart failure.
New research has demonstrated that it’s not just the presence of a specific genetic variation that can affect our body chemistry (and thus, the way we respond to a given drug)—it’s how many times that gene occurs in our genome. In other words, it’s quantity and quality.
For example, a gene called CYP2D6 affects the way people metabolize more than 25 percent of all pharmaceuticals—including very common drugs like decongestants and antidepressants. People who have very few copies of this gene are called “slow metabolizers.” It’s thought that up to 10 percent of Caucasians fall into this category, but only 1 percent of Asians fit the bill. If you’ve ever taken a standard dose of Sudafed and felt a tingling sensation and a rapid heartbeat, you’re probably a slow metabolizer, and you should talk to your doctor about cutting your dosage.
On the other end of the spectrum are ultrarapid metabolizers; these folks can have as many as thirteen copies of the CYP2D6 gene! Of Ethiopians, 29 percent are metabolizers on hyperspeed, compared to less than 1 percent of Caucasians. The more we learn about the way genetic makeup affects an individual’s response to a given drug, the more it’s clear that “personalized medicine,” where dosing and drugs are tailored to fit your genome, has the potential to provide significant health benefits.
Scientists suspect that the presence and quantity of genes like CYP2D6 in different populations are related to the relative toxicity of a specific population’s environment. Fast metabolizers can “clear”—detoxify—harmful substances more successfully. So the more toxins—from food, insects, whatever—in a particular environment, the more evolution favored multiple copies of toxin-clearing genes. Sometimes that fast metabolizing can be a problem too: some fast metabolizers actually convert certain drugs—like codeine—into much more potent forms. There was a recent report of a patient who became ill because she converted the codeine in her prescription cough syrup into morphine much faster than anyone expected. Sure enough, she was a CYP2D6 fast metabolizer.
Another gene, this one called CCR5-Δ32, appears to prevent human immunodeficiency virus (HIV) from entering cells. One copy of this gene significantly hampers the virus’s ability to multiply, reducing the viral load in people who carry the gene and become infected. And two copies of the gene? Almost complete immunity from HIV. Tragically, CCR5-Δ32 is almost completely absent in Africans, where AIDS is epidemic, but it occurs in some 5 to 10 percent of Caucasians. Some researchers have suggested that CCR5-Δ32 was selected for in the same way hemochromatosis was—because it offered some type of protection against the bubonic plague—but, unlike hemochromatosis, no clear mechanism for this selection has been suggested.
One thing is clear—there is mounting evidence that where our ancestors came from, how they adapted to manage their environment, and where we live today all combine to have a significant impact on our health. That understanding ought to inform everything from research in the laboratory to medical care in the doctor’s office to life in our homes. Today, the most widely prescribed therapy for high cholesterol is a class of drugs called statins. Although they are considered generally “safe” drugs, over time, statins can cause serious side effects, including liver damage. If you knew that you might be able to reduce your excess cholesterol by getting enough sunlight to convert it to vitamin D, wouldn’t you rather hit the tanning salon before starting a lifetime of Lipitor?
That’s food for thought.