Curious Folks Ask: 162 Real Answers on Amazing Inventions, Fascinating Products, and Medical Mysteries - Sherry Seethaler (2009)
Chapter 6. Assorted ailments
It seems that those who have compromised immune systems, such as AIDS patients, are more vulnerable to both infectious diseases and certain types of cancer. Please explain what science currently knows about immunity to cancer relative to immunity to infectious diseases.
Our immune systems are trained to tolerate “self” and to attack invaders. Because cancer cells start out as normal cells in the body, the immune system tends to ignore them.
Nevertheless, many cancer cells have cancer-specific proteins on their surfaces, and efforts are under way to develop vaccines to teach patients’ immune systems to recognize these proteins as foreign and to attack the cancer. Because cancer is not one disease, but more than 100 diseases characterized by the uncontrolled growth of cells, vaccines must be tailored for specific cancers.
In terms of how susceptibility to cancer is influenced by the state of one’s immune system, research has revealed a paradox. On the one hand, people with AIDS and other conditions that suppress the immune system are more susceptible to cancers that are associated with infectious agents, such as human papillomavirus. Infectious agents are thought to cause at least 15 percent of human cancers.
On the other hand, people with chronic inflammatory conditions—in which the immune system is overactive—also have a greater risk of getting cancer. Some chronic inflammatory disorders are triggered by infections, and others are genetic. Long-term usage of anti-inflammatory drugs has been shown to reduce cancer risk.
If the immune response simply consisted of Pac-Man-like immune cells gobbling up nasty invaders, a chronically “on” immune system would be more likely to find cancer cells and annihilate them. But the immune response also involves chemical warfare, which is effective against nasty microbes, but can also damage the DNA of healthy cells and cause them to become cancerous.
The immune system is not all about warmongering. It also plays a role in reconstruction efforts, and conditions that favor growth of normal cells also favor the growth of cancer cells. For example, angiogenesis—expansion of the network of blood vessels—is needed for wound repair, but it also provides the supply routes that nourish a growing tumor.
Further complicating the immune system and cancer story are reports in the older medical literature of acute infections—those triggering an “on/off” immune response—leading to spontaneous remission of tumors. Some historical cancer treatments involved injecting patients’ tumors with bacterial toxins. One analogous conventional therapy is approved by the U.S. Food and Drug Administration. It uses a live bacterium, applied directly to the tumor, in the treatment of bladder cancer.
Why do chocolate and oily foods produce acne?
Isaac Asimov once said, “The first law of dietetics seems to be: If it tastes good, it’s bad for you.” Teenagers will probably find that Asimov’s words echo the advice they receive on diet and acne—advice that is a blend of truth and folklore.
Acne is less prevalent in rural, nonindustrialized societies. For example, acne was absent in Inuit people when they were following traditional ways of living and eating. After their transition to modern life, the prevalence of acne became similar to that in Western societies.
Some scientists have therefore speculated that acne is caused by high-glycemic diets—diets high in sugar and refined starches—which are typical of adolescents in Westernized societies. They hypothesize that frequent consumption of high-glycemic food leads to chronically elevated insulin. High insulin may initiate a hormone cascade that leads to overproduction of skin cells in pores and results in blockages.
It is a tantalizing hypothesis, but testing it satisfactorily requires getting large numbers of people to change their diets radically for a significant period of time. At least one such study is under way.
Studies of the effect on acne of individual foods, including sugary foods like chocolate and greasy foods like French fries and pizza, typically have found no relationship between consumption of the food and acne. Chocolate has been vindicated in several studies.
Milk was implicated in one recent study, in which women were grouped according to how many milk products they recalled consuming as teens. In the highest-consumption group, a slightly higher percentage of women had had severe acne as teens compared to the lowest-consumption group. The researchers speculated that the many hormones that occur naturally in milk could cause acne. However, this study does not prove that milk causes acne. It is equally possible that acne causes milk consumption—due to pressure by family members telling acne sufferers that drinking milk will clear up their skin.
Dermatologists think that iodine sensitivity may be responsible for a small number of cases of acne. Iodine in kelp, shellfish, and certain mineral supplements could irritate pores in sensitive individuals. Incidentally, milk also contains iodine from supplements fed to cows and sanitizing solutions used on udders.
So although a healthy diet is good advice for everyone, and some people may have sensitivities to certain foods, acne is a complex disease. No compelling evidence exists that any one food is a complexion nightmare.
Much ado about nothing
What causes allergies? Why are some people allergic to certain substances while others are not?
The National Institute of Allergy and Infectious Diseases reports that allergies affect more than 50 million Americans and cost the health care system $18 billion annually.
An allergic reaction occurs when the body mobilizes defenses against a harmless substance. On encountering an allergen such as grass pollen, immune cells in the body of an allergy-prone individual manufacture large amounts of a type of antibody called IgE.
IgE antibodies attach themselves to mast cells, which usually produce chemicals in response to invading microbes. When the IgE antibodies encounter the allergen they recognize, they stimulate the mast cells to produce chemicals, including histamine, which act on the blood vessels, mucous glands, and other organs and produce the symptoms of the specific allergic condition.
Common respiratory allergens are pollen, mold spores, dust mites, and animal fur and dander. Latex, insect bites, drugs (for instance, penicillin), and jewelry can also cause allergic reactions. Common food allergens are milk, eggs, nuts, wheat, and seafood.
Not all food intolerances are allergies. For example, many people lack an enzyme needed to break down lactose—a sugar found in milk. Lactose intolerance is different from an allergy, which involves producing antibodies to a component of milk. Even so, the symptoms of food intolerances and allergies can be similar.
Researchers have estimated that more than 50 genes influence people’s susceptibility to developing allergies. For example, some people have an inherited tendency to produce excessive amounts of the IgE antibody. Hay fever, asthma, eczema, and food allergies are common in these families.
Nongenetic factors play an important role as well. For example, infants who have been breastfed are less likely to develop allergies. Psychological stress has been shown to aggravate allergy symptoms.
The “hygiene hypothesis” was introduced to explain the rising incidence of allergies in Western societies ever more obsessed with cleanliness. It suggests that exposure to airborne allergens and microbes early in life prevents our immune systems from becoming overly sensitive, reducing allergy risk. However, the hypothesis fails to explain the high rates of asthma and allergies in youth exposed to inner-city air pollution.
With respect to food allergens, frequent, early exposure may increase allergy risk. For example, rice allergy is more common in children in Japan, and fish allergy is more common in Scandinavia. Researchers think this is because children have more porous or “leaky” intestines than adults, making it easier for proteins to get into the bloodstream and initiate an immune reaction.
Why do we hiccup?
Hiccups are involuntary contractions of the diaphragm—the main muscle used for breathing. The resulting intake of air is abruptly interrupted by the closing of the glottis—the opening between the vocal cords—leading to the characteristic sound.
Eating too fast, or anything else that triggers a sudden spasm of the diaphragm, can cause hiccups. They are more likely to occur when the stomach is stretched following a meal. Certain medical conditions can also cause hiccups, such as a stroke that interferes with the part of the brain that regulates breathing, or an irritation of the diaphragm due to pneumonia.
Opinions vary on whether hiccups serve any purpose. One proposal is that hiccups are an evolutionary by-product of respiratory behaviors in lower vertebrates. Another is that the hiccup may help open the sphincter—a ring-like band of muscle—in the lower esophagus, permitting the escape of gas from the stomach and thus relieving pressure.
What causes hiccups, and why does taking a lump of sugar with a few drops of raspberry-flavored vinegar stop them? (That is my son’s cure, and it has never failed. We have never tried it with plain or any other type of vinegar.)
There are many folk remedies for hiccups. Some of them, such as drinking from the far side of a glass, may be more interesting to watch than effective. Interrupting the normal respiratory cycle by holding your breath or being startled (gasping) sometimes provides relief. So can stimulating the back of the throat, as happens when you swallow a lump of sugar. The resulting nerve impulses must shut down the hiccup circuitry.
Presumably these home remedies did not work for Charles Osborne, who made it into Guinness World Records for his 68-year attack of hiccups.
If the sensation of a lump in the throat is referred to as globus syndrome or globus hystericus only if medical tests rule out injury or disease as a possible cause, what kinds of injury or disease could cause the sensation?
When there is a physical rather than psychological explanation for the enduring sensation of a lump in the throat, globus hystericus is not the correct diagnosis. Unfortunately, it is common for people to be misdiagnosed with the syndrome. One study, in which extensive medical tests were conducted on 231 patients who had been diagnosed with globus hystericus, revealed that 80 percent of them had physical conditions responsible for the sensation.
A surprisingly wide range of disorders can cause the uncomfortable sensation of a mass in the throat. The most obvious is a real mass—a benign or cancerous tumor. The sensation can also result from neurological disorders, such as degenerative diseases or damage from a stroke. Low blood sugar and electrolyte disturbances—low levels of calcium, potassium, magnesium, or sodium—may also be responsible.
Another possibility is gastroesophageal reflux disease, or acid reflux. One study employed catheters to measure the acidity along the length of the esophagus of patients with the globus sensation. It showed that acid reflux limited to the lower third of the esophagus could cause the sensation. The researchers speculated that the vagus nerve could transmit the irritation from the lower to upper esophagus. Treatment for acid reflux relieved the globus sensation in the majority of these patients.
An extensive list of possible physical causes for throat lumps can be found in “Globus Hystericus: A Brief Review,” a paper in the journal General Hospital Psychiatry, volume 26 (2004).
I’ve heard it said that when the barometer falls, people have increased joint pain and claim that the pressure caused the pain. What exactly does this mean?
Barometric pressure is the force exerted by the weight of the atmosphere on a given area at a particular location. Pressure is higher where air is slowly descending and lower where it is slowly rising. Low pressure often brings precipitation because air cools as it rises and the moisture in it condenses. High pressure usually is associated with clear weather, because the warming of air as it descends hampers the formation of clouds.
In addition, atmospheric pressure is lower at higher elevations because less atmosphere lies overhead. On weather maps, air pressure is adjusted to factor out altitude. This adjustment reveals the more subtle differences in pressure associated with movements of air that influence weather conditions.
The belief that weather and physical well-being are linked is longstanding. Hippocrates wrote about it in the fifth century B.C. “Wind wet” is the literal translation of the Chinese characters for rheumatism.
The majority of patients with joint inflammation report that their condition is affected by weather. Some claim that the aches in their joints provide an accurate weather forecast. There may be some truth to this, but researchers are still puzzling over the exact relationship between weather and joint ailments.
For example, according to some studies, patients’ joint symptoms flare up when barometric pressure is higher. Other studies have found the exact opposite. Still other studies suggest that joint pain worsens only when the pressure changes. Cold weather has also typically been found to aggravate symptoms, as has increased humidity. Some studies have found that decreased sunlight and increased wind speed worsen symptoms.
The reason for the variability in the research findings is unclear. It may be due to differences in patient populations. The different types of arthritis and other joint problems may respond to weather in different ways. Then again, it may depend on the geographic location in which the study was carried out. Different combinations of weather variables may play a role in different climates.
Several possible mechanisms could account for the effects of weather on pain. Changes in temperature or pressure could make nerve endings more sensitive. Alternatively, because ligaments, tendons, bones, muscles, and scar tissue all have different densities, atmospheric changes could cause pain by contracting and expanding these tissues differentially. Weather patterns also affect mood in some people, which can alter pain perception.
What is the most common way people lose brain cells?
Surprisingly, neither disease nor trauma wipes out the greatest number of brain cells. The grimmest reaper in our brains is normal development. In some parts of the nervous system, it wipes out half the total number of nerve cells generated.
During development, the number of nerve cells rises to a maximum as cells proliferate and migrate to their final destinations. Nerve cells send out “feelers” called growth cones to find one another and form connections. Cells that fail to make appropriate connections are tidily eliminated.
Although this process may seem wasteful, it permits much more flexibility than would be possible if every nerve cell and connection were specified by our genes. Instead, it provides a mechanism through which brain anatomy and function in humans and other complex animals can respond to environmental influences.
The form of cell death that neatly eliminates unwanted cells is apoptosis. The term is derived from a Greek root that means “dropping of leaves off a tree.” During apoptosis cells shrink and display signals on their surfaces that tell other cells to eat them. Neighboring cells or white blood cells known as macrophages engulf the dying cells. Apoptosis is carefully controlled and does not injure the surrounding tissue.
In contrast, following infection, stroke, or trauma (such as when a big rock conks Wile E. Coyote on the head), brain cells mainly die via necrosis. Necrosis is much messier than apoptosis. It involves the leakage of cell contents and inflammation, which can damage nearby cells.
This is not to say that apoptosis is always a good thing. Sometimes apoptosis gets activated abnormally. For example, exposing the developing brain to alcohol can activate apoptosis and delete millions of nerve cells. The resulting damage is responsible for the most disabling features of fetal alcohol syndrome.
Evidence is mounting that apoptosis also plays an important role in many disorders characterized by slow degeneration of the central nervous system, such as Lou Gehrig’s disease (ALS), Parkinson’s disease, Huntington’s disease, and Alzheimer’s disease. Therefore, a great deal of research is directed at understanding what cellular signals activate apoptosis.
In modern society, the most common cause of intellectual deterioration is Alzheimer’s. In the United States, about 2 percent of the population is affected, but more than half of individuals over age 85 may have the disease, according to an article in the American Journal of Medicine. Cell numbers may decline by 20 to 80 percent in certain regions of the brain over the course of one or two decades as the disease progresses.
What happens in the brain of a boxer that causes him to lose consciousness and fall to the mat after a blow to the head?
The fluid surrounding the brain helps cushion it during everyday activities, but blows to the head create mechanical stress on brain tissue. When nerve cells are forcibly stretched and compressed, channels open in the outer membranes of the nerve cells. Ions can then flood into the cells and create a sudden electric discharge, which can result in loss of consciousness.
The electric discharge also leads to the release of neurotransmitters—chemicals nerve cells use to talk to one another. Excessive release of neurotransmitters can injure nerve cells. Damage can also result directly from the shearing strain on brain tissue. Brain scans have revealed that the longer the period of unconsciousness following the blow to the head, the deeper the location of the lesions in the brain.
Approximately 20 percent of professional boxers suffer from chronic traumatic brain injury (CTBI). Symptoms of CTBI are impairments in thought, behavior, and muscle control. When the symptoms are severe, CTBI is known as dementia pugilistica or punch-drunk syndrome. As in Alzheimer’s, the brains of people with CTBI accumulate senile plaques (abnormal deposits of protein) and tangles (twisted bundles of fibers).
How do people become addicted to caffeine?
In North America, 80 to 90 percent of adults use caffeine on a regular basis. The average daily intake among caffeine consumers is 280 milligrams, equivalent to drinking a large mug of coffee and a couple cans of cola. Some studies have shown that drinking as little as one cup of coffee per day can result in caffeine addiction. On the other hand, based on how caffeine affects the brain, some researchers dispute the notion that caffeine is addictive.
Addictive drugs such as cocaine and amphetamines act on the brain’s reward system. In humans and other animals, the normal function of the brain’s reward system is to generate pleasurable feelings to reinforce behaviors that support survival. Drugs hijack the reward system by artificially activating its structures—specifically, the nucleus accumbens, ventral tegmental area, and prefrontal cortex.
At normal doses, caffeine stimulates the prefrontal cortex, but not the other regions of the brain’s reward system. So caffeine’s action on the brain does not look like that of a typical addictive drug. Nevertheless, the withdrawal symptoms that many people experience when deprived of their java can motivate the regular use of caffeine and make it difficult to kick the habit.
In experimental studies, 50 percent of people got headaches when they stopped using caffeine, and 13 percent had withdrawal symptoms serious enough to impair their ability to function normally. In addition to headaches, symptoms of caffeine withdrawal include fatigue, difficulty concentrating, depression, irritability, nausea, and muscle pain. Symptoms usually begin within 12 hours of ceasing caffeine consumption and last for up to a week.
Caffeine initially increases heart rate and alertness because it blocks the action of adenosine, a natural chemical in the body that inhibits nerve activity. Adenosine opens blood vessels, and caffeine constricts them. That is why caffeine is found in some headache medications, such as Anacin; caffeine relieves some headaches by reducing the volume of blood in the brain.
The body responds to regular caffeine exposure by increasing the activity of adenosine. Drowsiness and headaches occur when one abruptly cuts out caffeine because caffeine is no longer around to counteract the effects of adenosine. Gradually reducing caffeine intake gives the body time to adjust and results in fewer withdrawal symptoms than going cold turkey.
The good news is that as long as caffeine is not keeping you awake at night, little evidence exists to suggest that caffeine consumption is harmful to your health.
Is there such a thing as heart cancer? I’ve never read anything about it.
Yes. From autopsy data, it is estimated that heart cancer afflicts between one in 10,000 and one in 100,000 people. Because it is very rare, it does not get much media attention. Similarly, we do not usually hear about male breast cancer, but around 1 percent of all breast cancers occur in men.
According to the World Health Organization, 10 million people are diagnosed with cancer annually. The most common cancers worldwide are cancers of the skin (approximately one-third of newly diagnosed cancer cases), lung cancer (12 percent), breast cancer (10 percent), and cancer of the rectum/colon (9 percent). Cancers of the stomach, prostate, liver, cervix, esophagus, bladder, lymph nodes, blood (leukemia), and mouth/throat are the next most common types of cancer.
The majority of cancers originate in epithelial cells, which form the outer layers of the skin and line the digestive, respiratory, reproductive, and urinary systems. Because epithelial cells divide very quickly (relative to muscle cells, for example), ample opportunity exists for something to go awry. The heart is mainly muscle tissue, and most cases of heart cancer are due to tumors that originated in other parts of the body (for example, in the lung or breast) and spread to the heart.
Our skin cells are continually dying, sloughing off, and being replaced by new cells. Why do we get skin cancer from sun-damaged skin when those damaged cells are being replaced continually by new cells?
The outer layer of our skin—the stratum corneum—consists of dead skin cells called keratinocytes. As you point out, these cells are constantly being shed and replaced by keratinocytes from deeper layers. A keratinocyte lives for about a month, but cancer often takes many years to develop, because a cell must accumulate, on average, about five mutations before it becomes cancerous. Indeed, skin cancer would be much more prevalent if skin cells were not replaced so frequently.
The three most common forms of skin cancer are melanoma, basal cell carcinoma, and squamous cell carcinoma. Melanoma is cancer of the melanocytes—the cells that produce the dark pigment melanin in our skin. Melanocytes are not replaced rapidly like the keratinocytes. However, basal cell carcinoma and squamous cell carcinoma are cancers of the keratinocytes. Scientists now believe these two forms of skin cancer arise in keratinocyte stem cells.
One characteristic of a stem cell is that it renews itself when it divides. When a keratinocyte stem cell divides, one of the two daughter cells eventually makes its way to the skin surface and flakes off, but the other remains behind as a stem cell. If the parent stem cell had mutations, these are passed along to each daughter cell. The daughter stem cell (or its descendants) can reside in the skin long enough to accumulate multiple mutations and become cancerous.
On the mend
A year ago I was diagnosed with Type 2 diabetes. I was started on medication. I also began a program of diet and exercise. As of today I have lost 60 pounds and, with my doctor’s approval, have stopped taking diabetes medication. My last blood sugar test was in the normal range. Do I still have diabetes?
A basic dictionary definition—to cure is to restore to health—implies that your diabetes is cured, but this definition is deficient when it comes to complex diseases. According to Steve Edelman, a diabetes specialist at the University of California San Diego School of Medicine, your diabetes is totally controlled, but not cured. We do not yet have a cure for diabetes. If you regained the weight, your diabetes would return, which would not be the case if your diabetes were truly cured.
Even for infectious diseases, the definition of cure is not straightforward. According to Francesca Torriani, an infectious-disease specialist at the UC San Diego School of Medicine, a cure means that no signs of infection (sickness, inflammation) are evident and that the tests for the disease agent are negative. However, some people who test positive for a bacterium or virus never become ill, because people’s susceptibility and immunity influence who gets an infection.
Cancer is arguably the most difficult disease to classify as cured, because cancer is not a single disease, but a collection of more than 100 different diseases. According to Greg Daniels, a melanoma expert at the UC San Diego Moores Cancer Center, curing cancer means that the cancer is no longer there and it never comes back.
He points out that this definition is retrospective. Two patients with the same cancer, given the same treatment, who seem to respond equally well in the short term, may fare differently in the long term. For example, treatment for Stage 1 melanoma cures 90 percent of patients, but 10 percent will have a recurrence. Until a recurrence happens, the cured and uncured patients are indistinguishable.
Even the first part of the definition—no longer there—is tricky, because it depends on the sensitivity of the tests for that cancer. PET scans and CAT scans make it possible to detect clusters of billions of cancer cells, but a cluster of a million cells would not be visible.
Therefore, for all diseases, signs of the disease must be absent before a patient can be considered cured, but concluding that a disease is cured often requires the test of time. Some diseases, although treatable, are incurable.
What exactly is dandruff? Why doesn’t it spread to a person’s eyebrows, mustache, or beard? And, most of all, what makes it itchy?
For more than 100 years, a fungus called Malassezia has been implicated in dandruff. Puzzlingly, however, Malassezia is found naturally on all of us, and people with dandruff do not have more of it than people without dandruff. Only recently did scientists discover how otherwise-harmless microorganisms cause dandruff in some people but not others.
Malassezia feasts on sebum, the oil produced by the skin. Sebum is a mixture of many different oily, fatty, and waxy substances. The scalp is not the only place the fungus can get a free lunch, so dandruff can occur elsewhere, including eyebrows, forehead, and behind the ears. Malassezia is a picky eater, consuming only certain fats and releasing broken-down fats as waste products. In doing so, the fungus changes the composition of skin oils.
Sebum usually lubricates and protects the skin, but a recent study determined that applying waste fats produced by the fungus to the scalps of dandruff-prone people caused irritation and skin flaking. Under a microscope the skin flakes looked like normal dandruff. On the other hand, individuals who were not dandruff-susceptible did not get dandruff when the fats produced by Malassezia were applied to their scalps.
The researchers concluded that the skin of dandruff sufferers is more permeable than that of non-dandruff sufferers, which allows the fats from Malassezia to irritate the skin, leading to itchiness and excess turnover of scalp cells. Changes in the skin’s permeability could also explain why some people suddenly develop dandruff when they are under stress or have a weakened immune system.
The occurrence of dandruff during development and maturity follows the pattern of sebum production, which is under hormonal control. The glands that produce sebum are active at birth under the control of maternal hormones. This allows initial Malassezia colonization. Malassezia is likely an irritating factor for cradle cap, a scalp flaking disorder in infants.
The glands then shrink and produce less sebum, so Malassezia populations, and with them the incidence of dandruff, decline until puberty, when sebum production surges. Sebum production drops again after menopause in women and after age 50 or 60 in men.
Dandruff is prevalent, affecting more than 50 percent of adults. Active ingredients in the many antidandruff shampoos differ, but antifungal activity usually is the common mechanism of action.
What causes recurring muscle cramps? Are there ways to prevent them?
According to Joseph Scherger, a professor and physician at the University of California San Diego School of Medicine, cramps can occur because of an imbalance in any of several substances, such as calcium and potassium. He recommends that people suffering from cramps see their primary care physician, who, after initial tests, might refer them to a rheumatologist—a doctor who specializes in disorders of the joints and muscles.
A search of the medical literature turns up a long list of possible causes of muscle cramps. In addition to imbalances of electrolytes (magnesium, calcium, potassium, sodium) and dehydration, cramps can occur as a side effect of certain medications and as a symptom of many diseases, including diabetes, thyroid disease, and peripheral vascular disease—narrowing of the blood vessels, especially in the legs.
Treating an underlying disease would be the first course of action in eliminating muscle cramps. However, muscle cramps often turn out to be “idiopathic”—of unknown cause.
To prevent cramps, the authors of the Harvard Health Letter recommend staying hydrated, especially because, as we get older, our thirst impulse gets weaker, and we may forget to drink. In addition, they report that the average American does not consume an adequate amount of potassium. They recommend almonds and fruits and vegetables including bananas, oranges, spinach, lettuce, and mushrooms as good sources of potassium.
They also suggest wearing comfortable, supportive shoes, stretching your muscles regularly, and making sure your bed covers are not too snug. Tight covers can press on your feet and tighten the muscles in your calf and foot. Tight muscles are more susceptible to cramping.
Medications and vitamin E, which has proven helpful in some studies, are other options that patients might explore in consultation with their doctors.
What makes the head ache during a headache? What is different with a migraine?
Headaches come in a surprising variety, with different causes and mechanisms. The latest diagnostic classification by the International Headache Society lists well over 200 kinds of headaches, including the tension-type headache, the migraine, the cluster headache, the alcohol-induced headache, and the headache attributed to ingestion of a cold stimulus—the ice cream headache to you and me.
“You make my brain hurt” is a favorite retort to an annoying person in comics and comedies, but brain tissue does not feel pain. A patient can be awake during brain surgery (allowing the surgeon to monitor what is safe to cut) and not feel the knife. Headaches arise from other structures in the head and neck, including skin, joints, muscles, sinuses, and the blood vessels in the dura mater—a tough covering around the brain.
The tension-type headache is the kind most everyone has experienced. Nevertheless, its mechanism is not well understood. The old idea that it results from involuntary muscle contractions that block blood flow to the head has been ruled out, but it may have something to do with “a pain in the neck.” Frequent sufferers of tension-type headaches experience increased tenderness of the neck muscles and tendons. Another, albeit controversial, hypothesis is that a tension-type headache is an earlier, less severe phase of a migraine, and that they share a similar mechanism.
Migraines affect 18 percent of women and 6 percent of men each year. The World Health Organization lists migraines among the top 20 causes of disability worldwide. Migraines are severe, often throbbing headaches that intensify with physical activity, and they may be accompanied by nausea and aversion to strong light, smells, or sounds. In some people the migraine is preceded by an aura—neurological symptoms such as flashing lights, blind spots, or numbness.
A migraine begins with a wave of decreased nerve activity that moves across the brain’s cortex (surface layer). The decreased nerve activity, known as cortical spreading depression (CSD), initiates many changes in the levels of chemicals that brain cells use to communicate with each other and dilates the blood vessels in the dura mater. The swelling blood vessels stretch the nerves around them, causing them to send signals to the trigeminal nerve, which relays pain messages in the face and head.
Stress, certain foods, sleep disruptions, skipping meals, and hormonal changes can trigger migraines. Exactly how these factors trigger the CSD, and why the triggers are different for different people, is not yet understood.
What causes the nausea associated with heart attacks?
A region in the brain stem that receives input from the body and other parts of the brain coordinates vomiting. Being able to detect and reject toxins inadvertently consumed with food has an obvious adaptive advantage for an organism. Therefore, it is not surprising that the digestive tract contains sense organs to detect noxious chemicals and convey that information to the brain.
Sense organs that affect nausea are also found in the chest area, including the heart and lungs. Exposing the heart to certain chemicals, mechanically distending it, or electrically stimulating the right cardiac nerve can initiate reflexes involved in vomiting. Sense organs in the left ventricle of the heart that detect tension appear to trigger the nausea associated with heart attacks.
The adaptive benefit of this nausea response is unclear. However, since these same sense organs in the heart may be responsible for the nausea that sometimes accompanies heavy exercise, perhaps their role is to serve as a warning to an organism to prevent fatal overexertion.
My sister has lupus. One of the tests the doctors did was an antinuclear antibodies (ANA) test, which detected antibodies to her cell nuclear contents. But aren’t the nuclear contents contained in two membranes (cell and nuclear), so the white blood cells should have no contact with them? Can the white blood cells tell the self DNA apart from non-self DNA?
DNA and other contents of the cell’s nucleus are indeed carefully contained within healthy cells. In contrast, when cells die, the contents of the nucleus are often released. So our white blood cells—which defend us against invading microbes—do come into contact with DNA from our body’s cells (self DNA).
White blood cells produce protein weapons called antibodies that bind to and neutralize invaders. White blood cells are skilled warriors and make different antibody weapons for different intruders. Therefore, blood tests for many diseases work by identifying specific antibodies in the blood.
For 40 years, the ANA test has been used to help diagnose lupus. Despite the test’s long history, two things remain puzzling. First, if everyone’s white blood cells come into contact with self DNA, why doesn’t everyone have antibodies against it? It turns out that we do, but the antibodies are present in much smaller amounts, and bind to DNA much more weakly, than the anti-DNA antibodies found in lupus patients.
The second puzzle is whether these antibodies play a role in causing the symptoms of lupus. In systemic lupus, the body’s immune system attacks its own tissues, including the joints and internal organs. Antibodies would not harm the body by simply binding to DNA released by dying cells.
Researchers have found that the antibodies tend to collect in the kidneys of lupus patients, where they may penetrate into cells. The exact role of these antibodies in lupus remains under investigation.
White blood cells can tell the difference between self and non-self DNA, or at least between DNA from bacteria and DNA from mammals. This is surprising, because the DNA building blocks, or bases (A, T, G, C), used by bacteria are the same as the ones that make up our genes.
However, although the building blocks are the same, the way they are strung together is different. Specifically, bacterial DNA contains many more sequences that are particularly rich in C and G bases than ours does. Also, in our DNA, the C in these sequences is more likely to be modified by the addition of four atoms called a methyl group. These features allow white blood cells to distinguish bacterial DNA from our DNA.
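The idea that the same four bases can carry a “bacterial” signature is easier to see with a toy sketch. The snippet below simply counts how often the pair “CG” occurs in a stretch of DNA; the two example sequences are invented for illustration, and nothing here is meant as a model of what immune cells literally compute.

```python
# Toy illustration: bacterial DNA tends to contain "CG" base pairs more
# often than mammalian DNA (and our CGs are usually methylated). This
# sketch just measures CG density; the sequences below are made up.

def cg_density(sequence: str) -> float:
    """Fraction of adjacent base pairs in the sequence that read 'CG'."""
    seq = sequence.upper()
    if len(seq) < 2:
        return 0.0
    cg_pairs = sum(1 for i in range(len(seq) - 1) if seq[i:i + 2] == "CG")
    return cg_pairs / (len(seq) - 1)

bacterial_like = "ATCGACGTCGGACGTACGAT"  # hypothetical CG-rich snippet
mammalian_like = "ATTAGCATTAGGCATTAGCA"  # hypothetical CG-poor snippet

print(cg_density(bacterial_like) > cg_density(mammalian_like))  # True
```

Even this crude count separates the two invented snippets; the immune system’s receptors effectively respond to a similar cue, unmethylated CG-rich stretches, in far more sophisticated ways.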
Halo of stars
I’ve always wanted to know what causes the sensation of seeing stars. Today, after swimming laps, I was lying in the sun. As I opened my eyes, about to get up, I saw intermittent small white dots buzzing around for about 30 seconds. My best guess is that it has something to do with oxygen. Can you shed some light on this for me?
There are at least three possible reasons for seeing stars. Sylvester the cat saw stars when Granny walloped him over the head for trying to eat Tweety Bird. A blow to the head can cause the vitreous fluid that fills the back two-thirds of the eyeball to rub against the retina. In fact, as we age, the vitreous fluid becomes thicker and can push or pull on the retina even with more modest movements of the head.
The retina does not feel pain; it just responds to stimulation by sending a light signal, according to David Granet, a professor of ophthalmology at the University of California San Diego School of Medicine. Certain types of exertion cause the “stars” by stimulating the retina. “Of course, a shower of stars, flashing light, or a curtain on vision are all potential warning signs of retinal detachment and should be of concern,” he said.
Injury to the retina should be treated immediately to minimize further tearing and bleeding into the eye. If the damage is not too extensive, retinas can be repaired with a laser on an outpatient basis.
Another reason for seeing stars is small clumps of gel that form in the vitreous fluid. These “floaters” cast a shadow on the retina when they pass in front of it and are most obvious when you are looking at a plain, light-colored background.
The third reason for seeing stars has to do with levels of oxygen and/or nutrients reaching the brain. According to Joseph Scherger, a professor and physician at the UC San Diego School of Medicine, “The brain, including vision, runs on glucose, oxygen, a balance of electrolytes, and ample circulation/blood pressure. One might have visual changes like ‘stars’ if any of these are low.”
Hot, hot, hot
I asked my physician if the temperature of a hot flash can be measured and what it might be, and he said he didn’t know. I’ve tried with my home thermometer, and it always reads normal, but I don’t feel normal! Can you measure the temperature of a hot flash? If so, how? Where in the body does the signal for a hot flash originate?
Hot flashes seem to be triggered by an overly sensitive body thermostat. A useful analogy is a house thermostat that is adjusted so that a temperature increase of a fraction of a degree switches on the air conditioning. A false alarm, such as a waft of heat from opening the oven door, could activate the air conditioning, but since the house was not overly warm to begin with, the AC would rapidly turn off.
When the body’s thermostat, which is located in the hypothalamus of the brain, decides it is too hot, it cranks on the body’s air conditioning—sweating and dilation of blood vessels in the skin. The rush of warm blood to the skin creates the feeling of intense heat characteristic of the hot flash. If the ambient temperature is not actually high, our AC quickly shuts down. The blood vessels in the skin constrict, and the blood drains away, leaving the skin pale and cold.
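The thermostat analogy can be made concrete with a small sketch. The simulation below is purely illustrative: all the numbers are invented, and the point is only that a narrower trigger band switches the “air conditioning” on for tiny temperature blips that a wider band ignores.

```python
# A minimal sketch of the house-thermostat analogy (all values invented):
# the "AC" (sweating and flushing) turns on when temperature rises past
# setpoint + threshold, and turns off once it drops back to the setpoint.

def ac_events(temps, setpoint, threshold):
    """Count how many times the AC switches on over a series of readings."""
    events = 0
    ac_on = False
    for t in temps:
        if not ac_on and t > setpoint + threshold:
            ac_on = True   # blip exceeded the trigger band: AC kicks in
            events += 1
        elif ac_on and t <= setpoint:
            ac_on = False  # back at the setpoint: AC shuts off

    return events

readings = [37.0, 37.05, 36.98, 37.3, 36.9, 37.06, 36.95]  # small blips

normal = ac_events(readings, setpoint=37.0, threshold=0.4)      # wide band
sensitive = ac_events(readings, setpoint=37.0, threshold=0.02)  # narrow band
print(normal, sensitive)  # 0 3
```

With a wide trigger band the same readings cause no AC events at all, while the narrow band fires three times, which is the essence of the “overly sensitive thermostat” idea: the fluctuations are ordinary, but the threshold for reacting to them is much lower.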
Studies have shown that, on average, women who experience hot flashes have a lower core body temperature and a lower sweating threshold—that is, they begin sweating at a lower body temperature—than women who do not experience hot flashes. However, the temperature difference is small, just a fraction of a degree, and therefore requires a very sensitive thermometer to measure.
The standard explanation for the occurrence of hot flashes during menopause is that they are triggered by declining levels of estrogen. Estrogen has been shown to ameliorate hot flashes by increasing the sweating threshold. It is not understood how declining estrogen levels increase the sensitivity of the nerve cells in the brain that control body temperature.
Although considered the hallmark of the menopausal transition, hot flashes can occur at other times of life and can affect both women and men. Also, not all women experience hot flashes during menopause. Research is ongoing to determine if other hormones are involved, and to identify health and lifestyle factors that might increase a woman’s risk of having hot flashes.