The Secret Life of the Grown-up Brain: The Surprising Talents of the Middle-Aged Mind - Barbara Strauch (2010)

Part II. The Inner Workings

Chapter 6. What Changes with Time

Glitches the Brain Learns to Deal With

I was talking with Deborah Burke, a neuroscientist at Pomona College in California, and we were having one of those maddening conversations in which neither of us could remember the name of the person we were talking about.

She was telling me about a dinner she’d been to the night before where she could not, for the life of her, remember the name of a well-known scientist.

“I was at this party and I was talking and I just could not remember Richard Dawkins’s name,” Burke was telling me. “Then I couldn’t remember the name of that evangelical guy, you know, the one who was against homosexuals and then turned out to be one? Oh, you know. What was his name? Oh, dear, now it’s happening again.”

I tried to think myself. I wanted to help. I could see this guy in my mind, picture him giving a news conference. I, too, knew his name well. Just not at that moment.

“I think his name was Ted. Ted something. And I think it had an H in it,” I offered, doing the best I could.

In fact, over the course of our conversation—for the next two hours—neither Burke nor I could come up with this man’s name. At certain points, Burke would stop and wonder out loud, “Oh, what is his name? This is really bugging me. I can see his church. It’s like a big warehouse. What is his name? I’m going crazy.”

We were not going crazy. We were mired in the Swamp of Lost Names.

And this time, I was mired along with one of the leading authorities on why we lose names to begin with. In particular, Burke studies what we call “tip of the tongue,” that gnawing feeling that you know something but you just can’t find it in your brain, a sensation likened to being on the brink of a sneeze.

Over the last few years, Burke has tried to figure out why, in fact, we can’t sneeze. Where are those names anyhow? If they are on the tips of our tongues, why can’t we spit them out?

Despite all the newly recognized powers of the brain in middle age, there are—as we all know—a few glitches as well. The truth is, by midlife, most of our brains show some fraying around the edges and names are often the first edge to go ragged. Even names we know well vanish. We have the strong feeling we know the name, but we just cannot bring it to mind. Who are you?

It’s annoying. It’s frightening. In her research, Burke has found that tip-of-the-tongue incidents, or “Tots,” as she calls them, start to creep in as early as age thirty-five and are a big part of middle age. Tots occur much more often with proper names than with the names of objects or, say, occupations.

“If I say Mr. Baker is a potter and Mr. Potter is a baker, you will remember the occupation, that someone is a baker or a potter, much easier than the names Mr. Potter or Mr. Baker,” Burke told me. Indeed, when I see my plumber, my brain easily says to me, “There’s my plumber,” but his name . . . well. That’s not popping into my head.

In survey after survey, this tip-of-the-tongue phenomenon is listed as the most irritating, embarrassing, and worrisome part of the aging brain.

So why does it happen? After all, remembering names—Mama, for instance—is one of the first vocal tricks we learn and seems fairly crucial to the species. Why lose that? As we age, our overall vocabularies improve. As Burke says, “A seventy-year-old has a better vocabulary than a twenty- or thirty-year-old. We just keep accumulating words and our verbal abilities get better.”

So with all those words and stacks of names stuffed inside our heads, why can’t we reach in and grab the one we want? Do we have too many names in there? Do they get lost in the clutter—or just lost?

A Retrieval, Not a Storage, Problem

Well, for starters, the names are not technically gone. Research into the cellular activity of the aging hippocampus—where most memories are processed—indicates that much of what we learn, in the form of chemical markers, is not missing, it’s just at the bottom of the pile. For the most part, it’s a problem of retrieval, not storage. It’s like trying to find the right book in a well-stocked library.

Burke’s work has shown this elegantly. In one study, she found that if an older person is shown a picture of someone well known—say, Brad Pitt—and has the feeling that he knows the name but can’t recall it—a Tot—he will be much more likely to retrieve the name successfully later on if he is, in between, asked to answer a question whose answer is “cherry pit.” Even though he is unaware that the mention of cherry pit is in any way connected to the picture of Brad Pitt, the twinning of the sounds in the two words—pit and Pitt—is enough to prime the memory on a subconscious level and help him get over his tip-of-the-tongue problem and retrieve that name. Interestingly, such prompting does not generally improve performance with young brains, but does help middle-aged and older brains, presumably because that’s when we need the help.

Burke has also discovered why lost names often come to us, seemingly out of the blue, later on, usually long after we need them—again because of certain clues we’re unaware of. If you’re trying, unsuccessfully, to recall the word Velcro, for instance, and you later hear the word pellet, it’s much more likely that the heretofore-missing word, Velcro, will spring to mind. Even when it’s only an internal sound that is similar—in this case, “el” occurs in both words—the second word can draw out the lost word. To us, it seems the words come out of nowhere. Burke calls these “pop-ups.” And they, too, occur more often as we age.

But if our brains are doing so well in middle age, why do names go missing in the first place? Burke’s theory is that “it’s because of the way words are stored and organized in the brain.” She says, “The sound of the word—its phonology—and the information about that word—the concept of the word—are in different areas of the brain and the connection between them weakens. It can weaken if we don’t use the name. But it also weakens as we age,” much like that running muscle you haven’t used much lately, either. It happens most often with the names of people we know but have not seen recently.

It happens, too, because the link between a person and his name is so arbitrary. Names that are unusually descriptive, like Grumpy, or names that have acquired meaning from the characteristics of a person they refer to, such as Scrooge, are remembered more easily than random names such as Peter Pan. (This must be why I’ve never forgotten the name of my childhood dentist, Dr. Smiley.) In general, there’s absolutely no reason for Brad Pitt to be called Brad Pitt. There’s no reason for Mr. Baker to be called Mr. Baker.

At the same time, we remember what a person does. That’s because a person’s occupation embodies a wide range of information that’s stashed all over the brain, and that can be retrieved through various paths. When we hear that someone is a baker and later we’re trying to come up with that fact, we might get there through an assortment of associations, from white and apron to flour or even hats.

“If I say baker, all sorts of information is called to mind,” Burke says. “There are different ways to activate the 100,000 neurons, lots of different connections that lead you to that concept.”

The thing to remember—if possible—is that forgetting names is part of normal aging and it is only one piece of processing an identification or recognition. If you forget that your husband’s boss’s name is Ed, it might be a bit embarrassing at the office party. But it’s not Alzheimer’s, a progressive disease where you might forget you have a boss, or even what a boss is.

At age sixty, Burke says she doesn’t spend much time fretting about all this and she doesn’t think the rest of us should, either. But it might help to plan ahead. Before going to a party, Burke sometimes makes a list of who will be there. She also uses a trick many of us secretly use. If she meets someone she knows but whose name she has forgotten, she resorts to the alphabet, going through each letter until she gets to one that prompts the name.

Still, it’s unsettling. At the end of our talk, Burke still could not remember the name of the evangelical pastor. As we spoke, she’d interrupt herself to say, “This is driving me nuts.”

“That is the emotional part of this,” she said. “It can make you very upset. What was his name?”

At that point, I decided I’d better do something. While I was talking with Burke on the phone, I stopped taking notes, went to Google, and typed in “evangelical, resigned, homosexual.”

That’s the other lesson here, of course. As we learn to love and accept our middle-aged brains, we should—rather than panicking over these little peccadilloes—relax and get help. We’re lucky. Those of us in middle age now are the first group to have a neurological elf standing by—the World Wide Web.

And, of course, in a second I had our answer. “It was Ted!” I told Burke, feeling triumphant. “I looked it up and his name was Ted and it did have an H. Ted Haggard.”

Together, we sighed in relief. “Ah, that’s it,” Burke said. “I just knew it was Ted something. Thank you.”

A World of Distractions

And so we both felt a bit better. But—as we all know, too—it doesn’t end with a name here and there. By middle age, if our brains are not misplacing names, they’re often misplacing themselves. We get distracted.

And it doesn’t take much to knock us off course. A doorbell rings and we forget we’re boiling water for the potatoes. We meet a friend at the hardware store and, after a brief chat, we no longer remember we went to the store to get a rake. My friend Phillis, fifty-one, who runs her own consulting company, told me that as she was climbing her building’s stairs to go to her office on the fourth floor not long ago, she suddenly looked up and found herself on the eighth floor. As she climbed, she had glanced out a window to see who was in her parking spot, got distracted, and simply went right by her own office door. “Oh, my goodness,” she said, “I’ve never done that before. Is that middle age?”

Such wanderings become increasingly common as our brains age. “When I ask my patients what is troubling them, distraction comes up again and again,” says Adam Gazzaley, a neurologist at the University of California at San Francisco. “They’ll tell me that they’re sitting on the couch and they go to the kitchen to get something and by the time they get there they don’t remember what they went there for. I hear that all the time. And when I ask them how that occurs, they say that something distracts them, maybe the phone rings, something makes them not pay attention.”

Not long ago, the writer Judith Warner, talking about her new distractibility, confessed that she had “invited a couple to dinner and forgot to give them our unlisted phone number or address” and sent her daughter “to dinner at another family’s house and neglected to tell anyone that she was coming.”

Later on, Warner was heartened when, undergoing an MRI for migraines, her mindlessness was explained. She had a “hole” in her head. Her neurologist told her it was an inconsequential “small cystic area,” but its very existence, however unrelated to any brain difficulties or maladies, was reassuring, at least to Warner, who, perhaps only half kidding, wrote:

The self-blame game is now over. I no longer have to feel ashamed when—despite my ability to recall the details of a small news item from six years ago—I cannot remember the name, or even the face, of a person I met earlier in the day. No one has the right to laugh at me anymore when I write down important reminders—12:30 dismissal! Bring napkins!—on the palm of my hand. For I have a hole in my brain.

So, is that it? Do we all just have holes in our brains? Losing names is one thing, but losing entire dinner plans? If our brains are capable of so much at middle age—such expertise and wisdom and clarity and optimism—why do we walk right past our own office door?

Memory is a strange phenomenon and not completely understood on the molecular level. (One part likely involves the astonishing capacity of brain cells, sometimes described as “soft cells,” to physically change their structure as needed. As it’s often said, “brain cells that fire together, wire together.” If two brain cells are activated at the same time, they will actually change their structure, form stronger connections—and let us form memories and learn. That means, for instance, that if you see a red bird and hear its song enough times, the neurons that recorded the sight of the bird and the neurons that registered the bird’s sound are linked and physically altered. And the next time you hear that song, those neurons will fire up more or less in tandem and you’ll think, “Hey, it’s that noisy red bird again!”)
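
The Hebbian rule behind “fire together, wire together” can be sketched in a few lines of code. This is a toy model, not anything from Strauch’s text: the neuron names, the learning-rate constant, and the function itself are all illustrative assumptions, standing in for the idea that a connection strengthens whenever two cells are active at the same time.

```python
# Toy Hebbian model (illustrative only, not from the book):
# a connection weight grows each time two model "neurons" fire together.

LEARNING_RATE = 0.1  # assumed constant for this sketch


def hebbian_update(weight, pre_active, post_active, rate=LEARNING_RATE):
    """Strengthen the connection only when both neurons are active together."""
    if pre_active and post_active:
        weight += rate
    return weight


# Seeing the red bird (a "sight" neuron fires) while hearing its song
# (a "sound" neuron fires), repeated over and over, strengthens the link.
w = 0.0
for _ in range(5):  # five paired sight-and-song experiences
    w = hebbian_update(w, pre_active=True, post_active=True)

print(round(w, 1))  # the sight-sound link is now stronger: 0.5
```

With a stronger weight, activity in one neuron is more likely to drive the other—which is the sketch-level version of hearing the song and immediately picturing the bird.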

While there remains considerable fussing over how this works, it is clear that memory is not a single mechanism. Names are arranged one way, plans for an upcoming dinner party another, and that vivid picture of the giant dog that chased you down the street when you were four years old yet another.

And by middle age, most memory functions—I’m happy to report—are still humming along nicely. Biographical material, for instance, generally stays intact. You remember who you are and who your brothers and sisters and cousins are, where you went to second grade. Even personal information that’s acquired in middle age or later generally stays put. You know where you last worked. You remember how to make oatmeal, where the milk is. You can still ride a bike and drive a car, and you can, if you practice enough, perfect your tennis forehand—motor and muscle memory remain intact.

Episodic Memory

But other, more complex types of memory get a bit dodgy. Take a short break from a book you’re reading, even for a day, and you’ll forget not only what you’ve read on the last few pages but that you’ve read that book at all. A friend, Michael, told me that on a recent plane ride, he settled in to finish a book he’d started earlier that same week. But after he picked up the book, he found he “couldn’t remember ever reading any of it.” Unwilling to admit he’d forgotten what he knew he’d just read, he decided to start the book in the middle anyhow. “I just started reading halfway through,” he told me. “I have my pride.”

Such recollections of recent events—books we’ve just read, breakfasts we’ve just eaten—are called episodic memories. And our talent in this area generally does not blossom with age.

Why? How can some forms of memory stay put while others go missing? Do we, by middle age, simply have so many meals and movies and books in our heads that we have to get rid of some—a storage issue? It’s true that our brains have to jettison something or we’d explode. In fact, the few people who throughout history have been incapable of forgetting anything have been driven crazy as a result. Our brains are set up to set priorities, to weed out the irrelevant.

Still, you’d think the basic outline of a book you’re enjoying would stay put. Could we simply have too many weighty matters on our minds in general? Maybe we just can’t be bothered using up valuable brain space to remember what was on pages 1 through 67?

Marilyn Albert, a neuroscientist at Johns Hopkins University who has been studying the aging brain for decades, says that some difficulties in the normal healthy brain are not imaginary—and not a simple issue of overload. “We used to think it was because we had too much on our minds or because we have been away from school for so long,” Albert said recently. “But the declines are real and they begin in middle age.”

In fact, our increasing problems with some complex types of memory can be tied to how our brain changes its functions as it ages. And researchers are now able to see how this happens.

Cheryl Grady, a brain scientist at the University of Toronto, for instance, has actually watched the middle-aged brain take a few detours into distraction. Using a brain scanner, she has caught it daydreaming.

In a recent study, Grady found that the key part of the brain that we use to concentrate—the dorsolateral prefrontal cortex, part of that crucial frontal lobe region—lights up red-hot, as expected, in young adults when they’re asked to recall words or pictures they’ve just seen—a kind of difficult-to-do episodic memory.

But by middle age, she finds, such focused thoughts can be shoved aside by just about anything. As she scanned the brains of study participants, Grady was surprised to find that many older people trying to recall more complex information used their key frontal brain areas a bit less and a lower section of the brain more. And this second area is not helping. In fact, this fascinating brain region, called the default area—a region whose recent identification is one of the major discoveries in how the brain operates—is a key to why middle-aged brains can sometimes find themselves drawing a blank.

“This is the region we use when we’re thinking about ourselves, our internal monologues,” Grady told me as she explained her recent findings. “For instance, if you’re in a brain scanner and you aren’t doing anything, you might be thinking, ‘Gee, I’m kinda uncomfortable.’ Or you might be thinking that you should get some milk at the store later on that day. This is the part of the brain that we call the default mode. It’s what the brain uses to daydream.”

Starting in middle age, the brain’s ability to switch off the default mode starts to wane. Faced with the task of remembering we’re boiling water, our brains veer off into their own internal worlds, thinking about those great boots we’d like to buy or that football game we’re planning to watch, none of it pertinent to the task at hand. And while we muse, all thoughts of boiling water disappear.

“This is one of the areas in which the aging brain does not do so well,” Grady told me. “Our ability to tune out irrelevant material is reduced. In middle age, we seem to be in transition from the patterns in youth to those of older age in this area. And it might be one of the reasons we become more distractible.”

Power to Focus

In fact, the ability to focus is one of our most crucial brain functions. It’s a skill we acquire as babies and hone throughout adolescence. And it depends, to a large degree, on the development of our frontal lobes, which are not fully mature until we’re twenty-five years old. This area helps us to focus, in part by blocking out—inhibiting—irrelevant details.

In a recent study using functional MRI, which can observe activity in the brain, Adam Gazzaley has also watched older people have more trouble keeping their brains focused. Shown both faces and scenes and told to focus only on faces, they had more activity in the area of their brain that registers faces—appropriately. But the area that registers scenes, which should have been suppressed or inhibited, also became active. And the older adults who had the most trouble focusing also had the most trouble remembering what they saw.

As we age, our frontal lobes don’t block out interfering, irrelevant details as well as they once did, perhaps because they switch into default mode, or because of declines in connections or in the brain’s chemical messengers, creating what’s called an “inhibitory deficit.” Explaining their own recent findings, published in the journal Nature Neuroscience in 2005, Gazzaley and coauthor Mark D’Esposito, a professor of neuroscience at the University of California at Berkeley, concluded: “older individuals are able to focus on pertinent information but are overwhelmed by interference from failing to ignore distracting information.”

When I spoke with Gazzaley, he had just finished another scanning study that tried to pinpoint exactly when this happens as we attempt to pay attention. Not only are we increasingly lured into our daydreaming default mode, but our frontal lobes may fail to perform their top-down enforcement job of blocking out distractions. Shown faces and scenes and told to concentrate only on faces, older brains—for just a millisecond—let distracting and irrelevant scene information sneak in. The older brains then quickly adjusted and began to block out such distractions. But in that tiny moment the floodgates were opened and focus was lost.

This may be how a slower processing speed interferes with our memories as we age. Our frontal lobes may take too much time to tamp down interference, so we get too much neural “noise” at the start. And studies show that those who have the most initial interference also seem to have the most trouble forming solid memories or staying focused on what they are doing or saying.

“If, in the first second, you don’t suppress some of the incoming information, that means you get too much information in at once and that’s bad because once that information is in there, it’s in there,” Gazzaley explained. “With some older brains the suppressing machinery of the prefrontal cortex [part of the frontal lobes] is not coming on line fast enough and it’s letting irrelevant information in.”

And while most of Gazzaley’s studies were done with adults past the age of sixty, there’s ample evidence that such difficulties can begin much earlier, in middle age, a time when our brains can begin to be more tempted to take a rest and space out in our default modes while too much useless information rushes in. “We see this at age forty being kind of an intermediate problem,” said Gazzaley.

Diverging Brain Powers

But here we have to stop, because while these difficulties arise in many brains at middle age, they do not occur in all brains. Nearly every study that spans ages from the forties to the early or mid-seventies—and sometimes later—shows astonishing variability. Brains are obviously varied at any age, but in middle age, that range of variability starts to increase. Some brains still operate with a knife-edged clarity, others have grown duller—most are somewhere in between. And that means that huge declines are not inevitable. As Marilyn Albert, the longtime neuroscientist at Johns Hopkins, said recently, the “true hallmark” of the brain at midlife is “variability.”

“So now we have developed two categories: the age-impaired and the age-unimpaired,” Albert said. “The question is, what is the explanation for that? Do those who are doing well have no age-related brain structure decline or, more likely, have they developed adaptive strategies?”

In fact, this is the key question. Why do some brains age well while others don’t? And can we more accurately define normal aging as opposed to true pathology, such as Alzheimer’s? Can we find out what makes the difference? Is it inborn or will adaptive strategies work? Over and over, scientists have been struck by the fact that it is in middle age when brains start to show not only slight declines but larger differences among one another. And it’s not just human brains. While mental scores are scattered at any age, studies in a range of animals have found that the variability in those scores starts to rise markedly in middle age. This is when paths begin to diverge in earnest.

“There is enormous variability and we see this variability across species,” Albert said.

Indeed, a close look at one of our closest relatives—the rhesus monkey—is now confirming this, too. At a lab in Boston, an intriguing study of the middle-aged brain is still ongoing. And while it, too, is finding some downward trajectories, it shows a surprisingly wide spectrum—some doing okay, others not.

Not long ago, to see all this, I spent an afternoon at the lab of Mark Moss at Boston University School of Medicine, and, more specifically, with Bojangles, a rhesus monkey that was putting on a pretty good show with his own monkey frontal lobes when I caught up with him.

Through the years, one of the best tests of a human’s frontal lobes and their ability to focus our attention has been what’s called the Wisconsin Card Sorting Test, which has been around since the 1940s. The test takers, shown a group of cards, first sort them by suit—hearts, say. Then they switch and sort by number, all nines and fives, for instance. The idea is that the brain first learns the first task, then switches to another task.

In general, our brains are set up to keep doing what they’ve just been doing—a brain likes a good rut. So someone taking this card-sorting test is naturally tempted to keep picking hearts. To switch, the brain must inhibit its urge to stay in that rut and instead move over to its new mission. One of the key roles of the frontal lobes is to inhibit urges. And if our frontal-lobe inhibitory machinery is faulty, switching from sorting playing cards by suits to numbers becomes tougher. Without a strong push to stop what we’ve been doing, we keep doing it. We pick the hearts when we’re supposed to pick nines and fives.
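
The rut-and-switch behavior described above can be caricatured in a short simulation. This is a toy sketch, not the actual Wisconsin Card Sorting Test procedure: the card format, the feedback rule, and the `errors_to_switch` threshold (a stand-in for how quickly frontal-lobe inhibition kicks in) are all assumptions made up for illustration.

```python
# Toy sketch (illustrative only) of set-shifting: a sorter keeps applying
# its current rule until repeated negative feedback forces a switch.

def sort_cards(cards, feedback_rule, errors_to_switch=3):
    """Sort each card by the current rule; switch rules only after
    enough wrong answers (a stand-in for frontal-lobe inhibition)."""
    rules = ["suit", "number"]
    current = 0  # start out sorting by suit
    errors = 0
    choices = []
    for card in cards:
        choices.append(card[rules[current]])
        if rules[current] != feedback_rule:  # examiner says "wrong"
            errors += 1
            if errors >= errors_to_switch:   # inhibition finally kicks in
                current = 1 - current        # abandon the rut, switch rules
                errors = 0
        else:
            errors = 0
    return choices

cards = [{"suit": "hearts", "number": 9}] * 6
# The examiner now rewards sorting by number; the "rutted" sorter keeps
# answering "hearts" for three cards before it manages to switch.
print(sort_cards(cards, feedback_rule="number"))
# -> ['hearts', 'hearts', 'hearts', 9, 9, 9]
```

Raising `errors_to_switch` models slower inhibitory machinery: the sorter perseverates on the old rule for longer, which is the sketch-level version of the impairment the middle-aged and older test takers show.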

It’s a classic test, still being used. And no one ever thought it could be used on monkeys. But it turns out monkeys do this pretty well, too. A few years ago, Moss, chairman of the neurobiology department at Boston University School of Medicine, a gregarious, out-of-the-box sort, was studying the aging brain of the monkey and decided to see if he could teach a version of the card test to monkeys. And, as he told me when I went to see him at his office, still surprised, “Lo and behold, we could.”

To show how this works, Moss took me to the lab near his office to see Bojangles put on his show. Kept in a large box to limit distractions, the monkey was shown three things over and over on a computer screen: a red triangle, a blue star, and a green square.

Bojangles first had to learn that he would get rewarded only if he picked the red triangle. And through a process of trial and error, he figured it out and got an M&M. Then the game switched and Bojangles was rewarded only if he selected the blue star. The idea was to find out how long it took Bojangles to catch on and switch from red triangles to blue stars. Would his frontal lobes kick in and suppress his urge to keep picking red triangles? Could he figure out how to keep getting those M&M’s?

He did. After a few false starts, Bojangles picked blue stars and got his M&M’s. But there was a catch. Bojangles was a young adult at age six, which is equivalent to about age eighteen in humans. He was still a teenager. And Moss has found that this task, generally, has not been as easy for older monkeys. In fact, after studying forty-one monkeys, Moss found that difficulties clearly begin in middle age. It was the first time anyone had managed to do such a large test on the middle-aged monkey’s brain. And the news was not all good.

The findings “showed that middle-aged monkeys, like those of advanced age, were significantly impaired on the conceptual set shifting task,” Moss wrote in his groundbreaking 2006 study, which was published in the journal Neurobiology of Aging.

The test with the monkeys generally mirrored what has been suggested in studies with humans. But with humans, there was always a nagging question: Did any difficulties in middle age stem from a brewing case of preclinical Alzheimer’s or vascular disease—or were they simply a part of normal aging?

Through the years, it’s been notoriously difficult to tease out the difference, especially now that we know that dementia probably begins much earlier than anyone ever realized—and long before it’s evident in behavior.

But monkeys don’t get Alzheimer’s. So, if monkeys are screened for other vascular problems, they can be a fairly good model for what happens in normal, healthy human brains as they age—and not everything, it seems, always goes right.

Continuing his research, Moss has since scanned the brains of monkeys as well as examining their brain tissue. Aging is no simple process, but Moss is convinced that one of the biggest culprits in the aging brain may be selective declines in white matter—the same white matter whose overall growth helps us to get so smart to begin with.

The brain, as we’ve said, is made up of gray matter—the neuron cell bodies—and white matter, the long arms of the neurons that extend throughout the brain, sending signals from one neuron to another. As we age, the arms are coated in that sheath of fat called myelin. That fatty layer allows the signal to move much faster and be timed more accurately.

Overall, myelin increases up to the fourth, fifth, or even sixth decade. But Moss and others have discovered that starting in middle age, in some people—and in some monkeys—it can also start to erode in selective areas.

In most cases during that time, such erosion is meaningless—the decreases are still outweighed by the increases and brains function better than ever. But in a few middle-aged monkeys Moss found net decreases in white matter. And he now has preliminary data showing that those monkeys that have the most negative changes in their white matter do the worst in the card-sorting games.

At this point, much is still unknown. It may be that while overall myelin increases are occurring and helping the brain operate, decreases in certain areas, such as the frontal lobes, whose rich connections need the highest efficiency for processing, may prove detrimental. It’s still unclear why some brains manage to keep up repairs and some don’t.

In any case, Moss was taken aback—and somewhat dismayed—to see any such problems in the brains of some middle-aged monkeys because these structural declines had not been detected before. The “findings of a marked impairment . . . in monkeys of middle age was initially of some surprise . . . very little is known about the age of onset of cognitive decline. The study demonstrated that middle-aged monkeys, as young as 12 years of age (equivalent to approximately 36 years in humans) already show impairment . . . deficits in EF [executive function] may occur earlier in the aging process than previously thought,” Moss wrote rather depressingly when his first data was published in 2006.

What’s more, over the past few years a clearer picture of how certain defects occur in the brains of humans has also started to emerge, and not all the news is good there, either. For this news we have no one to thank more than Naftali Raz, a neuroscientist at the Institute of Gerontology at Wayne State University in Detroit, who has become an expert in the details of brain decline and aging.

The only working neuroscientist I ran across whose papers are sprinkled with quotes from Sophocles (“For the gods alone there comes no old age, nay nor even death; but all other things are confounded by all-mastering time,” from Oedipus at Colonus), Raz has a succinct and downright scary way of describing what is happening in our aging brains. For instance, in a review of the scientific literature in 2006 with Karen M. Rodrigue entitled “Differential Aging of the Brain: Patterns, Cognitive Correlates and Modifiers,” he wrote:

Postmortem studies of individuals within the adult age span reveal [a] panoply of age-related differences in brain structure. The gross differences include reduced brain weight and volume, ventriculomegaly and sulcal expansion. Microscopic studies document myelin pallor, loss of neuronal bodies in the neocortex, the hippocampus and the cerebellum, loss of myelinated fibers across the subcortical cerebrum, shrinkage and dysmorphology of neurons, accumulation of lipofuscin, rarefaction of cerebral vasculature, reduction in synaptic density, deafferentation, loss of dendritic spines, cumulative mitochondrial damage, reduction in DNA repair ability and failure to remove neurons with damaged nuclear DNA.

Overall, in fact, Raz has estimated that our brains shrink by about 2 percent a decade as we age. He and others talk of a “dark side to plasticity,” which means that the areas of our brains that change the most during our lives, those that are the most sensitive to our environments—our valuable frontal lobes—could potentially suffer the most in the aging process.


But as dire as all these descriptions may be, Raz and Moss join with most other neuroscientists working today to stress that the main characteristic of the brain as it ages, and in particular the middle-aged brain—according to all we know now—is not universal decline but variability.

Moss, who was so surprised to find fairly significant declines in some middle-aged monkeys, was just as surprised to find out that the monkeys’ brains at midlife were both “mostly all right” and highly “scattered” in terms of white matter loss as well as performance on mental tests. So what is “normal” remains up for grabs—which, again, looked at one way, is very good news.

“In the end, we had to divide the group into successful and less successful agers,” Moss told me. “There clearly are these pristine agers who, even in old age, seem to be doing fine. It could be that they adapt better.”

And we are beginning to find out what is most detrimental to our brains. Even if certain diseases are not yet evident, they may already, at very early stages, be affecting our brains.

“There are a lot of individual differences in cortical shrinkage rates plus there are a lot of factors other than aging that cause and influence it,” Raz wrote in his latest e-mail to me. “For example, vascular disease and cardiovascular risk factors such as hypertension (even relatively mild and responsive to medication) affect the prefrontal cortex and the hippocampus, both ‘age sensitive’ structures.”

But he is also quick to add that, overall, the normal levels of decline in middle age—the kind most of us experience if we’re generally healthy—appear to be relatively small, so for a long span of our modern middle age, all this might not matter all that much.

Even more important, scientists now know for certain that large numbers of neurons do not die off. The fundamental building blocks of our brains stay put. While “age-related differences in regional brain volumes and integrity of the white matter are associated with cognitive performance,” Raz pointed out recently, a “review of the literature reveals that the magnitude of the observed association is modest.” And, since our brains do not necessarily age in exactly the same way—and do not include a wholesale die-off of neurons—Raz sees promise, precisely because of the striking variability that he and others have found in middle age.

“Aging—a biological companion of time—spares no organ or system and in due course affects everything, from cell to thought,” Raz has written. “[But] the pace of aging varies among individuals, organisms, organs and systems. And the very existence of such variability merits some measure of hope. If the positive extreme of healthy aging can be made more prevalent and if its worst and most negative expression can be delayed if not completely eliminated, the viable and enjoyable segments of the life can be prolonged into the later decades of the lifespan. In other words, successful aging enjoyed by relatively few may become the norm.”

And while this may be taking things too far for some, there are even hints that, in a few instances, a little age-related brain decline—even in the area of focus—may work in your favor.

The science on this is in its infancy, but some recent studies have shown that letting some irrelevant information sneak into our brains can actually prove useful at times. If older people are asked to read passages that are interrupted with unexpected words or phrases, they read much more slowly than college students. But later, when both groups are asked questions whose answers depend on those distracting words and phrases, the older people are much better at solving the problem. As we age, we seem to be able to grasp the big picture better. But since our brains are also more easily distracted—when we are not asked to focus on underlying meanings—we let in random information that, while it can be an interference, can also at times prove handy. In these studies, the older people came up with the right answer precisely because they had seemingly irrelevant information stuffed somewhere in their brains.

A broader, less focused attention span, says Lynn Hasher, a neuroscientist at the University of Toronto, who is leading much of this new work, may allow a middle-aged person to know more about a situation—at times a real benefit in an often chaotic world where it’s not always clear what will be pertinent in the end. Maybe there’s a seemingly useless piece of information in a memo that later has meaning. Maybe while listening to another person speak, your older brain cells—wandering around here and there—notice what’s happening on the sidelines. Maybe you notice that the person is yawning or fidgeting—tidbits of information that could help you more fully evaluate that person later on.

Indeed, other studies have shown that when an older person meets someone for a second time, she already has a great deal more peripheral information, gathered unwittingly from the first meeting, than a younger person has. As Hasher says, a brain may be a little fuzzy on some details (a name, perhaps?), but may subconsciously register other information that proves to be more crucial—this person seems confident or looks shifty, for instance.

“It’s not that people are doing this on purpose and saying, ‘Oh, that might be relevant later on, I better pay attention to that,’ ” Hasher explained to me when I spoke to her about all this. “Essentially, it’s like being on autopilot. It just happens. But in the everyday world, I think we overestimate the importance of deliberately doing things and we underestimate the importance of the automatic things we do. That’s what keeps us from tripping, from walking into walls.”

Hasher, who is sixty-three, concedes that being on autopilot is not always a benefit, as when you have to drive on a crowded freeway or watch a toddler. But still, in more situations than we realize, this wider perspective—this lack of filtering—may help, not hurt.

“The full story is not in yet but it is amazing,” said Hasher recently, adding, as she summed up the latest research, “These findings highlight the notion that cognitive aging is characterized by both losses and gains, and that whether to consider reduced inhibitory control as a help or a hindrance depends entirely on the situation.”

What’s more, there’s also the suggestion that this less-than-straight-arrow attention can sometimes lead to art. Studies have shown that those brains that block out less tend to have more creative ideas. As Hasher says, if one part of creativity is “putting normally disassociated ideas together,” then an older brain could, almost by its very nature, be more likely to come up with something quirky, new, even beautiful.

Jacqui Smith, a longtime researcher now at the University of Michigan, agrees that a tendency toward distraction can, in the right context, lead to wonderful things. “If you don’t focus on one central thing, if you are thinking of all sorts of different things at once, sometimes you can come up with new associations. It’s hard to measure, but this is divergent thinking; this is creative thinking. And if you’re lucky, you get a true insight, something brand new.”

Not long ago, after I told a friend, a poet who had just turned fifty, about the link between distraction and creativity, she just looked at me and laughed. A growing ability to daydream? No problem. Mind wandering and linking odd things together in new ways? No problem. “That’s not all bad, you know,” she said. “In fact, that’s all very, very good for poetry.”