Now You See It: How the Brain Science of Attention Will Transform the Way We Live, Work, and Learn - Cathy N. Davidson (2011)

Introduction

I’ll Count—You Take Care of the Gorilla

Five or six years ago, I attended a lecture on the science of attention that was part of a luncheon series designed to showcase cutting-edge research by the best and brightest at my university. A philosopher who conducts research over in the medical school was talking about attention blindness, the basic feature of the human brain that, when we concentrate intensely on one task, causes us to miss just about everything else.1 Because we can’t see what we can’t see, our lecturer was determined to catch us in the act. He had us watch a video of six people tossing basketballs back and forth, three in white shirts and three in black, and our task was to keep track only of the tosses between the people in white. I hadn’t seen the video back then, although it’s now a classic, featured on punked-style TV shows or YouTube versions enacted at frat houses under less than lucid conditions. The tape rolled, and everyone began counting.

Everyone except me. I’m dyslexic, and the moment I saw that grainy tape with the confusing basketball tossers, I knew I wouldn’t be able to keep track of their movements, so I let my mind wander. My curiosity was piqued, though, when about thirty seconds into the tape, a gorilla sauntered in among the players. She (we later learned a female student was in the gorilla suit) stared at the camera, thumped her chest, and then strode away while they continued passing the balls.

When the tape stopped, the philosopher asked how many people had counted at least a dozen basketball tosses. Hands went up all over. He then asked who had counted thirteen, fourteen even, and congratulated those who’d scored the perfect fifteen. Then he asked, “And who saw the gorilla?”

I raised my hand and was surprised to discover I was the only person at my table and one of only three or four others in the large room to do so. Around me, others were asking, “Gorilla? What gorilla?” Some people were getting annoyed. Several muttered that they’d been “tricked.” Instead of answering them, the philosopher rewound the tape and had us watch again. This time everyone saw the gorilla.

He’d set us up, trapping us in our own attention blindness, priming us for his lecture. Yes, there had been a trick, but he wasn’t the one who had played it on us. By concentrating so hard on the confusing counting task, we had managed to miss the main event: the gorilla in our midst. In a brief experiment that had required us simply to pay attention, it was our own minds that had deceived us.

Except I hadn’t been deceived. I’d seen the gorilla, not because I’m better at this than anyone else—I’ve taken enough attention tests since that day to know I’m not—but precisely because I wasn’t paying attention to counting basketballs. That’s how the visual cortex is structured. We think we see the whole world, but we actually see a very particular part of it. For a lot of neuroscientists, that’s the cautionary message of the gorilla experiment: We’re not nearly as smart as we think we are.2

In a very real sense, this book began that day. Attention blindness is the fundamental structuring principle of the brain, and I believe that it presents us with a tremendous opportunity. My take is different from that of many neuroscientists: Where they perceive the shortcomings of the individual, I sense opportunity for collaboration. If we see selectively but we don’t all select the same things to see, that also means we don’t all miss the same things. If some of us can accurately count basketballs in a confusing situation and some can see the gorilla, we can pool our insights and together see the whole picture. That’s significant. The gorilla experiment isn’t just a lesson in brain biology but a plan for thriving in a complicated world.

Without focus, the world is chaos; there’s simply too much to see, hear, and understand, and focus lets us drill down to the input we believe is most useful to us. Because focus means selection, though, it leaves us with blind spots, and we need methods for working around them. Fortunately, given the interactive nature of most of our lives in the digital age, we have the tools to harness our different forms of attention and take advantage of them.

But there’s an important first step, and if we pass over it, we’ll never be able to capitalize on the benefits of our interactive world. It’s not easy to acknowledge that everything we’ve learned about how to pay attention means that we’ve been missing everything else—including the gorilla. It’s not easy for us rational, competent, confident types to admit that the very key to our success—our ability to pinpoint a problem and solve it, an achievement honed in all those years in school and beyond—may be exactly what limits us. For over a hundred years, we’ve been training people to see in a particularly individual, deliberative way. No one ever told us that our way of seeing excluded everything else. It’s hard for us to believe we’re not seeing all there is to see.

But here’s the kicker: Unless we’re willing to take attention blindness personally, we’re going to either flub the basketball count or miss the gorilla every single time. We can’t even develop a method for solving the dilemma until we admit there’s a gorilla in the room and we’re too preoccupied counting basketballs to see it.

A great cognitive experiment is like a fantastic magic trick performed by an exceptionally skilled magician. It makes us see things about ourselves that we don’t normally see and helps us to believe what might otherwise be impossible to accept about the world we live in. An experiment allows us to see the imperfect and idiosyncratic way our own brain works. But there is a key difference between magic tricks and scientific experiments: Scientists don’t contrive experiments to trick, surprise, embarrass, or entertain us. They devise them so they can learn more about what makes humans tick.

When they were just starting their careers, the young Harvard psychologists Christopher Chabris and Daniel Simons first performed the now-famous gorilla experiment, or what they’ve come to call the invisible gorilla.3 It was 1999, and they were determined to come up with a convincing way to illustrate the cognitive principle of selective attention that had been identified way back in the 1970s but that people simply refused to believe.4 A colleague down the hall was doing a study on fear and happened to have a gorilla suit handy. The rest is history.

Under normal testing conditions, over half of the participants miss the gorilla. Add peer pressure and that figure goes way up. In a live reenactment of this experiment performed in London, with four hundred very social college students packed into an auditorium, only 10 percent noticed the gorilla stride across the stage.5 We didn’t keep an exact count at our event, but our numbers must have rivaled those of the college kids in London. In our case, the most likely reason so few saw the gorilla was that academics like to do well on tests. And that’s the annoying lesson of attention blindness: The more you concentrate, the more other things you miss.

In one amusing experiment you can view on YouTube, a man and a woman perform a familiar card trick. While the man fans a deck of cards out before her, the woman selects a card at random, shows it to the viewing audience, and slips it back into the pack. And as is to be expected, the man then “magically” pulls the right card out of the deck.6 In this case, however, he reveals his secret to the audience: While we were looking at the card the woman chose, he switched to another deck of a different color so that when she replaced the card, he could spot it instantly. But it turns out that’s only the beginning of what we’ve missed. While we’ve been keeping our eyes on the cards, the man and woman have changed into different clothes, the color of the backdrop has been altered, and even the tablecloth on which the cards were spread has been changed to one of a dramatically different color. Throughout all of this frantic activity, a person in a gorilla suit sits onstage, in homage to the Chabris and Simons experiment.

The scariest attention blindness experiment I know is used to train apprentice airplane pilots. You can imagine where this is going. Trainees using a flight simulator are told they will be evaluated on how well they land their plane in the center of a very narrow runway. They have to factor in numerous atmospheric variables, such as wind gusts and the like. But just as the plane is about to land, after the pilot has navigated all the treacherous obstacles, the simulator reveals an enormous commercial airliner parked crossways in the middle of the runway. Pilot trainees are so focused on landing their planes accurately that only about half of them see the airliner parked where it shouldn’t be. Only when they watch a tape of the simulation do they realize that they have landed smack on top of another plane. It’s a good lesson to learn in a simulator.7

ATTENTION BLINDNESS IS KEY TO everything we do as individuals, from how we work in groups to what we value in our institutions, in our classrooms, at work, and in ourselves. It plays a part in our interactions with inanimate objects like car keys or computer screens and in how we value—and often devalue—the intelligence of children, people with disabilities, those from other cultures, or even ourselves as we age. It plays a part in interpersonal relations at home and in the office, in cultural misunderstandings, and even in dangerous global political confrontations.

For the last decade, I’ve been exploring effective ways that we can make use of one another’s blind spots so that, collectively, we have the best chance of success. Because of attention blindness, we often arrive at a standstill when it comes to tackling important issues, not because the other side is wrong but because both sides are precisely right in what they see, yet neither can see what the other does. Each side becomes more and more urgent in one direction, oblivious to what is causing such consternation in another. Under normal conditions, neither knows the other perspective exists. We saw this in the summer of 2010, when an explosion on the BP Deepwater Horizon drilling rig sent nearly 5 million barrels of crude oil gushing into the Gulf of Mexico. Some people reacted to the environmental disaster by wanting all offshore oil drilling banned forever. Others protested the loss of jobs for oil workers in the area when the president of the United States declared a six-month moratorium on oil drilling to investigate what had gone so disastrously wrong. It was as if neither side could see the other.

But it doesn’t have to be that way. If we can learn how to share our perspectives, we can see the whole picture. That may sound easy, but as a practical matter, it involves figuring a way out of our own minds, which as the gorilla experiment so perfectly demonstrates, is a pretty powerful thing to have standing in the way. Yet with practice and the right methods, we can learn to see the way in which attention limits our perspectives. After all, we learned how to pay attention in the first place. We learned the patterns that convinced us to see in a certain way. That means we can also unlearn those patterns. Once we do, we’ll have the freedom to learn new, collective ways that serve us and lead to our success.

What does it mean to say that we learn to pay attention? It means no one is born with innate knowledge of how to focus or what to focus on. Infants track just about anything and everything and have no idea that one thing counts as more worthy of attention than another. They eventually learn because we teach them, from the day they are born, what we consider to be important enough to focus on. That baby rattle that captivates their attention in the first weeks after they’re born isn’t particularly interesting to them when they’re two or twenty or fifty because they’ve learned that rattles aren’t that important to anyone but a baby. Everything works like that. Learning is the constant disruption of an old pattern, a breakthrough that substitutes something new for something old. And then the process starts again.

THIS BOOK OFFERS A POSITIVE, practical, and even hopeful story about attention in our digital age. It uses research in brain science, education, and workplace psychology to find the best ways to learn and change in challenging times. It showcases inventive educators who are using gaming strategy and other collaborative methods to transform how kids learn in the digital age, and it highlights a number of successful innovators who have discarded worn-out business practices in order to make the most of the possibilities difference and disruption afford in a new, interconnected world.

We need these lessons now more than ever. Because of attention blindness, the practices of our educational institutions and workplaces are what we see as “school” and “work,” and many of the tensions we feel about kids in the digital age and about our own attention at work are the result of a mismatch between the age we live in and the institutions we have built over the last 120 years. The twentieth century taught us that completing one task before starting another one is the route to success. Everything about twentieth-century education and the workplace is designed to reinforce our attention to regular, systematic tasks that we take to completion. Attention to task is at the heart of industrial labor management, from the assembly line to the modern office, and of educational philosophy, from grade school to graduate school. Setting clear goals is key. But having clear goals means that we’re constantly missing gorillas.

In this book, I want to suggest a different way of seeing, one that’s based on multitasking our attention—not by seeing it all alone but by distributing various parts of the task among others dedicated to the same end. For most of us, this is a new pattern of attention. Multitasking is the ideal mode of the twenty-first century, not just because of our information overload but because our digital age was structured without anything like a central node broadcasting one stream of information that we pay attention to at a given moment. On the Internet, everything links to everything and all of it is available all the time, at any time. The Internet is an interconnected network of networks, billions of computers and cables that provide the infrastructure of our online communication. The World Wide Web lies on top of the Internet and is, in effect, all the information conveyed over it. The Web is largely the brilliant invention of one person, Sir Tim Berners-Lee, who devised a way to give documents, videos, and sound files—all the information uploaded to the Internet—addresses (URLs) so that they could be transferred instantly among the billions of computers and networks worldwide without everything going through one central switching point and without requiring management by one centralized broadcast system.8
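
To make the idea of addressing concrete, here is a minimal sketch in Python (my own illustration, not anything drawn from Berners-Lee’s specifications; the two example URLs are simply stand-ins for any two pages on any two networks). A URL names the machine that serves a document, and any computer can ask that machine for the document directly, with no central broadcaster scheduling the exchange:

    # A minimal sketch using only Python's standard library. It illustrates
    # the point above: each address names the host that serves a document,
    # and a client fetches the document from that host directly.
    from urllib.parse import urlparse
    from urllib.request import urlopen

    # Two illustrative addresses on two different networks (stand-ins only).
    urls = [
        "https://example.org/index.html",
        "https://www.w3.org/History.html",
    ]

    for url in urls:
        host = urlparse(url).hostname           # the machine named in the address
        with urlopen(url, timeout=10) as resp:  # ask that machine directly
            size = len(resp.read())
        print(f"{url} -> served by {host}, {size:,} bytes")

No schedule tells the reader when to make these requests, and no switchboard routes them through a single center, which is precisely why attention on the web has no fixed center either.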

Neither the Internet nor the World Wide Web has a center, an authority, a hierarchy, or even much of a filter on the largest structural level. That allows for tremendous freedom and also, in some circumstances, risk. Unlike television programming, which reinforces sustained attention by keeping us on schedule from the beginning to the end of a sixty-minute show, the Internet gives us no schedule to keep us on track. If I’m reading along and decide to click on a link, I can suddenly be in an entirely different universe of content. There’s no guidebook. There are very few partitions. Everything is linked to everything, each network is a node on another network, and it’s all part of one vast web. We blink and what seemed peripheral or even invisible a minute ago suddenly looms central. Gorillas everywhere!

The Internet and the web are apt metaphors for the world we live in, too. The domino-like collapse of markets around the world has brought home a truth we should have seen coming long ago: Like it or not, we are connected. We can no longer live in an “us versus them” world because our fate and theirs (whoever “we” and “they” are) depend on each other. We are all inextricably interwoven. The flip side is that we also have infinite opportunities for making our interconnections as productive as possible. The Internet offers us the means of communication we need to thrive in a diverse and interdependent world.

By one recent accounting, in the last decade we’ve gone from 12 billion e-mails sent each day to 247 billion e-mails, from 400,000 text messages to 4.5 billion, from 2.7 hours a week spent online to 18 hours a week online. That’s an incredible change in the amount and extent of the information taking over our time.9 If life once seemed calmer and more certain (and I’m not sure it ever really did), that wasn’t reality but a feature of a tunnel vision carefully crafted and cultivated for a twentieth-century view of the world. If we’re frustrated at the information overload, at not being able to manage it all, it may well be that we have begun to see the problems around us in a twenty-first-century multifaceted way, but we’re still acting with the individualistic, product-oriented, task-specific rules of the twentieth. No wonder we’re so obsessed with multitasking and attention! You can’t take on twenty-first-century tasks with twentieth-century tools and hope to get the job done.

Here’s the real-life benefit of the gorilla story: If attention blindness is a structural fact of human biology equivalent to, say, our inability to fly, then that means we are faced with the creative challenge of overcoming it. It might take some thinking; it might require humility about our rationality and our vaunted self-reliance; it may require rearranging our twentieth-century training and habits; and it certainly will require lining up the right tools and the right partners. But once we acknowledge the limitations we’ve been living with, we can come up with a workaround.

002

I wasn’t always dyslexic. I’m old enough that “learning disabilities” didn’t exist as a category when I was a kid. Back then, there wasn’t any particular diagnosis for my unusual way of seeing the world. I was already a twenty-seven-year-old professor at Michigan State University when I first heard myself described as “a severe dyslexic with tendencies to attention deficit disorder.” Before that, the diagnosis was simpler: I was simply “obstinate.”

One evening, a friend invited me to dinner. When I walked into her living room, I saw her sharp-as-a-tack daughter, who was six or seven at the time, lying on her side reading, with a dark blue cover pulled over her head.

“That’s how I read too!” I told her mother.

When my friend explained that she was about to take her daughter for some experimental testing because she was exceptionally bright but was falling behind in school, I decided to go along. I was fascinated by what I saw at the testing office and arranged to come back and take the tests for dyslexia myself. It was the first standardized exam on which I’d ever made a perfect score.

In school, I had managed to frustrate just about every teacher I had ever had. According to family lore, I could solve equations with two unknowns in my head before I started school, but I could barely count. One summer, I won a competition that earned me a scholarship to math camp. Two or three of my fellow campers there were going straight from grammar school to MIT without benefit of first passing through puberty. I loved calculus but had never been able to add and spent practically every day of fourth grade (or so I recall) in after-school detention writing endless multiplication tables on the blackboard.

I didn’t fare so well at reading and writing either. I loved perusing thick books where the small print ran from margin to margin, but I found picture books confusing and couldn’t read anything out loud. I still can’t. I happen to be in a profession where it’s common to write out an entire talk in long, complex sentences and then stand up and read it word for word to your audience. Though I try to avoid these occasions, when I must, I practice reading the text maybe fifteen or twenty times, until familiarity with the rhythm of the words and phrases somehow helps me stay on track.

Because my learning disability was diagnosed a few years after I’d already earned a PhD, it was pretty easy to feel the relief of the diagnosis (“So that’s what was going on all along!”) without being burdened with a lifelong label of inferiority, disability, disorder, or deficiency that I see being used to describe many of my students today, including some of my smartest. There’s no way of knowing if school would have been better or worse had I worn the label “dyslexic” instead of “obstinate,” and no way of knowing if my career path would have taken a different turn. All I know is that succeeding against odds and not being afraid to veer off in a direction that no one else seems to be taking have become second nature to me. Tell a kid enough times that she’s obstinate and she begins to believe you. Where would I be if the world had been whispering, all those years, that I was disabled?

I don’t know, but I do know that I’m not alone. If you are a successful entrepreneur in the United States, you are three times more likely than the general population to have been diagnosed with a learning or attention disorder.10

Incidentally, my friend’s young daughter has done quite well for herself. She, too, has a PhD now. I’m convinced the key to her success wasn’t the label or the special classes or the drugs for her attention deficit disorder, but a mom who fought tirelessly for her daughter’s unique and important way of being in the world.

I’ll never forget the day my high school principal called me into his office, this time with something good to tell me. He had received a lengthy, typed letter from whoever was at the other end of ACT scores, telling him that he needed to sit down with me and explain that this was a multiple-choice test and that I was doing myself a disservice with the long essays I had handwritten on the reverse. In those essays I had addressed all the questions that were either ambiguously worded or for which all of the answers offered were incorrect. Those essays had wasted valuable time, which partly accounted, this person insisted, for my low test score. My principal had had me in his office more than once before, but I’ll be eternally grateful to him for reading me that letter. For all its exhortatory tone and cautions, it ended by saying that he should tell me that the ACT committee had gone over all of my comments—there were fifteen or maybe twenty of them—and they wanted me to know that I’d been right in every case.

All these years later, this book is my way of paying back that anonymous ACT grader. I suspect he or she must have been a very fine teacher who knew this would be a good exercise not only for me but also for my principal.

Who pays attention to the reverse side of a test? Not many people, but sometimes the right answers are there, where we don’t see them, on the other side. I’m convinced that’s the case now. For all the punditry about the “dumbest generation” and so forth, I believe that many kids today are doing a better job preparing themselves for their futures than we have done providing them with the institutions to help them. We’re more likely to label them with a disability when they can’t be categorized by our present system, but how we think about disability is actually a window onto how attention blindness keeps us tethered to a system that isn’t working. When we assess others against a standard without acknowledging that the standard is hardly objective but rather a shifting construct based on a local set of values, we risk missing people’s strengths. We don’t see them, even when they’re in plain view. The implications of this are broader than you might expect. It means more than discouraging a few potential great novelists or professors or artists (though even that can be a huge loss; imagine a world without Warhol or Einstein); it means that we risk losing broad contributions from the people who are supposed to take us into the future.

003

This book is designed as a field guide and a survival manual for the digital age. I have focused on the science of attention because it offers us clear principles that, once we see them, can be useful in thinking about why we do things the way we do them and in figuring out what we can do differently and better. The age we live in presents us with unique challenges to our attention. It requires a new form of attention and a different style of focus that necessitates both a new approach to learning and a redesign of the classroom and the workplace.

Many of these changes are already well under way. In our individual lives, we’ve gone through astonishing transformations in a little over a decade. A recent survey found that 84 percent of those polled said they could not accomplish their day’s work if the computers were down at their office.11 That’s pretty remarkable given that the Internet has been prevalent in the modern office only since around 1995.

Because we’re in a transitional moment, most of us aren’t aware of how structurally different our lives have become because of the Internet. We don’t see how radical the changes of the last decade or so have been. It’s a bit like growing up poor. If everyone around you is, you don’t know you are. But to step back and look at the digital age from the long view of human history is to see that this is one of those rare times when change concatenates: A change in one place sets off a series of changes in others. With the Internet, we have seen dramatic rearrangements, in a little over a decade, in the most basic aspects of how we communicate, interact, gather knowledge of the world, develop and recognize our social networks and our communities, do business and exchange goods, understand what is true, and know what counts and is worthy of attention.

The eminent historian Robert Darnton puts our information age into perspective for us. He argues that, in all human history, there have been only four times when the very terms of human interaction and communication have been switched so fundamentally that there was no going back. When he says our digital shake-up is major, he’s comparing it to all of human history. For the first great information age of humanity, Darnton goes back to ancient Mesopotamia, around 4000 B.C., and the invention of writing. He counts movable type as responsible for the dawning of a second information age. That happened in tenth-century China and in Europe in the fifteenth century with Gutenberg. The third information age, he says, came with the invention of mass printing and machine-produced paper and ink that made cheap books and newspapers and all other forms of print available to the middle and lower classes for the first time in history. That began at the end of the eighteenth century, in the Enlightenment. And then there’s now, our very own information age, the fastest and most global of all the four great epochs in the history of human communication.12 It’s a bit startling and perhaps humbling to consider that one of the greatest transformations in human interaction is playing out across our everyday lives.

Yet as we’ve gone through enormous changes in our modes of social interaction and communication, in our attention and in the tasks we now set ourselves, our most important institutions of school and work haven’t changed much at all. That’s to be expected, perhaps. As Internet analyst Clay Shirky notes succinctly, “Institutions will try to preserve the problem to which they are the solution.”13 We rival the Cradle of Civilization (remember that?) in momentousness, and many of our institutions still look as if there’s been no digital revolution at all.

Think about our kids’ schools. My grandmother came to this country in steerage, by steamship, but when I look at the photograph of her standing tall and proud in her eighth-grade class in Chicago, surrounded by immigrants from other places, the schoolroom itself looks entirely familiar. Her classroom could be plopped down almost unchanged in any large, urban public school today. What’s more, many features of that classroom and what she learned there were structured to help her adjust to the new industrial, manufacturing-based economy she was entering. That economy, as we all know, has been transformed irrevocably by globalization and the changes wrought by the information age. If kids must face the challenges of this new, global, distributed information economy, what are we doing to structure the classroom of the twenty-first century to help them? In this time of massive change, we’re giving our kids the tests and lesson plans designed for their great-great-grandparents.

The workplace isn’t much different. Unless you happen to be employed at the famous Googleplex, the Day-Glo Google campus in Mountain View, California, your office might still look like something out of a Charles Dickens novel—or a Dilbert cartoon, which is to say the same thing. Small cubicles or offices all in a row are another feature of the industrial age and the workplace of the late nineteenth and twentieth centuries. Is this really the most effective way to work in the twenty-first?

Is it possible for a whole society to have attention blindness? I think it is. We seem to be absorbed right now in counting the equivalent of digital basketballs: fretting about multitasking, worrying over distraction, barking about all the things our kids don’t know. We’re missing the gorilla in the room. We are missing the significance of the information age that is standing right in the midst of our lives, defiantly thumping her chest. It’s not that we haven’t noticed the change. Actually, we’re pretty obsessed with it. What we haven’t done yet is rethink how we need to be organizing our institutions—our schools, our offices—to maximize the opportunities of our digital era.

We’re so busy attending to multitasking, information overload, privacy, our children’s security online, or just learning the new software program and trying to figure out if we can really live without Twitter or Foursquare, that we haven’t rethought the institutions that should be preparing us for more changes ahead. Politically, on the right and on the left, we’ve got a lot to say about whether the globalized workforce of the twenty-first century is good or bad, but in some ways the politics of globalization are beside the point. The digital age is not going anywhere. It’s not going to end and it’s not going away. So it’s long past due that we turn our attention to the institutions of school and work to see how we can remake them so they help us, rather than hold us back.

IN OTHER GREAT MOMENTS OF technological change, we’ve used education as the way to meet new challenges. After the Russians launched Sputnik 1 on October 4, 1957, at the height of the Cold War, Americans got busy and devoted enormous energy, resources, and innovation to improving schools so our kids could compete in the future. Not every educational experiment from the time worked, but educators were determined to try to find new ways relevant to the moment. Yet the information age has so far failed to produce the kind of whole-cloth rethinking of policy and institution building necessary to meet the challenge. So as we sally forth into the fourth great information revolution in the history of humanity, we’re armed with No Child Left Behind, a national “standards-based” educational policy built on standardized tests and standardized thinking, one that trumpets tradition and looks steadfastly backward more than a hundred years for its vision of the future. We haven’t done much better with many of our workplaces. A cubicle in a modern, global office is rather like the proverbial fish’s bicycle. It’s hard to imagine anything less relevant.

Keep in mind that we had over a hundred years to perfect our institutions of school and work for the industrial age. The chief purpose of those institutions was to make the divisions of labor central to industrialization seem natural to twentieth-century workers. We had to be trained to inhabit the twentieth century comfortably and productively. Everything about school and work in the twentieth century was designed to create and reinforce separate subjects, separate cultures, separate grades, separate functions, separate spaces for personal life, work, private life, public life, and all the other divisions.

Then the Internet came along. Now work increasingly means the desktop computer. Fifteen years into the digital revolution, one machine has reconnected the very things—personal life, social life, work life, and even sexual life—that we’d spent the last hundred years putting into neatly separated categories, cordoned off in their separate spaces, with as little overlap as possible, except maybe at the annual company picnic.

Home and work? One click of the mouse and I’ve moved from the office memo due in half an hour to Aunt Tilly’s latest banana bread recipe. Labor and leisure? The same batch of e-mail brings the imperious note from my boss and another wacky YouTube video from Cousin Ernie. One minute I’m checking corporate sales figures and the next I’m checking how my auction’s going on eBay, while down the hall, I’m pretty sure the sales assistant has slipped yet again from spreadsheets to bedsheets (over 50 percent of Internet porn viewing takes place on company time). Whatever training the twentieth century gave us in separating the different facets of our life and work is scuttled by a gizmo such as the iPhone, which puts the whole, dazzling World Wide Web in the palm of our hand.

Even our brain seems to have changed because of new computational capacities, and those capacities have also supplied a nifty new set of metaphors to help us comprehend the complicated neural circuitry stirring between our ears. We’re less inclined to think of the brain as a lump of gray matter lolling inside our cranium than to imagine it as an excited network of a hundred billion neurons, each connected to several thousand other neurons, and all firing several times a second, in constant weblike communication with one another all the time, even when we’re asleep or resting. If multitasking is the required mode of the twenty-first century, thank goodness we now have a hyperactive, interactive brain that’s up to the (multi)task!

THE WAY WE THINK ABOUT the brain has a lot to do with the technology of the era. Is the brain a machine, linear and orderly? Is the brain a mainframe computer, hardwired with certain fixed properties and abilities? Is the brain a network like the Internet, always changing and inherently interactive? Not surprisingly, the metaphors for the brain have grown in complexity along with the evolution of ever more complicated technologies of interconnection.

From contemporary neuroscience we now know the brain is a lot like an iPhone. It comes with certain basic communication functions bundled within it, and it has apps for just about everything. Those apps can be downloaded or deleted and are always and constantly in need of a software update. These iPhone apps represent the things we pay attention to, what counts for us, what we are interested in. Our interests shape what apps our personal iPhone has, but our interests aren’t isolated. If my best friend says, “Find me on Gowalla,” I then add the GPS-based Gowalla app so we can follow one another’s comings and goings around L.A., and before I know it, I’ve added a dozen more apps to my phone from things I’ve learned through our game play and social networking as we travel the city, separate but connected.

The brain is similar. How we use our brain (what we pay attention to) changes our brain. Those things that most capture our attention—our learning and our work, our passions and our activities—change our actual brain biology. In this way the iPhone brain also corresponds nicely with recent advances in what we know about neural plasticity, the theory that the brain adapts physically to the sensory stimuli it receives, or what psychiatrist Norman Doidge calls “the brain that changes itself.”14 This model was given real biological heft in the 1990s, when stem cells were discovered in different parts of the adult human brain. Stem cells can regenerate. They can also take on new functions—new apps—that have been lost due to damage to other parts of the brain.

This is exciting news. The first great era of brain science, the late nineteenth century, coincided with the industrial age, and it’s not surprising that brains back then had distinctive parts and hardwired functions. Early brain scientists surveyed our lobes and nodes like explorers on the frontier, creating topographical maps distinguishing this region from that and describing what natural resources could be found there. Like gold rush mining towns, parts of the brain were named after the pioneers who planted a flag there first, so to speak, such as Broca’s area, after Paul Pierre Broca, who first labeled the inferior frontal gyrus as the center of speech production. German neurologist Korbinian Brodmann identified fifty-two of these distinctive functional regions of the brain, still known as the Brodmann areas. In this view, each area has a specific function within a hierarchy, from the higher-level intellectual ordering and “executive functions” of the prefrontal cortex down to the primitive, emotional amygdala.

The brain’s development was thought to be linear too. It was thought that the brain’s capacities grew until about age twenty-five, plateaued for five or ten years, and then began to decline, first slowly, then more rapidly. It was all a bit mechanical, and that is no accident, for the image of the linear, orderly, machinelike brain, hardwired with fixed capacities and changing in a clear developmental pattern, came into being at about the same time as the assembly line and mass production.

Contemporary neuroscience insists that nothing about our brain is quite so fixed or static, including its progress and its decline. Rather, we’re constantly learning, and our mental software is being updated all the time. As we get older, we can become obsessed with what we think we may have lost, but new brain science reveals that in healthy individuals the loss is far less than what was once believed. We stay smarter longer and our capacities expand in more interesting ways than was previously thought possible. There are even productive ways (including culture shock) to force this late-learning process, in much the same way that tulips can be forced to bloom indoors in winter. As we will see, a major factor reshaping the brain is technology.

HOW WE PERCEIVE THE WORLD, what we pay attention to, and whether we pay attention with delight or alarm are often a function of the tools that extend our capabilities or intensify our interactions with the world. That expansion of human capacities can be scary. Albert Einstein famously observed that technological change is like an ax in the hands of a pathological criminal. It can wreak a lot of havoc. If you read newspapers published at the time of Broca and Brodmann, you find anxiety about the new technologies of the day. A chief concern is speed. Trains, bicycles, and especially the automobile (the “horseless carriage”) all seemed to push humans beyond their natural, God-given, biological limits. Early critics of the car, for example, simply refused to believe it could be safe because, after all, human attention and reflexes were not created to handle so much information flying past the windshield. That debate reached a crescendo in 1904, when the Hollywood film director Harry Myers was clocked rushing down the streets of Dayton, Ohio, at the death-defying speed of twelve miles per hour and received what is often cited as the first speeding ticket in the United States.15

It’s commonplace in the history of technology for people to insist that “human biology” or the “human brain” simply cannot cope with the new technology or that our minds and bodies have been put at risk by that technology. Probably people said that about the wheel. What this line of argument overlooks is that the brain is not static. It is built for learning and is changed by what it encounters and what operations it performs. Retooled by the tools we use, our brain adjusts and adapts.

Right now, we’re in a transitional moment. We are both adopting new information technologies all the time and being alarmed by them, even wondering if they are causing us harm, exceeding our human capacities. Fifteen years is a very brief time in the history of a major new technology. Basically, the Internet is still in its adolescence and so are we as users. We’ve grown up fast, but we still have much to learn. There’s a lot of room for improvement. We are experiencing growing pains.

Because we learn to pay attention differently depending on the world we see, when the world changes, there is a lot we’re suddenly seeing for the first time and even more we suspect we’re missing. So that’s a key question: How can we focus on what we do best without missing new opportunities to do better?

The starting place is recognizing the brain’s habits, really taking in what these habits mean, and then working (by ourselves and with others) to find ways to break the old patterns that no longer serve us. A friend of mine has cultivated the habit of always putting her watch on the wrong wrist when she wants to remind herself later to remember something. She says when she gets home from the office and goes to take off the watch, that moment of awareness that the watch is on the wrong wrist forces her to think, “What was it that was so important today that I was sure I would forget it later and changed my watch to remind me?” She then takes an inventory and inevitably remembers. It’s her device, her way of interrupting her habits to refocus her attention. The trick works and has application, as we will see, in individual and group situations in both everyday life and the workplace.

004

If attention suddenly has our attention, it’s because we live in a time when everything is changing so radically and so quickly that our mental software is in constant need of updating. We have heard many times that the contemporary era’s distractions are bad for us, but are they? All we really know is that our digital age demands a different form of attention than we’ve needed before. If a certain kind of attention made sense in a world where all the news came through one of three television channels, then what form of attention do you need when your primary information source is Google, where a search for information about “attention” turns up 371 million entries, and there’s no librarian in sight?

When Tim Berners-Lee invented the World Wide Web, he anticipated a new form of thinking based on process, not product: synthesizing the vast and diverse forms of information, contributing and commenting, customizing, and remixing. Do we even know how to assess this form of interactive, collaborative intelligence? Is it possible we’re still measuring our new forms of associational, interactive digital thinking with an analog stopwatch? What if kids’ test scores are declining because the tests they take were devised for an industrial world and are irrelevant to the forms of learning and knowing more vital to their own world?

By one estimate, 65 percent of children entering grade school this year will end up working in careers that haven’t even been invented yet.16 Take one of the “top ten” career opportunities of 2010: genetic counseling. Every hospital now needs this hybrid medical professional–social worker to advise on family planning, prenatal testing, and treatment protocols. The shortage of genetic counselors is considered a matter of national urgency. Before 2000, when the human genome was sequenced, such a career would have seemed the stuff of science fiction.

Nor is it just kids who face uncertain changes in their careers. My friend Sim Sitkin recently invited me to have lunch with some of his students in the Weekend Executive MBA Program at Duke’s Fuqua School of Business. This is a program for executives who have been in the workforce at least five years and are seeking to retool. They commit to a nineteen-month intensive program that meets every other Friday and Saturday. Tuition is over $100,000. You have to need something badly to make that kind of professional and financial commitment. About seventy students at a time are admitted to this top-ranked program. Sim says they come from every imaginable profession.

We were joined at our table that Saturday by five executives: a twenty-eight-year-old middle manager for an international pharmaceutical firm, a forty-year-old sales rep at an international business equipment manufacturer, a software developer originally from China, a financial analyst now managing accounts handled offshore, and a distinguished physician whose job increasingly relies on telemedicine. All told me about the different ways their occupations had changed in the last five years. The words context, global, cross-cultural, multidisciplinary, and distributed came up over and over. One person noted that his firm had been bought out by a multinational corporation headquartered in India and that’s where his boss lives and works. Learning how to communicate across cultural, linguistic, and geographical barriers by Skype is no small challenge. The physician compared his medical practice to the hub system in the airlines, with general practitioners across the region referring their most serious problems to his staff at a large research hospital, where patients were counseled virtually in preparation for a possible in-person visit. He was getting his MBA because medical school hadn’t prepared him to be a traffic controller, bringing in patients from all over, some with major problems and some minor, some referred by highly skilled generalists and others by doctors or nurse practitioners with only rudimentary knowledge. His job was to land all patients safely within an enormous hospital system, making sure not only that they were well cared for but also that their procedures were carefully synchronized across the testing, diagnostic, and financial sectors of the hospital. He hoped the executive MBA would help him navigate this bewildering new world, one that bore little relationship to the medical specialization he’d mastered fifteen years before.

Futurist Alvin Toffler insists that, if you scratch beneath the surface of anyone’s life in the twenty-first century, you will find the kinds of enormous change these executive MBA candidates experienced. Because change is our generation’s byword, he believes we need to add new literacy skills to the old three Rs of reading, writing, and arithmetic. He argues that the key literacy skill of the twenty-first century is the ability to learn, unlearn, and relearn.17 Unlearning is required when the world or your circumstances in that world have changed so completely that your old habits now hold you back. You can’t just resolve to change. You need to break a pattern, to free yourself from old ways before you can adopt the new. That means admitting the gorilla is there, even if you’re the only person in the room who does (or doesn’t) see it. It means putting the watch on your other arm. It means becoming a student again because your training doesn’t encompass the task before you. First, you have to see your present patterns; second, you have to learn how to break them. Only then do you have a chance of seeing what you’re missing.

As the attention blindness experiments suggest, learning, unlearning, and relearning require cultivated distraction, because as long as we focus on the object we know, we will miss the new one we need to see. The process of unlearning in order to relearn demands a new concept of knowledge not as a thing but as a process, not as a noun but as a verb, not as a grade-point average or a test score but as a continuum. It requires refreshing your mental browser. And it means, always, relying on others to help in a process that is almost impossible to accomplish on your own.

That’s where this book comes in. It offers a systematic way of looking at our old twentieth-century habits so that we can break them. It proposes simple methods for seeing what we’re missing and for finding the strategies that work best for the digital age. This book starts from some core questions:

Where do our patterns of attention come from?

How can what we know about attention help us change how we teach and learn?

How can the science of attention alter our ideas about how we test and what we measure?

How can we work better with others with different skills and expertise in order to see what we’re missing in a complicated and interdependent world?

How does attention change as we age, and how can understanding the science of attention actually help us along the way?

These are the central questions this book will address, and I hope that by its end we’ll have something even better than answers: We’ll have new skills and new insights that will help us address problems as they arise in our everyday life in this digital age.

As we will see, the brain is designed to learn, unlearn, and relearn, and nothing limits our capabilities more profoundly than our own attitudes toward them. It’s time to rethink everything, from our approach to school and work to how we measure progress, if we are going to meet the challenges and reap the benefits of the digital world.

We can do this, but we can’t do it by ourselves. Whether prepping for a college interview, walking into a board meeting when a hostile takeover is in the air, interviewing a new manager for a key position in the firm, or, on a personal level, going for a follow-up test after a cancer diagnosis, the best thing we can do to ensure the most successful outcome is to have a partner accompany us. But as we’ll see, it’s not enough to just have someone be there. We need a strategy for working in tandem—a method I call “collaboration by difference.” We have to talk through the situation in advance and delegate how each of us watches for certain things or keeps an eye on certain people. If we can trust our partner to focus in one direction, we can keep our attention in another, and together we can have more options than we ever imagined before: “I’ll count—you take care of that gorilla!”

Our ultimate metric is our success in the world. And that world has changed, but we have not adapted our schools, our ways of training for it, in anything like a suitable way. By working together, by rethinking how we structure the ways we live, work, and learn, we can rise to meet the exciting opportunities presented by the information age.

That’s the challenge this book addresses. Let’s get started.