Now You See It: How the Brain Science of Attention Will Transform the Way We Live, Work, and Learn - Cathy N. Davidson (2011)

Part II. The Kids Are All Right

Chapter 5. The Epic Win

Duncan Germain, a first-year teacher at Voyager Academy, a publicly funded charter school in Durham, North Carolina, teaches Creative Productions. It’s not your typical school subject (which means I like it already), so it’s sometimes hard to tell what exactly is going on in this twenty-four-year-old’s classroom. Computers form an L along the back and side of an alcove. By a row of windows, long wooden poles, thick as I-beams, lean into the corner, partially covered by a tarp. A box of Popsicle sticks sits on a desk with another computer, along with index cards, Post-it notes, rulers, and scissors, and plenty of glue guns, too. The room is full of books. Beneath a Smart Board is an unrecognizable object, clearly handmade, about three feet high and three feet wide, constructed from wooden planking cut into various lengths and then glued together, with something like pilings on each side and an irregular V-shaped gap down the middle. Sturdy screw-eye hooks top each of the pilings, as if waiting to be attached to something, although I’m not sure what. Mr. Germain starts to write. Between the pointer finger and thumb of his left hand, there’s either a tattoo or something he’s written there this morning in dark blue ink. It reads: “What would Ender do?”

Now there’s a tough question. Ender is the hero of the science fiction saga beginning with Ender’s Game (1985), by novelist Orson Scott Card. Ender is a tactician and a genius, the ultimate leader and protector of Earth. In this series set far in the future, the inhabitants of Earth must decide how to fend off invasions by hostile beings inhabiting solar systems light-years away. Ender’s childhood games prepare him to be that ultimate leader, holding the destiny of entire civilizations in his powerful command. Mr. Germain believes we all need to take our education seriously and that every child, including each sixth grader at Voyager Academy, needs to view learning as the key to a future whose destiny she holds in her own hand. Like Ender. That’s how Mr. Germain teaches, with the intensity of someone who believes that, individually and collectively, we are responsible for our civilization’s survival. When I ask him what his standard for success is, his answer is simple: “Perfection.” I’m taken aback, but as I watch the class unfold, I understand that he challenges each child to find a perfection of his or her own. No aiming for the A, B, C, or D here. The bar is set at “How high can we go?”

As his students come into the room, they all pause and read what Mr. Germain has written on the board: “What’s your specific role in your group today? What’s your job?”

They take turns grabbing markers and they scrawl answers on the board, initialing what they write. They are chatting quietly, a school rule for entering and leaving the room. They make their way to their seats, arranged in groups of four chairs around tables in two rows down the room.

While the kids are answering the questions on the board, Mr. Germain tells me he often begins a day like this, focusing their attention with questions that succinctly restate problems or dilemmas they expressed the day before. He condenses their concerns, but always as questions that require them to answer, focusing on solving what bothered them the day before. Yesterday, some of the kids were frustrated that others weren’t collaborating as well as they might have. So today’s class begins by addressing that leftover anxiety before moving on to the set of challenges to be addressed today.

Voyager Academy is a very special school. It accepts only a hundred kids a year, as determined by a lottery system, and the kids know they are lucky to be here. It’s been open only two years, so everyone here has been somewhere else, and they know the difference. They are all here by choice. At the same time, there are other forms of diversity. The proportion of kids with learning disabilities is higher at Voyager than the norm, as it is with many charter schools. Because of the smaller class size and the attention to hands-on learning, many parents who feel as if their kids have been falling behind in traditional schools try Voyager. In that respect, it is kin to Q2L, though the two schools take different approaches to reach similar ends.

As with any charter school, this one has its own forms of accountability and must abide by those, in addition to the state-approved goals it must meet in order to continue receiving public funding. It charges no tuition, and there’s a considerable waiting list of parents hoping their kids might get in. Kids at Voyager also take the standard North Carolina end-of-grade (EOG) tests, but these are treated as an obligation, an add-on; it’s as though they are not part of the educational philosophy of the school but an encumbrance imposed by regulation.

Every teacher I meet at Voyager has a story about the EOGs—what the exams fail to test; how they distract everyone from real learning; how in April everyone, from the kids to the principal, starts losing sleep over them; and how much disruption they cause to the real education happening throughout the school year. But they must be taken seriously if the school is to continue. “Failure” according to those standards isn’t an option, even though the learning method designed to produce success in standardized testing is frowned upon, in practice and principle, by everyone I meet.

What, then, does a class called Creative Productions teach, and how does it assess? Evaluation in Mr. Germain’s class is entirely test-based, but his method has almost nothing in common with the assessments of the EOG. One could almost say that the subject of Creative Productions is assessment. This class doesn’t have a set topic. Rather, the purpose is to take what the kids learn in their other classes and find ways to apply it to some kind of real-world experience. It’s a class about usefulness, about taking curriculum concepts out of the abstract. The kids come up with their own ways to work toward solutions and to test the validity of their solutions in real-world and experimental situations. They analyze their own results and write essays and formal abstracts that document those results.

The project they are working on is building a bridge from Popsicle sticks. That’s what that wood construction under the Smart Board is. It’s the “valley” that their bridge needs to span. Mr. Germain made it himself, and the students are excited because this is the first time they’ve seen it, the first chance they’ve had to test their own creations on it.

For this project, Mr. Germain has let the kids create their own groups, and they can work in a group of any size from one to five. Some kids chose to work alone, and so long as the student is able to provide a justification for working individually, Mr. Germain approves it. Part of this lesson is to analyze advantages and disadvantages of different ways of working: independent versus group thinking, delegation of tasks, how some people step up to challenges and others let you down, and how you make the group work well together. Anxieties over collaboration were at the heart of the questions Mr. Germain wrote on the board, and when he raises the subject, ten or twelve hands shoot up, and the conversation starts in earnest over the best ways of working together.

The students offer a variety of concerns. For some, working collaboratively produces information overload. They have a hard enough time managing themselves, knowing when they are over their own capacities. One little girl is jumping up and down saying loudly, “I feel jumpy today, I’m too jumpy today.” Another girl at her table looks at her sternly and says, in her best maternal voice, “Well, stop being jumpy or we won’t finish our experiment today either.” The speaker is the girl who wrote, in answer to the “What’s your job?” question, “To get us finished.” The jumpy girl wrote “Operating the glue gun” as her answer to the question, and she turns to the maternal gal at the table and says, “I’ll stop jumping and get the glue gun.” And amazingly, with no intervention from Mr. Germain, she stops jumping and gets the glue gun.

“Who else wants to talk about their roles today?” Mr. Germain asks.

“I want to revise our blueprint,” a boy says, and Mr. Germain points to a stack of blueprints. “OK, you can go get it now and work on it for a while before we start a group project,” Mr. Germain says.

Another girl has written “Leading the group” as her answer.

“That’s the hardest job of all. You know that, right?” Mr. Germain asks the tall, blond girl who has taken on that leadership role for her group. She nods seriously as her eyes slide in the direction of a boy in her group who is stretched out as far as possible while still sitting on a chair, his arms raised, his legs straight, his toes pointing. He’s yawning. Mr. Germain’s eyes follow her gaze toward the boy and then he looks back at her. “You can do it. I’ll help,” he says, his voice quiet and calm.

Throughout the class, Mr. Germain keeps one eye on the boy who was stretching. He’s doing well today, but I learn he’s smart and energetic enough to turn the class upside down with his antics. He’s been learning, lately, how to tell for himself when he’s in a disruptive mood, and he has a deal going with Mr. Germain. If he feels he cannot control himself, he’s allowed to just walk away and go work by himself at the computer. He doesn’t have to ask permission. All he needs to do is take himself out of the situation where he’ll be disruptive. It’s a public pact: Everyone knows it, including the tall girl. Mr. Germain has given this boy the responsibility of controlling himself by knowing how he can be least disruptive to the group. Today it’s working.

As Mr. Germain goes around the room asking each group to talk about the management challenges ahead, he gives me the project plan to look at. Each student fills out such a plan at the beginning of a new class project. On the project plans, students fill in name, class period, and date, and then answer the very first question: How will they complete their project—alone, with a partner, or with a toon? A toon is short for “platoon,” a term for the group one depends on for strength and ingenuity, those partners who offer the vision that you cannot, on your own, even see is missing. Though it’s a military term, Germain took the shortened version of it from Ender’s Game: It’s the way Ender organizes his troops.

The project plan is divided into sections: who, what, where, when, why, and how. The plan covers two full sides of a sheet of paper and asks questions about the materials needed and their source; the backup plan in case the students’ project plan doesn’t work; those people to whom the student will turn for feedback and help; the research sources and design of the project; and what principles require research. And then, in case I’m not impressed enough already, the sheet presents a list of three areas where the student has to provide justification for the project. I’m looking over a project plan filled out by a quiet girl with olive skin and curly brown hair. Her sheet reads:

I can justify this project because it connects to

1. Standard course of study: measurement, scale, structure, scientific method.

2. Habits of mind: thinking flexibly, asking questions, problem solving, applying past knowledge, rejecting poor solutions, repairing, revising.

3. The outside world: understanding the principles whenever we go on a bridge, whenever we need to improvise and make a bridge.

This checklist looks like something one might find in a Harvard Business School book, the answers as good as an executive might offer up. When I ask him how he came up with his project-plan form, Mr. Germain notes that his father is a management consultant who often works with businesses adjusting to the shock waves of the new, digital global economy. Of course! I should have known. A program of learning for CEOs grappling with the twenty-first century is a great lesson plan for twelve-year-olds about to face the challenge of building a bridge from wooden sticks—and vice versa. Skills these kids learn in Creative Productions will serve them for a lifetime.

In this class, the students are testing themselves and one another all the time. For the Popsicle-stick bridge project, they began by finding the best articles they could locate on the engineering principles of bridge construction for the four main kinds of bridges. They each read some articles and then wrote abstracts for their toon and their classmates to read. Based on their reading, each toon chose a bridge type and then did further research on that one kind of bridge. They then had to ask themselves what they hadn’t learned in their research and had to come up with an empirical experiment, based on scientific principles, to answer that question. Is it better to make the bridge wider or narrower? How wide or narrow? Is it better to build with two layers or one, laying the Popsicle sticks on a diagonal or horizontally? In the classroom, there are heavy iron weights to test the tensile, yield, and breaking strengths of their structures. Each toon then writes up its experiment and test results on index cards, with diagrams and conclusions.

Mr. Germain has distributed stacks of cards with their experiments to each table. The students read one another’s experiments and findings carefully, then write out a summary of the research and an analysis of whether it seems convincing or inconclusive. They make value judgments on whether the experiment was well conceived or not. And then, after doing this for three or four experiments, they confer on whether any of the empirical evidence makes them want to modify their blueprint in any way. It’s astonishing hearing them discuss the relative merits of the research methods in the various studies conducted by their classmates. They are as frenetic as any sixth graders in their manner and attitudes, but the questions they raise about scientific method would please a university lab director.

All of this activity leads up to the building of bridges that will be suspended over the model canyon built by Mr. Germain. The students will then subject their bridges to all kinds of tests to see if they break. Germain, who has won awards for the Lego battleships he has created, is building his own bridge, too. It is the standard of perfection to which the kids in the class must aspire. When all the bridges are done, the kids will decide which bridge is the best. And there’s a prize for the best: Mr. Germain has put up $100 of his own money. If his bridge is the best, he keeps the money. If one of the groups wins, that group receives it.

But that will lead to yet another lesson. How will they divide the money? Mr. Germain tells me they’ve been studying monetary systems in other classes, everything from percentages, interest rates, collateral, and so forth to whole economic systems that govern different countries around the world—capitalism, socialism, communism, exchange and barter economies versus cash and credit economies. If one of the toons wins, they will have to put these economic principles to work. One for all and all for one? Or does the team leader take the biggest reward and pay wages to the others? Do you compensate your classmates in any way? You may need their help for the next project. The tests never end in Mr. Germain’s class. Each test presents its own set of complex issues, requiring its own set of skills and knowledge. It is all part of the responsibility of winning, and the class has had long discussions about borrowing, funding bridge building, financial impacts bridges have on communities, environmental consequences, and so forth.

It’s at this point that the penny drops. In Creative Productions, each challenge leads not to an end point but to another challenge. It’s like the game mechanics of LittleBigPlanet on the snow day in my office, or like the tests Mrs. Davidson set for her kids, which always led to new tests devised by those who lost that week’s challenge. Game mechanics are based on the need to know: What you need to know in order to succeed is the most important kind of knowledge, and that’s not something someone else can tell you. You have to decide what you need to know and how to obtain that knowledge. The world isn’t multiple-choice. Sometimes defining the question is even harder than answering the question you come up with. This is as true in life as it is in Mr. Germain’s classroom.

I ask my question out loud, “You’re really operating on game mechanics here, aren’t you?” Duncan Germain smiles. He confides that he’s one of the longest-continuing American participants in something called parkour, or freerunning, a noncompetitive physical discipline that originated in France, in which participants run along a route in man-made, typically urban, environments, negotiating every obstacle using only their well-trained bodies. Duncan doesn’t look like some muscle-bound athlete, but when I watch his parkour videos on YouTube, I see him scramble up walls, jump across the gap from one rooftop to another, jump off a building and land in a roll, and pop up again to run, jump, climb, dodge, and all the rest. It’s easy to see why parkour prides itself on needing brains as much as bodies to negotiate spaces that most of us would insist could not be traversed without ladders or ropes.

The philosophy of parkour is that all the world is a challenge to be respected and met. You train in groups so you can learn from one another not only techniques but your own weaknesses. Left to your own devices, you can miss your own weaknesses until it’s too late. You rely on your training partners to help you improve your skills so you don’t take foolhardy risks. Duncan says he’s fallen seriously only once in six years, and that he didn’t hurt himself badly in the fall. Parkour requires you to pay attention to all the things you normally take for granted. It tests you in situations where nothing can be taken for granted.

As in good teaching or good parenting, the challenge of Mr. Germain’s hobby-sport is calibrated to the skills and understanding of the player, and the challenge gets harder only when you have succeeded at the easier challenge. The standard? Perfection.

At the end of a long day at Voyager Academy, the kids sit quietly in their homerooms, waiting for someone to pick them up and take them home. The school rules require silence during this hour of the day. Kids read or look out the window or fiddle, silently.

A boy, smaller than most of the others, with dusty blond hair and a habit of not looking at you when he speaks, asks if he can spend this homeroom time working on his bridge. It is elegant, intricate, and impeccable in its construction. Mr. Germain is going to have to work hard to beat this one. The boy takes a handful of Popsicle sticks over to the fabrication station and uses a mallet and a chisel to cut and bevel them in precise ways so he can use them to form a lattice of supports and struts.

I ask about the boy after he goes off to a far corner of the room to work on incorporating these new pieces into his beautiful bridge. Duncan says he was a bit of a lost child, someone who seemed not to have any talents, who was failing everything, every exam, every class, falling so far behind that there seemed to be no way to rescue him. From the way he holds himself, including how he avoids eye contact with others, I suspect that he might have something like Asperger’s syndrome or an attention disorder. But this bridge assignment has his complete attention, and the result is marvelous to behold.

Is this child a failure because he cannot begin to answer A, B, C, or D in multiple-choice tests in his other classes?

Or is he a master bridge maker?

The answer may well be both, but right now we have no system in place to recognize, foster, or build upon what this child is accomplishing.

Over the PA system, a voice says a boy’s name, and this one looks up. He seems sad, reluctant to leave, but a mother is waiting for him outside. He asks Mr. Germain if he can bring his bridge home so he can work on it there. This is a child who manages to lose homework assignment after homework assignment. Mr. Germain looks him squarely in the eyes, insisting on the eye contact. “Of course you can bring it home,” he says, calmly, “as long as you are sure you can count on yourself to bring it back again tomorrow.”

The boy hesitates. He stands up and turns in a little circle. Mr. Germain stays with him but doesn’t speak; he’s letting the boy work through this. I see the boy’s face as he looks up at his teacher. He could not be more serious, more intent. He nods.

“Yes,” he says thoughtfully and decisively. “I can count on myself.”

Right now, I can see only the back of Mr. Germain’s head, but I know that he must be smiling with pride. His student has passed a very important test. His score? Perfect.

DUNCAN GERMAIN AT VOYAGER ACADEMY may or may not have studied attention blindness, but it is clear in everything he does in his class that he has mastered the idea that you need to pay attention in different ways at different times and that no one can ever see the whole picture on his own. Extended to learning, that means instead of failure being the default, adaptation is. The issue isn’t whether you have learning disabilities because, in a larger sense, everyone does. No one does everything perfectly all the time. Given that inherent lack, it’s just not interesting or useful to label some disabilities and not others. Far better to diagnose what the issues are and then find the right tools, methods, and partners to compensate for those so that you are able to contribute in the unique ways that you can. Or as in the iPod experiment, better to set the challenge and leave the solution open-ended rather than prescriptive. That’s the parkour credo too: The way you overcome the obstacles in your environment is by adapting your movements to the environment. In freerunning, there are no abstractly right or wrong answers. The freerunner finds solutions to problems as they present themselves.

This is a very different way of thinking about human ability from the one we have perfected, measured, and scaled over the last hundred years, in which right was right and wrong was wrong and never the twain shall meet. We have spent decades building the institutions and the metrics for evaluating what constitutes “normal.” We have been so exacting in that assessment that we’ve created a system in which fewer and fewer people measure up. Recently I spent an afternoon with a teacher at an elite private high school who noted that 26 percent of the students at her school have been officially labeled as having learning disabilities, which means they all have special classes, tutors, and adaptations made by the school to help them compensate for their flawed learning styles. Twenty-six percent. The teacher chuckled ruefully and said, “Of course, the real number isn’t twenty-six percent but one hundred percent.” By that she meant that all of us have different ways of learning the world, only some of which manifest themselves in the conventional classroom setting.

I’m convinced we’re in a transitional moment and that in a decade, maybe two, we will all be seeing the “long tail” of learning styles, just as we are recognizing the long tail of consumer goods and information made available to us in every imaginable form courtesy of the World Wide Web. What is impressive about all of the extraordinary teachers I have been able to spend time with, exemplified by Mr. Germain, is how they have already begun to maximize all manner of different human talents. They see ways that we have the potential, collectively, to organize our unique abilities to some end, to some accomplishment, which will elude us if we continue to hold to a rigid, individualistic, and normative model of achievement.

We have spent a lot of time and money building elaborate systems for rewarding individual excellence and punishing individual failure, as if worth had an on/off switch. What we are beginning to see are the ways that we can take our individual energies and work toward goals that accomplish something greater than we imagined on our own. As we shall see, some visionaries are also beginning to use digital interaction to come up with new ways of working together to envision solutions to seemingly intractable problems. That next step, of imagining a possible new kind of future together, is, after all, what Ender would do.


If Jane McGonigal had been there to witness the scene with the boy and his bridge at Voyager Academy, she would have described it as an epic win. That’s gamer-speak. In Urban Dictionary, an online user-created dictionary of contemporary lingo, an epic win is defined as “an overwhelming victory over an opponent, a victory that’s been snatched from the jaws of defeat, an unexpected victory from an underdog.” Or in the words of another contributor, an epic win is “an incredible success so fantastic that long poems would be written about it in the future.”1

McGonigal is a game designer and a social evangelist who believes that “reality is broken” and that the best way to fix it is by playing games. With a PhD from the University of California–Berkeley, she has had epic wins in designing games for business, the arts, entertainment, social causes, environmental purposes, and political movements. For McGonigal, games are the solution to many of our problems; if we were all gamers, we could cure many of society’s ills. She believes that if from infancy on we all played games as if our lives depended on them—especially multiplayer collaborative games—we would learn that challenges never stop, and that it is worth risking utter failure in order to have an epic win. She believes that, once you’ve experienced an epic win, you have the confidence to be fearless and unstoppable, not just in the next game challenge, but in the world.

In real life, we don’t have enough chances to conquer our fears or to practice leadership in situations where the results aren’t potentially catastrophic, she believes. Games, though, provide an opportunity with a built-in safety net, a place where we’re not limited in the options available for exploration. Anyone who has had a leadership role in an epic win by a complex group organized online, in a challenge such as World of Warcraft, has learned much about human nature, about how people act in groups for a certain purpose, and about their own individual capacity to persuade, organize, and lead. McGonigal thinks we should try harder to organize complex, diverse, and unknown groups of people to come together for specific, challenging purposes defined by common interests and goals—something we don’t often do because we don’t believe we are capable of it. We defeat ourselves, and this self-defeat is brought on by a lifetime of bad or weak habits, reinforced by a formal education that tells us, first, that you succeed by pleasing others and, second, that making change—especially organized group change—is insolent, risky, subversive, dangerous, and guaranteed to fail. She would say one reason that we cheer for the underdog in spectator sports as in life is that, in our hearts, we have been schooled (literally) to feel like underdogs. Through game play—in which a need to know focuses our attention, motivates our search for skills and knowledge, and tells us how to deploy them—we develop the confidence to feel not like underdogs but like talented, skilled, competent winners.

Like Mr. Germain, McGonigal believes that gamers are experts at attention. They know that concentrated effort, combined with strategic and collaborative thinking that pays attention not just to the main task but to everything important happening around the peripheries as well, is what leads to great results. With the right strategy, the right planning, the right leadership, the right practice, the right experience, and the right blend of courage and concentration, you can experience epic wins, not once in a lifetime but over and over again.

McGonigal quotes Albert Einstein’s dictum, “Games are the most elevated form of investigation.” She advocates games for all forms of teaching and all forms of learning, insisting that game mechanics apply to all actions online and off, from the simplest to the most complex global social actions. She advocates using gaming to teach collective decision making and responsible action in the world, and she hopes to introduce students to the joy of activist learning.2

This might sound like a wild-eyed idea, or at the very least something of a stretch, but McGonigal has already found remarkable success in bringing her ideas to life. While in her twenties, she changed the world of marketing strategy by designing an alternate-reality game and viral marketing campaign for the 2004 release of Halo 2, at the time a cutting-edge game developed for Microsoft’s Xbox. As part of the project, called I Love Bees, McGonigal first sent jars of actual honey through the mail to fans of previous alternate-reality games (ARGs). With the jars were letters that directed the recipient to the I Love Bees Web site, where a clock was counting down for an unspecified, mysterious purpose. Simultaneously—but with no connection drawn whatsoever—movie theaters were showing trailers for Halo 2 that included the Xbox logo and the URL Xbox.com. If you went to that site, you saw a crude message on the screen, one that looked as if some hacker had dismantled the Xbox site and was forcing a redirect to a hacked site about beekeeping.

It was a puzzling and intriguing setup, and after a month—when buzz had built up to deafening levels in the ARG community—McGonigal finally leaked rumors and then confirmations about a connection between ilovebees.com and Halo 2. As more and more clues emerged over the course of several months, a growing community of ARG aficionados began working together to solve ever more elaborate mysteries that unfolded in both virtual and physical worlds, involving phone booths and cell phones, GPS systems and Web sites, and more. In the tech world, this spontaneous forming of an online community to solve a problem together is called hive mind. It’s one of the most inspiring forms of collaboration indigenous to the Internet, where people who may not even speak the same language can come to a site and work together on anything from code to design. By the time the I Love Bees Web site was finally launched, it was already drawing over a million visitors a month, a community of visitors who shared certain gaming values and interests. When Halo 2 launched, it had a huge appreciative audience waiting to buy it. The launch was an unprecedented success, and at a fraction of normal marketing costs.

McGonigal next turned her attention to the art world, helping to create audience-participation multimedia games for the Whitney Museum of American Art and the Museum of Contemporary Art, Los Angeles. She showed conclusively how the lines between art and technology, games and aesthetics, and the real world and the virtual world were blurring. She also demonstrated how artists, technology designers, curators, museums, and communities could come together to co-create and even co-curate new-media art forms in imaginative ways.

This all sounds amazing, but it pales in comparison to McGonigal’s recent work, which turned gaming strategies toward pressing real-world problems. Few issues vex our present moment quite as much as energy, and with her role in designing the simulation World Without Oil, McGonigal has helped put the strengths of massively multiplayer online gaming (MMOG) to work on the challenge of the world’s declining oil reserves.

World Without Oil launched on April 30, 2007, and concluded on June 1, 2007. It imagined the first thirty-two days of a cessation in the production of oil worldwide. Over a hundred thousand players eventually contributed to the game, completing 1,500 scenarios, all with different global and local consequences. Those scenarios are all archived and can be downloaded. They are so realistic and so well informed by research, experience, and the science and sociology of environmentalism that they are now used by educators in courses at all levels and by policy makers who use them to train for informed decision making. The World Without Oil Web site argues that, because of its success, we now have proof that “a game can tap into the awesome problem-solving capabilities of an Internet ‘collective intelligence’ and apply them to real world problems.”3

In March 2010, McGonigal launched a new game, Evoke, which the World Bank Institute described as a “ten-week crash course in changing the world.” Evoke is designed to engage hundreds of thousands of people in a select set of problems. It is a social networking game that empowers young people all over the world, but particularly in Africa, to solve problems such as food shortages or drought survival. You participate by mobile phone because even the poor in Africa have inexpensive mobile phones. Within its first three weeks, fourteen thousand people were playing Evoke. Each challenge is divided into three parts: learn, act, and imagine. Participants research a solution, act on that solution, and then imagine what the result would be if their action could be scaled up to a whole community, tribe, or nation. Participants choose winners for each challenge. What the winners receive is online mentorship from world-famous authorities in the fields of social policy and business, who give them specific advice, suggestions, and connections that inspire them to scale their idea. Game organizers hope that from among the Evoke graduates (as winners are called), a new generation of leaders will be identified, trained, encouraged, and supported as they attempt to find realistic and creative solutions to seemingly intractable problems.4

McGonigal insists that many problems are intractable only because we keep trying the same failed solutions. A massive game that virtually anyone can play offers the promise that you do not have to be rich or famous to contribute and do good in the world. It also offers the hope that small actions, when aggregated, can make a significant difference.

If reality really is broken, as McGonigal believes, can games actually fix it? Certainly social-activist gaming is an inspired form of attention—one in which the need to know is coupled with a collaborative seeing that makes visible all the possibilities around the peripheries. It exposes the unique abilities that members of the human “toon” can contribute. If we understand games in this way, then, yes, games may well be able to help us solve real-world problems in the future.


Why games? First and foremost, because right now a lot of people play them. According to a September 2008 study conducted by the Pew Internet and American Life Project, 97 percent of American kids aged twelve to seventeen play digital games.5 That’s just about all of them.

Maybe there’s a reason why games have captured so much of our kids’ attention in this digital age. For the last fifteen years the default position has been to dismiss games—typically video games—as a waste of time at best and dangerous at worst. But maybe instead of dismissing them, instead of assuming we understand the reasons behind games’ tremendous popularity among kids, we should be curious. Why games? Why now? Let’s take games seriously for a moment and see what we can learn from them.

To be sure, throughout human history, games have been considered an important tool for teaching complex principles and honing sophisticated forms of procedural thinking—in other words, a form of cognition that sees all parts of a problem, all the possibilities for winning and for losing, and that selects the best set of responses (procedures) to maximize success. Games have long been used to train concentration, to improve strategy, to learn human nature, and to understand how to think interactively and situationally. In many games, the objective is to move toward a goal while remaining aware of the action happening around the edges. Games teach the skills of winning and the skills of losing and what to do with the spoils. Procedural thinking is key in ancient strategy games such as Go or chess or in the most successful modern board game ever invented, Monopoly, released by Parker Brothers at the height of the Great Depression, in 1935, and played by around 750 million aspiring capitalists since. This type of thinking is taken to an entirely new level of complexity in massively multiplayer online role-playing games, the biggest and most important of which is World of Warcraft, released by Blizzard Entertainment in 2004 and boasting nearly 12 million monthly subscribers by fall of 2008. What we might find surprising is that World of Warcraft isn’t just for kids. As of 2010, the average age of a player was thirty-two.

Games are unquestionably the single most important cultural form of the digital age. The average American gamer has spent about ten thousand hours online by the age of twenty-one—about the same amount of time he might spend in school from fifth grade to high school graduation. That’s also the “magic number” often cited as the amount of time one needs to invest in a difficult new enterprise to go from mediocrity to virtuosity, whether playing the violin or learning a new language. Most parents would be appalled that their child could fritter away so many hours on games, but game designers like McGonigal would counter that we should be harnessing that energy in the service of learning, in the classroom and out in the world.

Given how much heat video games continue to take, McGonigal has an uphill battle. After all, aren’t video games what we blame for destroying our children, for dumbing them down and leading them astray? For over a decade, we’ve had report after report saying video games are bad for kids. Yet what’s becoming increasingly clear is that much of what we know about the impact of video games is shaped less by the games themselves than by the historical events that have colored our culture’s views of online gaming.

Until the Pew report came out in 2008, one had to go all the way back to the 1980s and 1990s to find real studies of games. By “real studies,” I mean ones that were designed to measure actual experience of game play and what those experiences and effects might be, rather than studies distorted from the outset by fears and value judgments. Like Francis Galton’s end-driven calculations, most of the game studies from the early twenty-first century are framed not to find out if games are good or bad for our children, but how bad the games are for them. In other words, there is a definite negative bias in the design of most of the classic studies of the harmful effects of video games conducted in the last decade.

But strangely, that bias is not present in studies done during the previous decade. Why is that? In a word: Columbine. That horrific school shooting, with its images of young boys armed to the teeth and dressed like assassins in violent video games, on a quest to pop off their schoolmates, brought much of the research on the cognitive, learning, and attentional benefits of video games to an abrupt halt. In the years after Columbine, virtually all studies of games began with the assumption that the effects of games are intrinsically negative; typically, the point of the study is to find out more about how pervasive this negative effect is, what repercussions it is having, and, implicitly, how knowing this can help us cure the problem. Games were in a category of known negatives, like dropping out of school, breaking and entering, or bullying.

I am not suggesting that the studies were false or wrong. It is simply that, as we know from Duncan Germain’s sixth graders at Voyager, most scientific studies begin with the statement of a problem. That’s what a hypothesis is. The study then tests that hypothesis. We know from studies of attention blindness that what we focus on is what we see. If our studies focus on the negative impact of games on individuals and groups, we will find out a lot about the relative harm caused by games on individuals and groups.

Studies of video games, in other words, tend to be structured with gaming assumed to be the known problem to be addressed, if not solved. The point of the study is not to find out everything one can possibly discover about how video games are influential. Since Columbine, we’ve spent a lot of research dollars and time looking for the negative moral, social, psychological, and even physiological effects of video games. We’ve been trying to find out how bad “the problem” is, in what ways, and what is necessary to cure it.

Yet if we go back and look at the early research on gaming from the 1980s, now mostly forgotten, we see that the whole structure of the research around video games took a different form. Back then, television was considered too passive a medium to be good for the brain. It was considered too silly to help one’s concentration and too debased or violent to be good for one’s moral development. In fact, as would later happen with games, many studies of television “proved” it was bad for brain development, attention, and morality. From this perspective, the active play of first-generation video games seemed promising. It was a potential solution to the couch-potato syndrome.

So in the 1980s we find literally dozens of reports that ask about the potential developmental, cognitive, and learning benefits of video games. And, lo and behold, these studies find answers to that question, and mostly positive answers.

These studies are based on early video games such as Space Invaders (1978) and Pac-Man (1980).6 A study from 1986, for example, shows how playing video games improves not just visual perception but visual judgment—our ability to extrapolate from what we see to an appropriate decision or action. Playing video games improves several different ways of processing visual data, enhancing peripheral visual recognition, spatial accuracy, and focus. Most striking, it does so not only during game play but also afterward, once the game has been turned off. It is as if repetitive playing of video games improves our ability to see across time and space, to see and predict patterns while we are moving, and to see things that are moving around us. It helps to train our eyes for better tracking, in other words, not just in the game itself, but in the real world too.7

Other research found improved hand-eye coordination, enhanced physical dexterity, and more precise fine motor movements as a result of game play, again not only during the game but after it. Still other studies pointed to improvement (sometimes quite dramatic) in reaction time, short-term and working memory, rapid evaluation of positive and negative inputs, and, in games played with others, cooperation and team-building skills.8 Other studies record improved depth perception and night perception, dynamic spatial skills, and the ability to translate from two- to three-dimensional judgment—a crucial skill in transforming any kind of blueprint into a physical object (whether it’s an architectural plan or a dress pattern).

Significant increases were also measured in what is called the mental rotations test, a component of IQ tests that requires extrapolating what a three-dimensional object will look like if it is rotated on its axis. That was big, exciting news back in the distant twentieth century, when gaming was being championed by numerous scientists for its potential to make us smarter.

What about attention? In the studies of the late 1980s and early 1990s, video-game players were found to respond faster and more accurately than nonplayers to both the easiest and most difficult attention-stimulus tests.9 As psychologists C. Shawn Green and Daphne Bavelier note, game playing greatly increases “the efficiency with which attention is divided.” Simply put, that means that gamers showed improved ability in what we would now call multitasking. They showed improvement relative to their own multitasking skills before they played games, and they improved relative to nongamers, too.10 These early studies also revealed that game playing makes gamers more able to respond to unexpected stimuli: a new antagonist appearing on your digital visual field, a child running out into the road in front of your car, or a sudden drop in your company’s capitalization.

In the 1980s, when studies of the positive benefits of video games were most prevalent, the term video games meant arcade games. The Internet wasn’t yet publicly available, and access to these early games was relatively limited. One often played in public and was more or less supervised. Even after the 1989 release of Nintendo’s Game Boy in the United States, game playing was not a subject of public concern. Some educators warned that too much time on the Game Boy could be “addictive,” but scientists were mainly engaged in testing the device’s potential benefits. Cognitive psychologists and physiologists emphasized the importance of reinforcing good behavior, and games offered immediate feedback for good or bad game play. Psychologists were eager to know how repetitive play with such positive reinforcement might even lead to improvements in various human capacities, such as visual perception and spatial understanding. Educators wondered if the games could be useful for teaching 3-D concepts, such as advanced mathematics. The scientific community buzzed with excitement about improvements in spatial and motor tasks from game play, and researchers were even finding beneficial physiological changes. Game play seemed to regulate glucose levels and dopamine production, and to have other positive metabolic effects. Like the arcade games before them, those Game Boys seemed all to the good.11

Although schools are notoriously slow to adopt new technologies for classroom use, educators applauded the benefits of these early games. A Game Boy was considered an acceptable purchase for parents eager to contribute to the learning abilities and well-being of their child.12 This general approval of handheld gaming devices’ benefits contributed to their sales and was aided by the fact that, in the late 1980s, Japan was synonymous with smart, next-generation, visionary electronics—and really smart kids. It was thought that America was slipping relative to Japan, and some were sure that the flashy electronics coming out of Japan were at least part of the key to that country’s ascendancy and our decline.

When you go on YouTube and look at the Nintendo commercials from the time, you notice how exuberant they are in their insistence that the Game Boy can “go anywhere”—in the back jeans pocket, inside the medical lab coat, in the thick-gloved hands of astronauts in space. A source of intelligence and power as well as fun, the commercials all imply, these devices don’t have to be just for kids. Adults can benefit from them too. Or in the tagline from a popular TV commercial of the early 1990s, “You don’t stop playing because you get old—but you could get old because you stop playing.”13

THE INTRODUCTION OF VIOLENT QUASI-REALISTIC narrative began to alter the public attitude toward video games. Cute shapes and colored boxes were suddenly replaced by first-person shooter games set in simulated real-life settings. The story lines in these games made visible and even interactive our society’s values of power, dominance, control, and violence. Games often lack nuance, rendering a society’s assumptions and prejudices in stark terms of good and evil. Who gets the points and who loses, who dies and who moves on to the next level? Some games do this with serious, educational, thoughtful intent. Others simply require shooting people who are weaker (at the game and in society) than you are.

Coupled with the hyperrealism of a new generation of shooter games, the school shooting in Columbine, Colorado, indelibly changed American attitudes toward youth, schools, and our digital age. On April 20, 1999, when Eric Harris and Dylan Klebold walked into Columbine High School and began murdering their classmates in calculated cold blood, adults raced forward to find an external cause for such terrifyingly inexplicable youth violence. It turned out that, along with a preference for the loud rock music embodied by the cross-dressing Marilyn Manson, the adolescent murderers were devotees of the first-person shooter games Wolfenstein 3D and Doom, both of which featured graphic violence that made the days of Pac-Man seem a distant past.

By the turn of the new millennium, the verdict was in. Kids coming of age in humanity’s fourth great information age were being destroyed in every way and on every level by their obsession with computer games. Although the official reports on Columbine by the Secret Service and the Department of Education emphasize both the extreme rarity of targeted school shootings (one in a million) and the absence of any one indicator that a child will move from persecuted to persecutor, the mainstream media were quick to cite video games as a cause of school violence. While the Secret Service report paid far more attention to access to weapons than to any presumed influence from violence on the news, in the papers, or in other media (whether television, movies, or video games),14 the finger-pointing at video games was intense. Was this because mass media perceived video games to be a threat to their own commercial interests? Or just an age-old tendency to blame social problems on new technology? Probably a bit of both.

The parents of some of the victims at Columbine filed a class-action lawsuit against Sony America, Atari, AOL/Time Warner, Nintendo, ID Software, Sega of America, Virgin Interactive Media, Activision, Polygram Film Entertainment Distribution, New Line Cinema, and GT Interactive Software. The complaint alleged, “Absent the combination of extremely violent video games and these boys’ incredibly deep involvement, use of and addiction to these games and the boys’ basic personalities, these murders and this massacre would not have occurred.”15 The lawsuit was eventually dismissed, on the grounds that computer games are not “subject to product liability laws,” but within the popular imagination, the causal connection has remained powerful.

By the year 2000, games were being scapegoated as the cause of many supposed youth problems. There was a general condemnation of violent video games and, more than that, a consensus that Resident Evil, Grand Theft Auto, and Final Fantasy had turned nice boys and girls into know-nothing misfits capable of pathological violence. References to serious disorders of attention and learning disabilities were part and parcel of the critique. It’s not surprising that, along with the 2002 passage of No Child Left Behind, with its emphasis on multiple-choice testing and a standardized body of required knowledge, there was also a widespread disparagement of video games and talk of banning them in schools or for kids. The language was all about containment: restriction, enforcement, toughening, regulation, and control seemed necessary for America’s wayward younger generation.

As is often the case, these social anxieties were quickly transformed into studies by the scientific community. Research funding that had previously gone to studying the potential benefits of games for attention, motor skills, and visual perception was redirected mostly toward research into the potential negative side effects of the games. Research funding from many different agencies, from the National Institutes of Health to the Department of Education to private nonprofit and philanthropic foundations, was funneled to studies of games’ harmful effects. Following Columbine, study after study posited corrosive moral, social, and attentional effects of video games—and, by extension, the Internet—on our children’s lives.

IN A TRANSFORMATION THAT HAS happened often in the last 150 years of scientific and empirical study, the paradigm had shifted. Typically, a major social or technological attitude change occurs and research follows it. Then there is another change, and a new line of research is developed in a different, sometimes competing direction. In the first decade of the twenty-first century, we see pundits claiming that technology destroys our brains, especially children’s. We see the tendency in a spate of books with alarmist titles and subtitles, such as Maggie Jackson’s Distracted: The Erosion of Attention and the Coming Dark Age, Mark Bauerlein’s The Dumbest Generation: How the Digital Age Stupefies Young Americans and Jeopardizes Our Future (or, Don’t Trust Anyone Under 30), or Nicholas Carr’s The Shallows: What the Internet Is Doing to Our Brains (the follow-up to Carr’s earlier article “Is Google Making Us Stupid?”).16 The assumption here is that kids are being ruined by technology. They are disabled, disordered—and disorderly. They can’t sit still anymore. They need special care and special treatment. They need special diagnosis, special testing.

Not surprisingly, the number of reported cases of attention disorders—and the variety of the diagnoses—soared in the United States during this decade. Some studies suggest that up to 25 percent of those in high school today have some form of attention deficit or learning disorder, or at least a “borderline” tendency toward one.17 The language in these studies is similar to that of infectious disease, with video games, like mosquitoes, passing the malarial attention infection from child to child. No one is safe.

Given these rising numbers and the hyperbolic press coverage, we have to wonder whether we really know that our kids are being overwhelmed by an attentional “epidemic” driven by video games. Based on what evidence?

Very little, actually.

First, there is little to suggest that “kids today” are more violent because of video games. There isn’t even evidence that kids today are any more violent than kids yesterday. Demographically, the opposite is the case. This is the least violent generation of kids in sixty years, perpetrating less violence against others or themselves. Because violence of both types correlates strongly with poverty, the declining numbers are even more significant if one adjusts for changing poverty rates as well. Youth violence is still too high, of course, and young people between eighteen and twenty-five remain the group most likely to meet violent deaths. But the numbers of such deaths are lower now proportionally than they were in 1990.18

The numbers have also declined for depression, mental health disorders, and self-inflicted violence. The most comprehensive longitudinal surveys are offered by UCLA’s Higher Education Research Institute, which has been surveying entering college students, hundreds of thousands of them, since 1966. The institute’s surveys indicate that the percentage of college freshmen describing themselves as “frequently depressed” has fallen from a peak of 11 percent in 1988 to 7 percent in 2005 and 2006. Suicide rates among youth fell 50 percent over the same time period, and “self-destructive deaths” from drug overdoses, gun accidents, and violent deaths classified as having “undetermined intent” declined more than 60 percent in the same population over the last thirty-five years. Other surveys indicate that this generation has the lowest rates of drug use and arrests over that same time period.

By no quantitative, comparative measure are the digital-era kids, those born after 1985, bad, endangered, vulnerable, or in other ways “at risk,” even in numbers comparable to earlier generations. The kids are all right. And 97 percent of them play games.

If kids cannot pay attention in school, it may be less because they have ADHD and more because there is a mismatch between the needs and desires of students today and a national, standards-based education built on the efficiencies of a classroom designed before World War I. Are kids being dumbed down by digital culture or by our insistence on success and failure as measured by testing geared to lower-order and largely irrelevant item-response skills? Are video games the problem, or, as I suspect, is the problem our widespread abdication of the new three Rs (rigor, relevance, and relationships) in favor of antiquated testing based on rote memorization of random facts that has little to do with the way kids actually read, write, and do arithmetic online—or will need to do it in the workplace?


Of course some video games are appalling in their violence. And of course there are horrific school shooters who play video games. If 97 percent of teens are now playing games, there is likely to be, among them, someone seriously disturbed enough to perpetrate a tragedy. Or to win eight gold medals in swimming at the 2008 Olympics. Or to do just about anything else. If nearly everyone is playing video games, we’re not talking about a social problem anymore. We’re talking about a changed environment.

The Pew study that generated the 97 percent figure is the first in-depth ethnographic study in the twenty-first century of how kids actually play games, not of how we think games affect them. It shows that kids who spend time playing video games report more, not less, time with their friends; 65 percent of teens say that they play games with others in the same room, and 76 percent claim they help others while playing and get help too. And they seem to spend more, not less, time on civic engagement than kids in the past; 44 percent report learning about social problems from their games, 52 percent say they learn about moral and ethical issues from their games, and 43 percent say games help them make decisions about how their community, city, or nation should be run. Absorption in games doesn’t contradict social life, civic engagement, focus, attention, connection with other kids, or collaboration. On the contrary, it seems to promote all of the above. As Professor Joseph Kahne, the leader of the Pew study, notes, “We need to focus less on how much time kids spend playing video games and pay more attention to the kinds of experiences they have while playing them.”19

By all statistical measures these digital natives happen to be the happiest, healthiest, most social, most civic-minded, best adjusted, and least violent and self-destructive teens since large demographic surveys began at the end of World War II. They’re having fun and learning too.20 Yet as a society, we tend to act as if we do not believe this is true: We limit our kids, test and retest them, diagnose them, medicate them, subject them to constant surveillance, and micromanage them as if they are afflicted with a debilitating national disease.

DESPITE THE TURN THINGS TOOK after Columbine, much of the research on video games done in the 1980s and 1990s has had a tremendously positive and lasting impact—just not for kids. One of the great ironies, and perhaps one could even say tragedies, of the early twenty-first century is that the fear of video games, however well intentioned, has made them virtually off limits to educators even as adults have benefited profoundly from them.

Many of the findings on the cognitive benefits of game play have been translated, with great success, from their original focus on teaching kids to the teaching of adults. For example, game mechanics have been incorporated, with very positive results, into learning simulators that have been used for over a decade to supplement professional training. Learning simulations are now commonplace in the training of architects, engineers, mechanics, airplane and ship pilots, machine operators, draftsmen, surgeons, dentists, coal miners, racecar drivers, nurses, lab scientists, astronauts, and dozens of other professionals.

The military has also been quick to employ sophisticated games for training, testing, and recruiting its members. America’s Army is widely regarded as one of the best, most exciting, and most cognitively challenging of all games. It is entertaining but it is also a learning game. The military firmly believes (and has for upwards of two decades now) that superior gamers make superior soldiers—and vice versa. The military is one sector that has continued to do research on the cognitive benefits of games, but its attention has focused, of course, on adults rather than kids.

The impressive findings from the studies of games done in the 1980s and 1990s also prompted the development of a variety of tools for so-called lifelong (i.e., adult) learning. You can now buy programs and devices based on game mechanics that will help you learn everything from a second language to musical sight reading, bird-watching, knitting, and even introductory computer skills.

There has been another powerful use of this early research as well. Several of these studies point to the far-reaching, positive effects for the elderly of playing video games. This goes back to that early television commercial warning that you don’t have to stop playing games when you get older but that you can grow old if you don’t continue to play Nintendo. Game mechanics are the basis for hundreds, perhaps thousands of simulation and rehabilitation programs used for everything from stroke recovery to postoperative adjustment after eye or ear surgery to cognitive adjustment to computerized hearing aids and cochlear implants. They’ve also been used to help patients understand and use prosthetic limbs or electronically enhanced wheelchairs.

On the nonprofessional medical level, there has been a boom industry in games designed specifically to aid the middle-aged and even the elderly in memory retention and enhancement. Brain Age and other games for oldsters promise to make your brain young again. Research suggests that these interactive, level-based games can indeed show benefits when used correctly, to stretch rather than to habituate patterns.

Imagine what schools would be like today if the design work that has gone into fighter-pilot simulators and the splendid interfaces in geriatric games had gone into challenging and stimulating learning games for schoolkids. Quest 2 Learn gives us a precious glimpse, and it is heartening that the school has grown so popular as to necessitate a lottery system. I hope the day will come soon when kids can be guaranteed an education that good without having to hit the jackpot.

But take heart: games are on the march. Since 2007, when I began codirecting the HASTAC/MacArthur Foundation Digital Media and Learning Competition, I’ve seen not just dozens but hundreds of projects that use digital means to advance and enhance classroom learning. One of the winners of our first competition recreated America’s Army as Virtual Peace. Working with the makers of America’s Army, Timothy Lenoir created a simulation game that is fast, challenging, and fun, yet focuses not on violence but on conflict resolution.21 Students, teachers, game developers, computer scientists, and policy experts worked together to develop the scenarios as well as to create templates for future games that will be available, along with their open source code, to educators and the public. To win at Virtual Peace, you have to understand all of the variables of the countries in conflict, the roles of all of those negotiating the conflicts, the culture of argument and negotiation among the parties, the languages of the parties involved, and many other factors. Lenoir is now working with the renowned global health physician-activist Paul Farmer and his organization, Partners in Health, to create a version of Virtual Peace set in Haiti. It will be used by relief workers, doctors, and educators, both from Haiti and abroad, to help them understand the challenges of rebuilding that devastated country, and it will be available for educators to use in high school civics classes.

THE TIME NOW IS RIGHT to go back and reconsider all those pre-Columbine studies. We need research on the benefits to people of all ages offered by this challenging, test-based gamer way of thinking. If gaming absorbs 97 percent of kids in informal learning, it can be an inspiring tool in schools as well—including, potentially, an assessment tool. I don’t mean measuring how well the game scores of U.S. students stack up against those of other nations. I mean that games could provide teachers, students, and parents with an ongoing way of measuring learning, with the game mechanics themselves designed to give feedback to the students about how they are doing as they are doing it. If Sackboy and Sackgirl don’t do well, they are stuck on the same level for the next round of game play. That’s boring. Any kid will tell you that avoiding boredom is a great motivator. So to escape boredom, you have to learn more, try harder, and then you not only “win,” but more important, you progress to the next level of play, with ever more scenarios, ever more challenges to entertain you but also to keep you learning.
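
To make that mechanic concrete, here is a minimal sketch, in Python, of the kind of mastery-gated feedback loop just described. It is an illustration only, not code from LittleBigPlanet or from any particular learning game; the mastery threshold, the number of challenges per level, and the toy “player” are all assumptions made for the example.

```python
import random

# Illustrative sketch of a mastery-gated game loop: the player gets feedback
# on every round and advances only after demonstrating mastery of the current
# level. Threshold, level count, and the toy "player" are assumed values.

MASTERY_THRESHOLD = 0.8  # assumed: advance once 80% of a level's challenges are solved

def play_level(level, num_challenges, attempt):
    """Run one round of a level and report the share of challenges solved."""
    solved = sum(1 for _ in range(num_challenges) if attempt(level))
    score = solved / num_challenges
    print(f"Level {level}: solved {solved}/{num_challenges} ({score:.0%})")
    return score

def run_game(num_levels, attempt):
    """Repeat a level until it is mastered; progression itself is the reward."""
    level = 1
    while level <= num_levels:
        if play_level(level, num_challenges=5, attempt=attempt) >= MASTERY_THRESHOLD:
            print(f"  Mastered level {level}; moving up.")
            level += 1
        else:
            print(f"  Not yet; try level {level} again.")  # immediate, low-stakes feedback

if __name__ == "__main__":
    # Toy "player": higher levels are harder, so early levels pass quickly
    # and later ones take several tries, as in the scenario described above.
    run_game(num_levels=3, attempt=lambda level: random.random() > 0.15 * level)
```

The design point is simply that a score is computed and reported on every attempt, and the only consequence of falling short is another, immediate try at the same level.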

We might all have benefited over the last two decades if we’d done a better job of thinking through the positive and creative ways we might actually use the technology to which we are tethered for so much of our waking lives. It’s salutary, surely, to think that this computer interfacing with my fingertips is actually improving me in ways cognitive and even social.

Games provide something else, too: flow. Mihaly Csikszentmihalyi, the distinguished psychology professor heading the Quality of Life Research Center at the Claremont Graduate University, has written extensively about the sublime happiness that emerges from certain creative activities that engage you so completely that you lose track of time and place. Flow activities, as Csikszentmihalyi calls them, assume that the brain multitasks on its own. Such activities provide multiple forms of multisensory stimuli in situations that require total mind-body activity, testing your mental, emotional, and physical skills simultaneously. As you become more proficient in these activities, your brain is suffused with blood flow, pumped through with energy, clarity, and optimism.

Csikszentmihalyi has concluded that TV watching leads, in a neuroanatomical way, to low-grade depression. By contrast, he defines as ideal flow activities playing chess, rock climbing, dancing to intensely rhythmic music (specifically, he says, rock music), and performing surgery.22 To this list, many ardent gamers would add game playing as a fifth all-consuming flow activity. To those who do not game, the teenager playing World of Warcraft may well seem distracted, inattentive, and unproductive. But when that teenager is playing well, he is in the flow and far more efficient, attentive, engaged, creative, and happy than Dad in the armchair across the room watching Fox News at the end of the workday, with the newspaper in his lap.

In a recent issue of the Harvard Business Review, two major thinkers have even argued that the teenage gamer might exhibit the skills needed for leadership in politics and commerce in our age. John Seely Brown, one of the earliest and most enduring visionaries of the information age, and Douglas Thomas, a communications professor at the University of Southern California and himself a champion gamer, have noted that multiplayer online games are “large, complex, constantly evolving social systems.” These researchers have defined five key attributes of what they term the “gamer disposition”:

1. they are bottom-line oriented (because games have embedded systems of measurement and assessment);

2. they understand the power of diversity (because success requires teamwork among those with a rich mix of talents and abilities);

3. they thrive on change (because nothing is constant in a game);

4. they see learning as fun (because the fun of the game lies in learning how to overcome obstacles); and

5. they “marinate” on the edge (because, to succeed, they must absorb radical alternatives and imbibe innovative strategies for completing tasks).23

It is hard to think of five better qualities for success in our always-on digital age.

Even if kids are natural users of the digital (and by natural I mean that they have learned how to interface with computers from a very early age), they are not perfect users. Yet game play is about working harder and harder, receiving constant feedback on your progress, and progressing to the next level when you’ve mastered the last one. It is about striving for perfection. There is no bell curve in game play. Given that motivation, it is possible that video games are an ideal preparation for the interactive, iterative, multitasking, and collaborative world into which our kids are coming of age, a world they will need to navigate, lead, and, by their leadership, transform.24

Given this generation’s interest in everything from global finance to global warming, today’s young digital thinkers could well end up saving our world. Their paradigm of success is less the old management model of business schools than the gamer ideal of daring and sometimes even gonzo strategizing—working together as a team toward innovation on the global scale. The gamer model of success is also remarkably collective, grassroots, communitarian, coordinated, and distributed. If it can be translated into the future world of work—in the form of digital and social entrepreneurship of the kind Jane McGonigal espouses—it is possible that youth will be able to strategize solutions that work and that, in the end, are also rewarding and possibly even fun.25

And their motivation is the “magic circle,” another gamer term for the way the mechanics of a great game capture your attention so completely that you never want to stop. Games resemble the endless curiosity of a small child asking, “Why?” Everything is a puzzle to be solved. Everything is a source of curiosity. Every answer leads to another question—and another quest. Can these kids who have inherited our digital age fix the realities that are broken? Do their passions and experiences prepare them for the future they have inherited and that, very soon, they will help shape?


Now a septuagenarian, Alvin Toffler is still thinking about the future, just as he was back in the 1970s. When recently asked what he thought was the “most pressing need in public education” right now, he answered, “Shut down the public education system.” He noted that Bill Gates thinks the same thing.26

I’m not sure if we need to shut our schools down, but we need a time-out and we need a time of systematic unlearning. We need to rethink what we want from education from the bottom up. We need to build a system of education that prepares our kids for the twenty-first century, not the twentieth.

Everything about education today is based on a hunt-and-peck mode of understanding—A, B, C, D, all of the above, none of the above. As we’ve seen, compared to the creative ways the best teachers teach, our classrooms are generally among the most intellectually straitjacketed environments our kids enter. What a missed opportunity! Graded, segregated by topics and subjects, and with EOG assessments based on multiple-choice answers and rote memorization of facts, most education is still stuck, institutionally and instructionally, in the ideology and the methodology of the industrial age. We have done little to transform education in view of the world of information at our kids’ fingertips. We’ve done little to make them understand, like those kids playing LittleBigPlanet on the snow day, that exuberant, engaged fun also counts as learning. We’re underestimating our children. We are shortchanging rather than challenging them.

If kids are enthralled with games, why not make exciting ones that also aid and inspire learning? Good learning is about inspiring curiosity. Good gaming is the same. We fret a lot about how our kids are falling behind, especially in science and math. Yet fostering standardization is inimical to the very scientific, inquiry-based, deductive and inductive reasoning skills that good science demands. With ever more rigid standards, all we are doing is killing the spark of curiosity that virtually every child brings into the kindergarten classroom. I would argue that we are especially strangling the exploratory, learn-by-doing, testing, retesting, modifying, rejecting, revising, and trying again form of thinking that is the scientific method.

Game play—procedural, strategic thinking—is far more conducive to inspired science making than is cramming for end-of-grade tests. The more we learn about children’s cognitive development, the more we realize that kids have far greater complexity of thought at earlier ages than we previously assumed. Game play requires not only complex social and conceptual mapping but also the mathematical skills that we (in the West) previously thought could not be learned until adolescence. New work coming out of a series of preschools being pioneered in some of the poorest urban areas of cities including Boston, Washington, and Nashville has been showing that conventional ideas about children’s readiness to learn geometry, reading, language, and even self-control in the classroom are simply wrong.27 Cognitive neuroscientists have shown that kids can learn these concepts as long as the concepts are incorporated into inquiry-based learning projects, like learning games.

This really should not be so shocking, for in countries like China or India, high-level math-based methods are taught as part of rhyming games even at the toddler stage. The math then becomes more and more complex as the child plays the games more often and becomes more adept at them. There are verbal and physical variations on Nim, the ancient mathematical strategy game that began in Asia (some say China, others India) and that moved throughout the world the way cinnamon and indigo did, as part of ancient intercultural flows. In Nim, players take turns removing objects from a heap, each trying to force the opponent to take the last piece. Complex calculating skills are also taught in the play of three- and four-year-olds among many indigenous peoples in Latin America and elsewhere, often using systems of knots made in cords, with accompanying songs, dances, and other forms of play.
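
For readers who want to see the arithmetic, here is a minimal sketch, in Python, of one single-heap variant of Nim of the kind described above, in which each player may remove one, two, or three objects and tries to force the opponent to take the last piece. The one-to-three rule and the starting heap of fifteen are assumptions made for the illustration; the winning trick, leaving the other player a pile of the form 4k + 1, is the sort of counting strategy a child can absorb simply by playing the game often enough.

```python
# Illustrative sketch of a single-heap Nim variant: remove 1-3 objects per
# turn; whoever takes the last piece loses. The take-1-to-3 rule and the
# starting pile of 15 are assumptions made for this example.

MAX_TAKE = 3

def best_move(heap):
    """Take just enough to leave the opponent a pile of the form 4k + 1."""
    surplus = (heap - 1) % (MAX_TAKE + 1)
    return surplus if surplus > 0 else 1  # already a losing pile: take 1 and hope

def play(heap):
    """Two players using the same strategy alternate until the pile is gone."""
    player = 1
    while heap > 0:
        take = best_move(heap)
        heap -= take
        print(f"Player {player} takes {take}, leaving {heap}")
        if heap == 0:
            print(f"Player {player} took the last piece and loses.")
        player = 3 - player  # switch between player 1 and player 2

play(15)  # 15 is not of the form 4k + 1, so the first player can force a win
```

The point, of course, is not the code but that the whole strategy amounts to counting in fours, well within the reach of a young player who has met it as a game rather than as a lesson.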

The key component in these examples, as with all the examples of excellent education we’ve seen, is learning as play and tests as challenges. What better gift could we give our children than to learn to enjoy as they grapple, to aspire higher when they stumble, to find satisfaction as the challenges become ever greater? As the adage goes, the person who loves to work never has to.

All over America there are exceptional teachers who, against the odds, are trying to find ways to motivate and inspire kids even as they have to prepare them for the end-of-grade tests that embody the antithesis of the motivated, inspired, challenge-based, interactive learning that students experience in their daily digital lives. The HASTAC method of collaboration by difference, which we used in the iPod experiment and which I used in This Is Your Brain on the Internet, elevates to a university-level pedagogical method the chief exploratory and interactive mode of the digital age. It’s the same method Katie Salen uses at the Quest 2 Learn gaming school in New York, the same method used by the late Inez Davidson in her three-room schoolhouse in Mountain View, Alberta, and by Duncan Germain in Creative Productions. We can learn from all these teachers.

Our kids are all right. Despite our national educational policy, many kids are getting the kind of dexterous, interactive learning they need. Online, in games and in their own creative information searching, they are learning skills that cannot be replaced one day by computers. They are mastering the lessons of learning, unlearning, and relearning that are perfectly suited to a world where change is the only constant. They are mastering the collaborative methods that not only allow them to succeed in their online games but that are demanded more and more in the workplace of their future.