How We Decide - Jonah Lehrer (2009)

Coda

There are certain statistics that seem like they'll never change: the high school dropout rate, the percentage of marriages that end in divorce, the prevalence of tax fraud. The same used to be true of plane crashes that were due to pilot error. Despite a long list of aviation reforms, from mandatory pilot layovers to increased classroom training, that percentage refused to budge from 1940 to 1990, holding steady at around 65 percent. It didn't matter what type of plane was being flown or where the plane was going. The brute fact remained: most aviation deaths were due to bad decisions in the cockpit.

But then, starting in the early 1990s, the percentage of crashes attributed to pilot error began to decline rapidly. According to the most current statistics, mistakes by the flight crew are responsible for less than 30 percent of all plane accidents, with a 71 percent reduction in the number of accidents caused by poor decision-making. The result is that flying has become safer than ever. According to the National Transportation Safety Board, flying on a commercial plane has a fatality rate of 0.04 per one hundred million passenger miles, making it the least dangerous form of travel by far. (In contrast, driving has a fatality rate of 0.86.) Since 2001, pilot error has caused only one fatal jetliner crash in the United States, even though more than thirty thousand flights take off every day. The most dangerous part of traveling on a commercial airplane is the drive to the airport.

What caused the dramatic reduction in pilot error? The first factor was the introduction in the mid-1980s of realistic flight simulators. For the first time, pilots could practice making decisions. They could refine their reactions to a sudden downdraft in a thunderstorm and practice landing with only one engine. They could learn what it would be like to fly without wing flaps and to land on a tarmac glazed with ice. And they could do all this without leaving the ground.

These simulators revolutionized pilot training. "The old way of teaching pilots was the 'chalk and talk' method," says Jeff Roberts, the group president of civil training at CAE, the largest manufacturer of flight simulators. Before pilots ever entered the cockpit, they were forced to sit through a long series of classroom lectures. They learned all the basic maneuvers of flight while on the ground. They were also taught how to react in the event of various worst-case scenarios. What should you do if the landing gear won't deploy? Or if the plane is struck by lightning? "The problem with this approach," Roberts says, "is that everything was abstract. The pilot has this body of knowledge, but they'd never applied it before."

The benefit of a flight simulator is that it allows pilots to internalize their new knowledge. Instead of memorizing lessons, a pilot can train the emotional brain, preparing the parts of the cortex that will actually make the decision when up in the air. As a result, pilots who are confronted with a potential catastrophe during a real flight—like an engine fire in the air above Tokyo—already know what to do. They don't have to waste critical moments trying to remember what they learned in the classroom. "A plane is traveling four hundred miles per hour," Roberts says. "It's the rare emergency when you've got time to think about what your flight instructor told you. You've got to make the right decision right away."

Simulators also take advantage of the way the brain learns from experience. After pilots complete their "flight," they are forced to endure an exhaustive debriefing. The instructor scrutinizes all of their decisions, so that the pilots think about why, exactly, they decided to gain altitude after the engine fire, or why they chose to land in the hailstorm. "We want pilots to make mistakes in the simulator," Roberts says. "The goal is to learn from those mistakes when they don't count, so that when it really matters, you can make the right decision." This approach targets the dopamine system, which improves itself by studying its errors. As a result, pilots develop accurate sets of flight instincts. Their brains have been prepared in advance.

There was one other crucial factor in the dramatic decline of pilot error: the development of a decision-making strategy known as Cockpit Resource Management (CRM). The impetus for CRM came from a large NASA study of pilot error in the 1970s, which concluded that many cockpit mistakes were attributable, at least in part, to the "God-like certainty" of the pilot in command. If other crew members had been consulted, or if the pilot had considered other alternatives, then some of the bad decisions might have been avoided. As a result, the goal of CRM was to create an environment in which a diversity of viewpoints was freely shared.

Unfortunately, it took a tragic crash in the winter of 1978 for airlines to decide to implement this new system. United Flight 173 was a crowded DC-8 bound for Portland, Oregon. About ten miles from the runway, the pilot lowered the landing gear. He noticed that two of his landing-gear indicator lights remained off, suggesting that the front wheels weren't properly deployed. The plane circled around the airport while the crew investigated the problem. New bulbs were put in the dashboard. The autopilot computers were reset. The fuse box was double-checked. But the landing-gear lights still wouldn't turn on.

The plane circled for so long that it began to run out of fuel. Unfortunately, the pilot was too preoccupied with the landing gear to notice. He even ignored the flight engineer's warning about the fuel levels. (One investigator described the pilot as "an arrogant S.O.B.") By the time the pilot looked at his gas gauge, the engines were beginning to shut down. It was too late to save the plane. The DC-8 crash-landed in a sparsely populated Portland suburb, killing ten and seriously wounding twenty-four of the 189 on board. Crash investigators later concluded that there was no problem with the landing gear. The wheels were all properly deployed; it was just a faulty circuit.

After the crash, United trained all of its employees with CRM. The captain was no longer the dictator of the plane. Instead, flight crews were expected to work together and constantly communicate with one another. Everyone was responsible for catching errors. If fuel levels were running low, then it was the job of the flight engineer to make sure the pilot grasped the severity of the situation. If the copilot was convinced that the captain was making a bad decision, then he was obligated to dissent. Flying a plane is an extremely complicated task, and it's essential to make use of every possible resource. The best decisions emerge when a multiplicity of viewpoints is brought to bear on the situation. The wisdom of crowds also applies in the cockpit.

Remember United Flight 232, which lost all hydraulic power? After the crash-landing, the pilots all credited CRM with helping them make the runway. "For most of my career, we kind of worked on the concept that the captain was the authority on the aircraft," says Al Haynes, the captain of Flight 232. "And we lost a few airplanes because of that. Sometimes the captain isn't as smart as we thought he was." Haynes freely admits that he couldn't have saved the plane by himself that day. "We had 103 years of flying experience there in the cockpit [on Flight 232], trying to get that airplane on the ground. If I hadn't used CRM, if we had not had everybody's input, it's a cinch we wouldn't have made it."

In recent years, CRM has moved beyond the cockpit. Many hospitals have realized that the same decision-making techniques that can prevent pilot error can also prevent unnecessary mistakes during surgery. Consider the experience of the Nebraska Medical Center, which began training its surgical teams in CRM in 2005. (To date, more than a thousand hospital employees have undergone the training.) The mantra of the CRM program is "See it, say it, fix it"; all surgical-team members are encouraged to express their concerns freely to the attending surgeon. In addition, team members engage in postoperation debriefings at which everyone involved is supposed to share his or her view of the surgery. What mistakes were made? And how can they be avoided the next time?

The results at the Nebraska Medical Center have been impressive. A 2007 analysis found that after fewer than six months of CRM training, the percentage of staff members who "felt free to question the decisions of those with more authority" had gone from 29 percent to 86 percent. More important, this increased willingness to point out potential errors led to a dramatic decrease in medical mistakes. Before CRM training, only around 21 percent of all cardiac surgeries and cardiac catheterizations were classified as "uneventful cases," meaning that nothing had gone wrong. After CRM training, however, the number of "uneventful cases" rose to 62 percent.

The reason CRM is so effective is that it encourages flight crews and surgical teams to think together. It deters certainty and stimulates debate. In this sense, CRM creates the ideal atmosphere for good decision-making, in which a diversity of opinions is openly shared. The evidence is looked at from multiple angles, and new alternatives are considered. Such a process not only prevents mistakes but also leads to startling new insights.

To sit in a modern airplane cockpit is to be surrounded by computers. Just above the windshield are the autopilot terminals, which can keep a plane on course without any input from the pilot. Right in front of the thrust levers is a screen relaying information about the state of the plane, from its fuel levels to the hydraulic pressure. Nearby is the computer that monitors the flight path and records the position and speed of the plane. Then there's the GPS panel, a screen for weather updates, and a radar monitor. Sitting in the captain's chair, you can tell why it's called the glass cockpit: everywhere you look there's another glass screen, the digital output of the computers underneath.

These computers are like the emotional brain of the plane. They process a vast amount of information and translate that information into a form that can be quickly grasped by the pilot. The computers are also redundant, so every plane actually contains multiple autopilot systems running on different computers and written in different programming languages. Such diversity helps prevent mistakes, since each system is constantly checking itself against the other systems.

These computers are so reliable that they perform many of their tasks without any pilot input. If, for example, the autopilot senses a strong headwind, it will instantly increase thrust in order to maintain speed. The pressure in the cabin is seamlessly adjusted to reflect the altitude of the plane. If a pilot is flying too close to another plane, the onboard computers emit loud warning sirens, forcing the flight crew to notice the danger. It's as if the plane has an amygdala.

Pilots are like the plane's prefrontal cortex. Their job is to monitor these onboard computers, to pay close attention to the data on the cockpit screens. If something goes wrong, or if there's a disagreement among the various computers, then it's the responsibility of the flight crew to resolve the problem. The pilots must immediately intervene and, if necessary, take control of the plane. The pilots must also set the headings, supervise the progress of the flight, and deal with the inevitable headaches imposed by air-traffic control. "People who aren't pilots tend to think that when the autopilot is turned on, the pilot can just take a nap," my flight instructor in the simulator says. "But planes don't fly themselves. You can't ever relax in the cockpit. You always have to be watching, making sure everything is going according to plan."

Consider the cautionary tale of a crowded Boeing 747 traveling from Miami to London in May 2000. The runway at Heathrow was shrouded in dense fog, so the pilots decided to make an automated landing, or what's known as a category IIIc approach. During the initial descent, all three autopilot systems were turned on. However, when the plane reached an altitude of a thousand feet, the lead autopilot system suddenly shut down for no apparent reason. The pilots decided to continue with the approach, since the 747 is designed to be able to make automated landings with only two autopilot systems. The descent went smoothly until the plane was fifty feet above the runway, or about four seconds from touchdown. At that point, the autopilot abruptly tilted the nose of the plane downward, so that its rate of descent was four times faster than normal. (Investigators would later blame a programming error for the mistake.) The pilot quickly intervened and yanked back on the control column so that the plane wouldn't hit the runway nose first. The landing was still rough—the plane suffered some minor structural damage—but the quick reactions of the flight crew prevented a catastrophe.

Events like this are disturbingly common. Even redundant autopilot systems will make mistakes. They'll disengage or freeze or steer the plane in dangerous ways. Unless a pilot is there to correct the error, to turn off the computer and pull up the nose, the plane will fly itself into the ground.

Of course, pilots aren't perfect either. They sometimes fail to notice when they're getting too close to another plane, or they struggle to monitor all the different gauges in the cockpit. In fact, if pilots had to rely on their own instincts, they wouldn't even be able to fly through clouds. (The inner ear can't detect blind turns, which means that it's very tough to fly straight without proper instruments or visual cues.) Then there are the pilots who micromanage the flight—constantly overruling the autopilot or fiddling with the path of the plane. They dramatically increase the likelihood of human error, acting like people who rely too heavily on their prefrontal cortices.

When the onboard computers and the pilot interact properly, the result is an ideal model for decision-making. The rational brain (the pilot) and the emotional brain (the cockpit computers) exist in perfect equilibrium, each system focusing on those areas in which it has a comparative advantage. The reason planes are so safe, even though both the pilot and the autopilot are fallible, is that both systems are constantly working to correct each other. Mistakes are fixed before they spiral out of control.

The payoff has been huge. "Aviation is just about the only field that consistently manages to operate at the highest level of performance, which is defined by six sigma," Roberts says, using the managerial buzzword for any process that produces fewer than 3.4 defects per one million opportunities. "Catastrophic error in planes is incredibly, incredibly rare. If it wasn't, nobody would ever get on board. The fact of the matter is that the aviation industry needs to be perfect, and so we found ways to be as close to perfect as humanly possible."

The safety of flight is a testament to the possibility of improvement. The reduction in the pilot-error rate is a powerful reminder that mistakes are not inevitable, that planes don't have to crash. As the modern cockpit demonstrates, a few simple innovations and a little self-awareness can dramatically improve the way people think, so that both brain systems are used in their ideal context. The aviation industry took decision-making seriously—it made a science of pilot error—and the result has been a stunning advance in performance.

The first step to making better decisions is to see ourselves as we really are, to look inside the black box of the human brain. We need to honestly assess our flaws and talents, our strengths and shortcomings. For the first time, such a vision is possible. We finally have tools that can pierce the mystery of the mind, revealing the intricate machinery that shapes our behavior. Now we need to put this knowledge to work.