
How We Decide - Jonah Lehrer (2009)

Chapter 3. Fooled by a Feeling

Ann Klinestiver was working as a high school English teacher in a small town in West Virginia when she was diagnosed with Parkinson's disease. She was only fifty-two years old, but the symptoms were unmistakable. While she was standing at the front of her class trying to teach her students some Shakespeare, her hands started to shake uncontrollably. Then her legs went limp. "I lost control of my own body," she says. "I'd look at my arm, and I'd tell it what to do, but it just wouldn't listen."

Parkinson's is a disease of the dopamine system. It begins when dopamine neurons start to die in a part of the brain that controls the body's movements. Nobody knows why these cells die, but once they are gone, the loss is irrevocable. By the time the symptoms of Parkinson's appear, more than 80 percent of these neurons will be dead.

Ann's neurologist immediately put her on Requip, a drug that imitates the activity of dopamine in the brain. (It's part of a class of drugs called dopamine agonists.) While there are many different treatments for Parkinson's patients, all operate on a similar principle: increase the amount of dopamine in the brain. By making the few surviving dopamine neurons more effective at transmitting dopamine, these medicines help compensate for the massive cell death. They allow a faint electrical signal to break through the ravages of the disease. "At first, the drug was like a miracle," Ann says. "All my movement problems just disappeared." Over time, however, Ann was forced to take higher and higher doses of Requip in order to quiet her tremors. "You can feel your brain going," she says. "I became completely dependent on this drug just to get myself out of bed and put on my clothes. I needed it to live my life."

That's when Ann discovered slot machines. It was an unlikely attraction. "I'd never been interested in gambling," Ann says. "I'd always avoided casinos. My daddy was a Christian, and he raised me to believe that gambling was a sin, that it was something you were never supposed to do." But after she started taking the dopamine agonist, Ann found the slots at her local dog-racing track completely irresistible. She started gambling as soon as the track opened, at seven in the morning, and kept playing the machines until three thirty the next morning, when the security guards kicked her out. "Then I would go back home and gamble on the Internet until I could get back to the real machines," she says. "I was able to keep that up for two or three days at a time." After each of her gambling binges, Ann always swore to stay away. Sometimes, she was even able to stop gambling for a day or two. But then she'd find herself back at the racetrack, sitting in front of the slot machine, gambling away everything she had.

After a year of addictive gambling, Ann had lost more than $250,000. She had exhausted her retirement savings and emptied her pension fund. "Even when I had no money left, I still couldn't stop gambling," she says. "I was living on peanut butter, straight from the jar. I sold everything I could sell. My silverware, my clothes, my television, my car. I pawned my diamond ring. I knew I was destroying my life, but I just couldn't stop. There's no worse feeling than that."

Ann's husband eventually left her. He promised to return if she got control of her gambling habit, but Ann kept relapsing. He would find her at the track in the middle of the night, hunched in front of a slot machine, a bucket of coins in her lap and a bag of groceries on the floor. "I was a shell of a person," she says. "I stole quarters from my grandkids. I lost everything that mattered."

In 2006, Ann was finally taken off her dopamine agonist. Her movement problems came back, but the gambling compulsion immediately disappeared. "I haven't gambled in eighteen months," she says, with more than a little pride in her voice. "I still think about the slots, but the obsession isn't there. Without the drug, I don't need to play those damn machines. I'm free."

Klinestiver's sad story is disturbingly common. Medical studies suggest that as many as 13 percent of patients taking dopamine agonists develop severe gambling compulsions. People with no history of gambling suddenly become addicts. While most of these people obsess over slot machines, others get hooked on Internet poker or blackjack. They squander everything they have on odds that are stacked against them.*

Why does an excess of dopamine in a few neurons make games of chance so irresistible? The answer reveals a serious flaw in the human brain, which casinos have learned to exploit. Think how a slot machine works: You put in a coin and pull the lever. The reels start to whir. Pictures of cherries and diamonds and figure sevens fly by. Eventually, the machine settles on its verdict. Since slot machines are programmed to return only about 90 percent of wagered money over the long term, chances are you lost money.

Now think about the slot machine from the perspective of your dopamine neurons. The purpose of these cells is to predict future events. They always want to know what occurrences—a loud tone, a flashing light, and so forth—will precede the arrival of the juice. While you are playing the slots, putting quarter after quarter into the one-armed bandit, your neurons are struggling to decipher the patterns inside the machine. They want to understand the game, to decode the logic of luck, to find the circumstances that predict a payout. So far, you're acting just like a monkey trying to predict when his squirt of juice is going to arrive.

But here's the catch: while dopamine neurons get excited by predictable rewards—they increase their firing when the juice arrives after the loud tone that heralded it—they get even more excited by surprising ones. According to Wolfram Schultz, such unpredictable rewards are typically three to four times more exciting, at least for dopamine neurons, than rewards that can be predicted in advance. (In other words, the best-tasting juice is the juice that was most unexpected.) The purpose of this dopamine surge is to make the brain pay attention to new, and potentially important, stimuli. Sometimes this cellular surprise can trigger negative feelings, such as fear, as happened to Lieutenant Commander Michael Riley. In the casino, however, the sudden burst of dopamine is intensely pleasurable, since it means that you've just won some money.

Most of the time, the brain will eventually get over its astonishment. It'll figure out which events predict the reward, and the dopamine neurons will stop releasing so much of the neurotransmitter. The danger of slot machines, however, is that they are inherently unpredictable. Because they use random number generators, there are no patterns or algorithms to uncover. (There is only a stupid little microchip churning out arbitrary digits.) Even though the dopamine neurons try to make sense of the rewards—they want to know when to expect some coins in return for all those squandered quarters—they keep getting surprised.
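One way to see why the surprise never fades is a toy reward-prediction model in the spirit of Schultz's prediction-error findings. This is purely my own illustrative sketch, not code from the research: a running estimate of the expected reward is nudged toward each outcome, and the "surprise" is the gap between prediction and reality. For a cued, reliable reward the gap shrinks to nothing; for a slot machine's random payouts it never does.

```python
import random

def surprise_over_time(rewards, alpha=0.1):
    """Track the prediction error as a value estimate learns from each reward."""
    v, errors = 0.0, []
    for r in rewards:
        delta = r - v              # prediction error: actual minus expected
        errors.append(abs(delta))
        v += alpha * delta         # nudge the estimate toward the reward
    return errors

random.seed(0)
predictable = [1.0] * 200                       # juice reliably follows the tone
slot = [1.0 if random.random() < 0.1 else 0.0   # rare, random payout
        for _ in range(200)]

tone_errors = surprise_over_time(predictable)   # surprise shrinks toward zero
slot_errors = surprise_over_time(slot)          # surprise never settles down
```

The predictable stream is "solved" within a few dozen trials, while the random stream keeps generating large errors forever; in this cartoon, that persistent error is the slot machine's hook.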

At this point, the dopamine neurons should just surrender: the slot machine is a waste of mental energy. They should stop paying attention to the surprising rewards, because the appearance of the rewards will always be surprising. But this isn't what happens. Instead of getting bored by the haphazard payouts, the dopamine neurons become obsessed. When you pull the lever and get a reward, you experience a rush of pleasurable dopamine, precisely because the reward was so unexpected, because your brain cells had no idea what was about to happen. The clanging coins and flashing lights are like a surprise squirt of juice. Because the dopamine neurons can't figure out the pattern, they can't adapt to the pattern. The result is that you are transfixed by the slot machine, riveted by the fickle nature of its payouts.

For Parkinson's patients on dopamine agonists, the surprising rewards of the casino trigger a massive release of chemical bliss. Their surviving dopamine neurons are so full of dopamine that the neurotransmitter spills over and pools in the empty spaces between cells. The brain is flooded with a feel-good chemical, making these games of chance excessively seductive. Such patients are so blinded by the pleasures of winning that they slowly lose everything. That's what happened to Ann.

The same science that revealed the importance of emotions to making decisions—Tom Brady finds the open man by listening to his feelings—is also beginning to show us the dark side of feeling too deeply. While the emotional brain is capable of astonishing wisdom, it's also vulnerable to certain innate flaws. These are the situations that cause the horses in the human mind to run wild, so that people gamble on slot machines and pick the wrong stocks and run up excessive credit card bills. When emotions get out of control—and there are certain things that reliably make this happen—the results can be just as devastating as not having any emotions at all.

1

In the early 1980s, the Philadelphia 76ers were one of the greatest teams in NBA history. The center of the team was Moses Malone, voted Most Valuable Player in the league. He dominated the paint, averaging twenty-five points and fifteen rebounds per game. The power forward was Julius Erving, a future Hall of Famer, who pioneered the modern style of basketball play with his elegant drives and extravagant slam dunks. In the backcourt were Andrew Toney—his accurate jump shot was a constant offensive threat—and Maurice Cheeks, one of the league leaders in assists and steals.

The 76ers entered the 1983 playoffs with the best record in the NBA. Before the first round of the postseason, a reporter asked Malone what the 76ers thought of their competition. His answer made headlines: "Four, four, four," he said, suggesting that the team would sweep all of their opponents. That had never been done before.

Malone's audacious prediction wasn't far off. During the playoffs, the 76ers were like a scoring machine. The offense ran through Malone in the post, but if Malone was double-teamed he simply swung the ball over to Erving or kicked it out to Toney for a jumper. At times, the players seemed incapable of missing shots. On their way to the championship, the 76ers lost only one game, in the second round to Milwaukee. A slightly amended version of Malone's prediction was inscribed on the championship rings: "Fo, Fi, Fo." It was one of the most dominant team performances in basketball history.

While the 76ers were prevailing in the postseason, the psychologists Amos Tversky and Thomas Gilovich were thinking about the imperfections of the human mind. Tversky would later recall watching the NBA games and hearing the television announcers talk about various kinds of streaks. For instance, the sportscasters alluded to the "hot hand" of Julius Erving and said that Andrew Toney was "in the zone." By the time the 76ers reached the NBA finals, the temperature of the team had become a cliché. How could they possibly lose when they were on such a roll?

But all this talk of hot hands and streaks made Tversky and Gilovich curious. Had Moses Malone really become so unstoppable? Could Andrew Toney really not miss a shot? Were the 76ers really as invincible as everyone said? So Tversky and Gilovich decided to conduct a little research experiment. Their question was simple: do players make more shots when they are hot, or do people just imagine that they make more shots? In other words, is the hot hand a real phenomenon?

Tversky and Gilovich began the investigation by sifting through years of 76er statistics. They looked at every single shot taken by every single player and then recorded if that shot had been preceded by a string of hits or misses. (The 76ers were one of the few NBA teams that kept track of the order in which shots were taken.) If the hot hand was a real phenomenon, then a hot player should have a higher field-goal percentage after making several previous shots. The streak should elevate his game.

So what did the scientists find? There was absolutely no evidence of the hot hand. A player's chance of making a shot was not affected by whether or not his previous shots had gone in. Each field-goal attempt was its own independent event. The short runs experienced by the 76ers were no different than the short runs that naturally emerge from any random process. Taking a jumper was like flipping a coin. The streaks were a figment of the imagination.

The 76ers were shocked by the evidence. Andrew Toney, the shooting guard, was particularly hard to convince: he was sure that he was a streaky shooter who went through distinct hot and cold periods. But the statistics told a different story. During the regular season, Toney made 46 percent of all his shots. After hitting three shots in a row—a sure sign that he was "in the zone"—Toney's field-goal percentage dropped to 34 percent. When Toney thought he was hot, he was actually freezing cold. And when he thought he was cold, he was just getting warmed up: after missing three shots in a row, Toney made 52 percent of his shots, which was significantly higher than his normal average.
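The statistical point is easy to check with a toy simulation (my own sketch, not the researchers' code): model a shooter like Toney as a 46-percent coin flip on every attempt, with no streakiness built in at all, and then measure his shooting percentage immediately after three straight makes. It stays pinned to the base rate, even though runs of makes and misses appear constantly.

```python
import random

def simulate_shooter(p_make=0.46, n_shots=100_000, seed=1):
    """Independent Bernoulli shots; measure FG% right after three straight makes."""
    rng = random.Random(seed)
    shots = [rng.random() < p_make for _ in range(n_shots)]
    # Attempts that immediately follow a three-make "hot" streak.
    after_streak = [shots[i] for i in range(3, n_shots)
                    if shots[i-3] and shots[i-2] and shots[i-1]]
    return sum(shots) / n_shots, sum(after_streak) / len(after_streak)

overall, post_streak = simulate_shooter()
# Both percentages hover near 0.46: the streaks carry no predictive information.
```

Under this null model, "hot hands" show up all the time by pure chance, which is exactly why the announcers saw them everywhere.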

But maybe the 76ers were a statistical outlier. After all, according to a survey conducted by the scientists, 91 percent of serious NBA fans believed in the hot hand. They just knew that players were streaky. So Tversky and Gilovich decided to analyze another basketball team: the Boston Celtics. This time, they looked at free-throw attempts too, not just field goals. Once again, they found absolutely no evidence of hot hands. Larry Bird was just like Andrew Toney: after he made several free throws in a row, his free-throw percentage actually declined. Bird got complacent and started missing shots he should have made.

Why do we believe in streaky shooters? Our dopamine neurons are to blame. Although these cells are immensely useful—they help us predict events that are actually predictable—they can also lead us astray, especially when we are confronted with randomness. Look, for example, at this elegant little experiment: A rat was put in a T-shaped maze with a few morsels of food placed on either the far right or the far left side of the enclosure. The placement of the food was random, but the dice were rigged: over the long run, the food was placed on the left side 60 percent of the time. How did the rat respond? It quickly realized that the left side was more rewarding. As a result, it always went to the left of the maze, which resulted in a 60 percent success rate. The rat didn't strive for perfection. It didn't search for a unified theory of the T-shaped maze. It just accepted the inherent uncertainty of the reward and learned to settle for the option that usually gave the best outcome.

The experiment was repeated with Yale undergraduates. Unlike the rat, the students, with their elaborate networks of dopamine neurons, stubbornly searched for the elusive pattern that determined the placement of the reward. They made predictions and then tried to learn from their prediction errors. The problem was that there was nothing to predict; the apparent randomness was real. Because the students refused to settle for a 60 percent success rate, they ended up with a 52 percent success rate. Although most of the students were convinced that they were making progress toward identifying the underlying algorithm, they were, in actuality, outsmarted by a rat.
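The gap between 60 percent and 52 percent is not a coincidence; it falls straight out of the arithmetic. Always choosing the better side wins 60 percent of the time, while "probability matching" (guessing left exactly as often as the left side pays off) wins 0.6 × 0.6 + 0.4 × 0.4 = 0.52. A quick sketch of the two strategies (my own illustration):

```python
# The reward appears on the left with probability 0.6, on the right with 0.4.
p_left = 0.6

# The rat's strategy: always pick the side that usually pays.
maximize = p_left                                      # wins 60% of the time

# The students' strategy: guess each side as often as it pays off.
match = p_left * p_left + (1 - p_left) * (1 - p_left)  # 0.36 + 0.16

print(maximize, round(match, 2))   # 0.6 0.52
```

Matching the probabilities can never beat simply taking the better option, yet it is what pattern-hungry brains keep doing.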

The danger of random processes—things like slot machines and basketball shots—is that they take advantage of a defect built into the emotional brain. Dopamine neurons get such a visceral thrill from watching a hot player sink another jumper or from winning a little change from a one-armed bandit or from correctly guessing the placement of a food morsel that our brains completely misinterpret what's actually going on. We trust our feelings and perceive patterns, but the patterns don't actually exist.

Of course, it can be extremely hard to reconcile perceptions of streaks and runs with the statistical realities of an unruly world. When Apple first introduced the shuffle feature on its iPods, the shuffle was truly random; each song was just as likely to be picked as any other. However, the randomness didn't appear random, since some songs were occasionally repeated, and customers concluded that the feature contained some secret patterns and preferences. As a result, Apple was forced to revise the algorithm. "We made it less random to make it feel more random," said Steve Jobs, the CEO of Apple.* Or consider Red Auerbach, the legendary Celtics coach. After being told about Tversky's statistical analysis of the hot hand, he reportedly responded with a blunt dismissal. "So he makes a study," Auerbach said. "I couldn't care less." The coach refused to consider the possibility that the shooting streaks of the players might be a fanciful invention of his brain.

But Auerbach was wrong to disregard the study; the belief in illusory patterns seriously affects the flow of basketball games. If a team member had made several shots in a row, he was more likely to get the ball passed to him. The head coach would call a new set of plays. Most important, a player who thinks he has a hot hand has a distorted sense of his own talent, which leads him to take riskier shots, since he assumes his streak will save him. (It's the old bane of overconfidence.) Of course, the player is also more likely to miss these riskier shots. According to Tversky and Gilovich, the best shooters always think they're cold. When their feelings tell them to take the shots because they've got the hot hands, they don't listen.

THIS DEFECT IN the emotional brain has important consequences. Think about the stock market, which is a classic example of a random system. This means that the past movement of any particular stock cannot be used to predict its future movement. The inherent randomness of the market was first proposed by the economist Eugene Fama in the early 1960s. Fama looked at decades of stock-market data in order to prove that no amount of knowledge or rational analysis could help anyone figure out what would happen next. All of the esoteric tools used by investors to make sense of the market were pure nonsense. Wall Street was like a slot machine.

The danger of the stock market, however, is that sometimes its erratic fluctuations can actually look predictable, at least in the short term. Dopamine neurons are determined to solve the flux, but most of the time there is nothing to solve. And so brain cells flail against the stochasticity, searching for lucrative patterns. Instead of seeing the randomness, we come up with imagined systems and see meaningful trends where there are only meaningless streaks. "People enjoy investing in the market and gambling in a casino for the same reason that they see Snoopy in the clouds," says the neuroscientist Read Montague. "When the brain is exposed to anything random, like a slot machine or the shape of a cloud, it automatically imposes a pattern onto the noise. But that isn't Snoopy, and you haven't found the secret pattern in the stock market."

One of Montague's recent experiments demonstrated how an unrestrained dopamine system can, over time, lead to dangerous stock-market bubbles. The brain is so eager to maximize rewards that it ends up pushing its owner off a cliff. The experiment went like this: Subjects were each given a hundred dollars and some basic information about the "current" state of the stock market. Then the players chose how much of their money to invest and nervously watched as their stock investments either rose or fell in value. The game continued for twenty rounds, and the subjects got to keep their earnings. One interesting twist was that instead of using random simulations of the stock market, Montague relied on distillations of data from history's famous markets. Montague had people "play" the Dow of 1929, the Nasdaq of 1998, the Nikkei of 1986, and the S&P 500 of 1987. This let the scientists monitor the neural responses of investors during what had once been real-life bubbles and crashes.

How did the brain deal with the fluctuations of Wall Street? The scientists immediately discovered a strong neural signal that seemed to be driving many of the investment decisions. This signal emanated from dopamine-rich areas of the brain, such as the ventral caudate, and it was encoding fictive-error learning, or the ability to learn from what-if scenarios. Take, for example, this situation: A player has decided to wager 10 percent of his total portfolio in the market, which is a rather small bet. Then he watches as the market rises dramatically in value. At this point, the fictive-error learning signal starts to appear. While he enjoys his profits, his ungrateful dopamine neurons are fixated on the profits he missed, as the cells compute the difference between the best possible return and the actual return. (This is a modified version of the prediction-error signal discussed earlier.) When there is a big difference between what actually happened and what might have happened—which is experienced as a feeling of regret—the player, Montague found, is more likely to do things differently the next time around. As a result, investors in the experiment adapted their investments to the ebb and flow of the market. When markets were booming, as they were in the Nasdaq bubble of the late 1990s, investors kept increasing their investments. Not to invest was to drown in regret, to bemoan all the money that might have been earned if they'd only made better decisions.
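The fictive-error signal has a simple arithmetic core: the difference between the best gain you could have had in hindsight and the gain you actually got. Here is a hypothetical sketch of that computation for a single round (the function and variable names are mine, not the study's):

```python
def fictive_error(bet_fraction, market_return, bankroll=100.0):
    """Regret signal: best possible gain minus actual gain for one round."""
    actual_gain = bankroll * bet_fraction * market_return
    # The best hindsight play: all-in if the market rose, all-out if it fell.
    best_fraction = 1.0 if market_return > 0 else 0.0
    best_gain = bankroll * best_fraction * market_return
    return best_gain - actual_gain

# Bet 10% of $100 and the market jumps 20%: you made $2 but "missed" $18.
print(fictive_error(0.10, 0.20))   # 18.0
```

In a rising market the only way to drive this regret signal to zero is to bet everything, which is precisely the behavior Montague observed as his subjects inflated the bubble.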

But fictive-error learning isn't always adaptive. Montague argues that these computational signals are also a main cause of financial bubbles. When the market keeps going up, people are led to make larger and larger investments in the boom. Their greedy brains are convinced that they've solved the stock market, and so they don't think about the possibility of losses. But just when investors are most convinced that the bubble isn't a bubble—many of Montague's subjects eventually put all of their money into the booming market—the bubble bursts. The Dow sinks, the Nasdaq implodes, the Nikkei collapses. All of a sudden, the same investors who'd regretted not fully investing in the market and had subsequently invested more were now despairing of their plummeting net worth. "You get the exact opposite effect when the market heads down," Montague says. "People just can't wait to get out, because the brain doesn't want to regret staying in." At this point, the brain realizes that it's made some very expensive prediction errors, and the investor races to dump any assets that are declining in value. That's when you get a financial panic.

The lesson here is that it's silly to try to beat the market with your brain. Dopamine neurons weren't designed to deal with the random oscillations of Wall Street. When you spend lots of money on investment-management fees, or sink your savings into the latest hot mutual fund, or pursue unrealistic growth goals, you are slavishly following your primitive reward circuits. Unfortunately, the same circuits that are so good at tracking juice rewards and radar blips will fail completely in these utterly unpredictable situations. That's why, over the long run, a randomly selected stock portfolio will beat the expensive experts with their fancy computer models. And why the vast majority of mutual funds in any given year will underperform the S&P 500. Even those funds that do manage to beat the market rarely do so for long. Their models work haphazardly; their successes are inconsistent. Since the market is a random walk with an upward slope, the best solution is to pick a low-cost index fund and wait. Patiently. Don't fixate on what might have been or obsess over someone else's profits. The investor who does nothing to his stock portfolio—who doesn't buy or sell a single stock—outperforms the average "active" investor by nearly 10 percent. Wall Street has always searched for the secret algorithm of financial success, but the secret is, there is no secret. The world is more random than we can imagine. That's what our emotions can't understand.

2

Deal or No Deal is one of the most popular television game shows of all time. The show has been broadcast in more than forty-five different countries, from Great Britain to Slovakia to America. The rules of the game couldn't be simpler: a contestant is confronted with twenty-six sealed briefcases, each containing a different amount of cash, from a penny to a million dollars. Without knowing the amount of money in any of the briefcases, the contestant chooses a single one, which is then placed in a lockbox. Its contents won't be revealed until the game is over.

The player then proceeds to open the remaining twenty-five briefcases one at a time. As the various monetary amounts are revealed, the contestant gradually gets an idea of how much money his or her own briefcase might contain, since all the remaining amounts are displayed on a large screen. It's a nerve-racking process of elimination, as each player tries to keep as many of the big monetary sums on the board for as long as possible. Every few rounds, a shadowy figure known as the Banker makes the player an offer for the sealed briefcase. The contestant can either accept the deal and cash out or continue to play, gambling that the unopened briefcase contains more money than the Banker has offered. As the rounds continue, the tension becomes excruciating. Spouses start crying, and children begin screaming. If the wrong briefcase is picked, or the best deal is rejected, a staggering amount of money can evaporate, just like that.

For the most part, Deal or No Deal is a game of dumb luck. Although players develop elaborate superstitions about the briefcases—odd numbers are better; even numbers are better; ones held by blond models are better—the monetary amounts in them are randomly distributed. There is no code to crack, no numerology to decipher. This is just fate unfolding in front of a national television audience.

And yet, Deal or No Deal is also a game of difficult decisions. After the Banker makes an offer, the contestant has a few minutes—usually the length of a commercial break—to make up his mind. He must weigh the prospect of sure money against the chances of winning one of the larger cash prizes. It's almost always a hard call, a moment full of telegenic anxiety.

There are two ways to make this decision. If the contestant had a calculator handy, he could quickly compare the average amount of money he might expect to win against the Banker's offer. For example, if there were three remaining briefcases, one containing $1, one containing $10,000, and one containing $500,000, then the player should, at least in theory, accept any offer over $170,000, since that is the average of the money in all three briefcases. Although offers in the early rounds are generally unfairly low—the producers don't want people to quit before it gets dramatic—as the game goes on, the offers made by the Banker become more and more reasonable, until they are essentially asymptotic with the mathematical average of the money still available. In this sense, it is extremely easy for a contestant on Deal or No Deal to determine whether or not to accept an offer. He just needs to add up all the remaining monetary amounts, divide that number by the number of briefcases left, and see whether the offer on the table exceeds that figure. If Deal or No Deal were played like this, it would be a thoroughly rational game. It would also be extremely boring. It's not fun to watch people do arithmetic.
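The arithmetic in question is a one-line decision rule. A minimal sketch of the perfectly rational contestant (my own illustration, not anything from the show):

```python
def should_deal(remaining_amounts, offer):
    """Accept the Banker's offer if it beats the average of what's left."""
    expected_value = sum(remaining_amounts) / len(remaining_amounts)
    return offer >= expected_value

# Three briefcases left: $1, $10,000, and $500,000 (average ~ $170,000).
print(should_deal([1, 10_000, 500_000], 180_000))   # True
print(should_deal([1, 10_000, 500_000], 150_000))   # False
```

A contestant following this rule would be indifferent to suspense, superstition, and the sting of the previous round, which is exactly what makes the rule so hard for real players to follow.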

The game show is entertaining only because the vast majority of contestants don't make decisions based on the math. Take Nondumiso Sainsbury, a typical Deal or No Deal contestant. She is a pretty young woman from South Africa who met her husband while she was studying in America. She plans on sending her winnings back home to her poor family in Johannesburg, where her three younger brothers live in a shantytown with her mother. It's hard not to root for her to make the right decision.

Nondumiso starts off rather well. After a few rounds, she still has two big amounts—$500,000 and $400,000—left in play. As is usual for this stage of the game, the Banker makes her a blatantly unfair offer. Although the average amount of money left is $185,000, Nondumiso is offered less than half that. The producers clearly want her to keep playing.

After quickly consulting with her husband—"We still might win half a million dollars!" she shouts—Nondumiso wisely rejects the offer. The suspense builds as she prepares to pick her next briefcase. She randomly chooses a number and winces as the briefcase is slowly opened. Every second of tension is artfully mined. Nondumiso's luck has held: the briefcase contains only $300. The Banker now increases his offer to $143,000, or 75 percent of a perfectly fair offer.

After just a few seconds of deliberation, Nondumiso decides to reject the deal. Once again, the pressure builds as a briefcase is opened. The audience collectively gasps. Once again, Nondumiso has gotten lucky: she has managed to avoid eliminating either of the two big remaining sums of money. She now has a 67 percent chance of winning more than $400,000. Of course, she also has a 33 percent chance of winning $100.

For the first time, the Banker's offer is essentially fair: he is willing to "buy" Nondumiso's sealed briefcase for $286,000. As soon as she hears the number, she breaks into a huge smile and starts to cry. Without even pausing to contemplate the math, Nondumiso begins chanting, "Deal! Deal! I want a deal!" Her loved ones swarm the stage. The host tries to ask Nondumiso a few questions, and she struggles to speak through the tears.

In many respects, Nondumiso made an excellent set of decisions. A computer that meticulously analyzed the data couldn't have done much better. But it's important to note how Nondumiso arrived at these decisions. She never took out a calculator or estimated the average amount of money remaining in the briefcases. She never scrutinized her options or contemplated what would happen if she eliminated one of the larger amounts of money. (In that case, the offer probably would have been cut by at least 50 percent.) Instead, her risky choices were entirely impulsive; she trusted her feelings to not lead her astray.

While this instinctive decision-making strategy normally works out just fine—Nondumiso's feelings made her rich—there are certain situations on the game show that reliably fool the emotional brain. In these cases, contestants end up making terrible choices, rejecting deals that they should accept. They lose fortunes because they trust their emotions at the wrong moment.

Look at poor Frank, a contestant on the Dutch version of Deal or No Deal. He gets off to an unlucky start by immediately eliminating some of the most lucrative briefcases. After six rounds, Frank has only one valuable briefcase left, worth five hundred thousand euros. The Banker offers him €102,006, about 75 percent of a perfectly fair offer. Frank decides to reject the deal. He's gambling that the next briefcase he picks won't contain the last big monetary amount, thus driving up the offer from the Banker. So far, his emotions are acting in accordance with the arithmetic. They are holding out for a better deal.

But Frank makes a bad choice, eliminating the one briefcase he wanted to keep in play. He braces himself for the bad news from the Banker, who now offers Frank a deal for €2,508, or about €100,000 less than he was offered thirty seconds before. The irony is that this offer is utterly fair; Frank would be wise to cut his losses and accept the Banker's proposal. But Frank immediately rejects the deal; he doesn't even pause to consider it. After another unlucky round, the Banker takes pity on Frank and makes him an offer that's about 110 percent of the average of the possible prizes. (Tragedy doesn't make good game-show TV, and the producers are often quite generous in such situations.) But Frank doesn't want pity, and he rejects the offer. After eliminating a briefcase containing €1—Frank's luck is finally starting to turn—he is now faced with a final decision. Only two briefcases remain: €10 and €10,000. The Banker offers him €6,500, which is a 30 percent premium over the average of the money remaining. But Frank spurns this final proposal. He decides to open his own briefcase, in the desperate hope that it contains the bigger amount. Frank has bet wrong: it contains only €10. In fewer than three minutes, Frank has lost more than €100,000.

Frank isn't the only contestant to make this type of mistake. An exhaustive analysis by a team of behavioral economists led by Thierry Post concluded that most contestants in Frank's situation act the exact same way. (As the researchers note, Deal or No Deal has "such desirable features that it almost appears to be designed to be an economics experiment rather than a TV show.") After the Banker's offer decreases by a large amount—this is what happened after Frank opened the €500,000 briefcase—a player typically becomes excessively risk-seeking, which means he is much more likely to reject perfectly fair offers. The contestant is so upset by the recent monetary loss that he can't think straight. And so he keeps on opening briefcases, digging himself deeper and deeper into a hole.

These contestants are victims of a very simple flaw rooted in the emotional brain. Alas, this defect isn't limited to greedy game-show contestants, and the same feelings that caused Frank to reject the fair offers can lead even the most rational people to make utterly foolish choices. Consider this scenario:

The United States is preparing for the outbreak of an unusual Asian disease, which is expected to kill six hundred people. Two different programs to combat the disease have been proposed. Assume that the exact scientific estimates of the consequences of the programs are as follows: If program A is adopted, two hundred people will be saved. If program B is adopted, there is a one-third probability that six hundred people will be saved and a two-thirds probability that no people will be saved. Which of the two programs would you favor?

When this question was put to a large sample of physicians, 72 percent chose option A, the safe-and-sure strategy, and only 28 percent chose program B, the risky strategy. In other words, physicians would rather save a certain number of people for sure than risk the possibility that everyone might die. But consider this scenario:

The United States is preparing for the outbreak of an unusual Asian disease, which is expected to kill six hundred people. Two different programs to combat the disease have been proposed. Assume that the exact scientific estimates of the consequences of the programs are as follows: If program C is adopted, four hundred people will die. If program D is adopted, there is a one-third probability that nobody will die and a two-thirds probability that six hundred people will die. Which of the two programs would you favor?

When the scenario was described in terms of deaths instead of survivors, physicians reversed their previous decisions. Only 22 percent voted for option C, while 78 percent chose option D, the risky strategy. Most doctors were now acting just like Frank: they were rejecting a guaranteed gain in order to participate in a questionable gamble.

Of course, this is a ridiculous shift in preference. The two different questions examine identical dilemmas; saving one-third of the population is the same as losing two-thirds. And yet doctors reacted very differently depending on how the question was framed. When the possible outcomes were stated in terms of deaths—this is called the loss frame—physicians were suddenly eager to take chances. They were so determined to avoid any option associated with loss that they were willing to risk losing everything.
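The equivalence of the four programs is easy to verify. A small sketch, using exact fractions to avoid floating-point noise (the variable names are illustrative):

```python
from fractions import Fraction as F

# Expected number of survivors, out of 600, under each program:
saved_A = 200                    # "200 people will be saved"
saved_B = F(1, 3) * 600          # 1/3 chance all 600 are saved
saved_C = 600 - 400              # "400 people will die"
saved_D = 600 - F(2, 3) * 600    # 2/3 chance all 600 die

# All four programs have the same expected outcome.
assert saved_A == saved_B == saved_C == saved_D == 200
```

Only the framing differs: A and C describe the identical certain outcome, and B and D describe the identical gamble.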

This mental defect—its technical name is loss aversion—was first demonstrated in the late 1970s by Daniel Kahneman and Amos Tversky. At the time, they were both psychologists at Hebrew University, best known on campus for talking to each other too loudly in their shared office. But these conversations weren't idle chatter; Kahneman and Tversky (or "kahnemanandtversky," as they were later known) did their best science while talking. Their disarmingly simple experiments—all they did was ask each other hypothetical questions—helped to illuminate many of the brain's hard-wired defects. According to Kahneman and Tversky, when a person is confronted with an uncertain situation—like having to decide whether to accept an offer from the Banker—the individual doesn't carefully evaluate the information, or compute the Bayesian probabilities, or do much thinking at all. Instead, the decision depends on a brief list of emotions, instincts, and mental shortcuts. These shortcuts aren't a faster way of doing the math; they're a way of skipping the math altogether.

Kahneman and Tversky stumbled upon the concept of loss aversion after giving their students a simple survey that asked if they would accept various bets. The psychologists noticed that when a person was offered a gamble on the toss of a coin and was told that losing would cost him twenty dollars, the player demanded, on average, around forty dollars for winning. The pain of a loss was approximately twice as potent as the pleasure generated by a gain. Furthermore, decisions seemed to be determined by these feelings. As Kahneman and Tversky put it, "In human decision making, losses loom larger than gains."
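The "twice as potent" finding can be sketched as a toy value function, in the spirit of Kahneman and Tversky's surveys (the 2.0 weight and the function itself are a simplification, not their exact model):

```python
# Losses weigh roughly twice as much as equivalent gains.
LOSS_WEIGHT = 2.0  # assumed loss-aversion coefficient (~2x, per the surveys)

def felt_value(gain, loss):
    """Subjective value of a 50/50 coin toss: win `gain` or lose `loss`."""
    return 0.5 * gain - 0.5 * LOSS_WEIGHT * loss

# A $40 win is the break-even point for risking a $20 loss:
print(felt_value(40, 20))  # 0.0  -> barely acceptable
print(felt_value(30, 20))  # -5.0 -> rejected, despite a positive expected value
```

Under this toy model, a gamble that risks $20 feels worthless until the potential win reaches $40, even though a $30 win already makes the bet profitable on average.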

Loss aversion is now recognized as a powerful mental habit with widespread implications. The desire to avoid anything that smacks of loss often shapes our behavior, leading us to do foolish things. Look, for example, at the stock market. Economists have long been perplexed by a phenomenon known as the equity premium puzzle. The puzzle itself is easy to explain: over the last century, stocks have outperformed bonds by a surprisingly large margin. Since 1926, the annual return on stocks after inflation has been 6.4 percent, while the return on Treasury bills has been less than 0.5 percent. When the Stanford economists John Shoven and Thomas MaCurdy compared randomly generated financial portfolios composed of either stocks or bonds, they discovered that, over the long term, stock portfolios always generated higher returns than bond portfolios. In fact, stocks typically earned more than seven times as much as bonds. MaCurdy and Shoven concluded that people who invest in bonds must be "confused about the relative safety of different investments over long horizons." In other words, investors are just as irrational as game-show contestants. They, too, have a distorted sense of risk.

Classical economic theory can't explain the equity premium puzzle. After all, if investors are such rational agents, why don't all of them invest in stocks? Why are low-yield bonds so popular? In 1995, the behavioral economists Richard Thaler and Shlomo Benartzi realized that the key to solving the equity premium puzzle was loss aversion. Investors buy bonds because they hate losing money, and bonds are a safe bet. Instead of making financial decisions that reflect all the relevant statistical information, they depend on their emotional instincts and seek the certain safety of bonds. These are well-intentioned instincts—they prevent people from gambling away their retirement savings—but they are also misguided. The fear of losses makes investors more willing to accept a measly rate of return.

Even experts are vulnerable to these irrational feelings. Take Harry Markowitz, a Nobel Prize-winning economist who practically invented the field of investment-portfolio theory. In the early 1950s, while working at the RAND Corporation, Markowitz became intrigued by a practical financial question: how much of his savings should he invest in the stock market? Markowitz derived a complicated mathematical equation that could be used to calculate the optimal mix of assets. He had come up with a rational solution to the old problem of risk versus reward.

But Markowitz couldn't bring himself to use his own equation. When he divided up his investment portfolio, he ignored the investment advice that had won him the Nobel Prize; instead of relying on the math, he fell into the familiar trap of loss aversion and split his portfolio equally between stocks and bonds. Markowitz was so worried about the possibility of losing his savings that he failed to optimize his own retirement account.

Loss aversion also explains one of the most common investing mistakes: investors evaluating their stock portfolios are most likely to sell stocks that have increased in value. Unfortunately, this means that they end up holding on to their depreciating stocks. Over the long term, this strategy is exceedingly foolish, since ultimately it leads to a portfolio composed entirely of shares that are losing money. (A study by Terrance Odean, an economist at UC Berkeley, found that the stocks investors sold outperformed the stocks they didn't sell by 3.4 percent.) Even professional money managers are vulnerable to this bias and tend to hold losing stocks twice as long as winning stocks. Why does an investor do this? Because he is afraid to take a loss—it feels bad—and selling shares that have decreased in value makes the loss tangible. We try to postpone the pain for as long as possible; the result is more losses.

The only people who are immune to this mistake are neurologically impaired patients who can't feel any emotion at all. In most situations, these people have very damaged decision-making abilities. And yet, because they don't feel the extra sting of loss, they are able to avoid the costly emotional errors brought on by loss aversion.

Consider this experiment, led by Antonio Damasio and George Loewenstein. The scientists invented a simple investing game. In each round, the experimental subject had to decide between two options: invest $1 or invest nothing. If the participant decided not to invest, he kept the dollar, and the game advanced to the next round. If the participant decided to invest, he would hand a dollar bill to the experimenter, who would then toss a coin. Heads meant that the participant lost the $1 that was invested; tails meant that $2.50 was added to the participant's account. The game stopped after twenty rounds.

If people were perfectly rational—if they made decisions solely by crunching the numbers—then subjects would always choose to invest, since the expected overall value on each round is higher if one invests ($1.25, or $2.50 multiplied by the 50 percent chance of getting tails on the coin toss) than if one does not ($1). In fact, if a person invests on each and every round, there is a mere 13 percent chance that he'll wind up with less than twenty dollars, which is the amount a player would have if he didn't invest in any of the rounds.
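The 13 percent figure follows from the binomial distribution. If a player invests in all twenty rounds, final wealth is $2.50 per winning toss, so ending below the never-invest payout of $20 requires seven or fewer wins. A quick check:

```python
from math import comb

# Invest $1 in every one of 20 rounds; each win (tails) pays $2.50,
# each loss (heads) pays nothing. Final wealth = 2.5 * (number of wins).
# Wealth falls below $20 only with 7 or fewer wins (8 * 2.5 = 20).
rounds = 20
p_below_20 = sum(comb(rounds, k) for k in range(8)) / 2**rounds

print(f"{p_below_20:.1%}")  # about 13.2%
```

So the all-in strategy not only has the higher expected value ($25 versus $20); it carries only about a 13 percent chance of doing worse than never investing at all.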

So what did the subjects in Damasio's study do? Those with intact emotional brains invested only about 60 percent of the time. Because human beings are wired to dislike potential losses, most people were perfectly content to sacrifice profit for security, just like investors choosing low-yield bonds. Furthermore, the willingness of a person to invest plummeted immediately after he or she had lost a gamble—the pain of losing was too fresh.

These results are entirely predictable; loss aversion makes us irrational when it comes to evaluating risky gambles. But Damasio and Loewenstein didn't stop there. They also played the investing game with neurologically impaired patients who could no longer experience emotion. If it was the feeling of loss aversion that caused these bad investing decisions, then these patients should perform better than their healthy peers.

That's exactly what happened. The emotionless patients chose to invest 83.7 percent of the time and gained significantly more money than normal subjects. They also proved much more resistant to the misleading effects of loss aversion, and they gambled 85.2 percent of the time after a lost coin toss. In other words, losing money made them more likely to invest as they realized that investing was the best way to recoup their losses. In this investing situation, having no emotions was a crucial advantage.

And then there is Deal or No Deal, which turns out to be a case study in loss aversion. Imagine you are Frank. Less than a minute ago, you turned down the Banker's offer of €102,006. But now you've picked the worst possible briefcase, and the offer has declined to €2,508. In other words, you've lost a cool hundred grand. Should you accept the current deal? The first thing your mind does is make a list of the options under consideration. However, instead of evaluating those options in terms of arithmetic—which would be the rational thing to do—you use your emotions as a shortcut to judgment. You simulate the various scenarios and see how each makes you feel. When you imagine accepting the offer of €2,508, you experience a sharply negative emotion, even though it's a perfectly fair offer. The problem is that your emotional brain interprets the offer as a dramatic loss, since it's automatically compared to the much larger amount of money that had been on offer just a few moments earlier. The resulting feeling serves as a signal that accepting the deal is a bad idea; you should reject the offer and open another briefcase. In this situation, loss aversion makes you risk seeking.

But now that you've imagined rejecting the offer, you fixate on the highest monetary amount now possible. This is the potential gain you measure everything against, what economists call the reference point. (For Frank, the potential gain during the final rounds was €10,000. For the physicians being quizzed about that unusual Asian disease, the potential gain was saving all six hundred people.) When you think about this optimistic possibility, you experience, however briefly, a pleasurable feeling. You contemplate the upside of risk and envision a check with lots of zeros. You might not be able to get back the €100,000 offer, but at least you won't leave empty-handed.

The upshot of all this is that you badly miscalculate the risk. You keep on chasing after the possibility of a big gain because you can't accept the prospect of a loss. Your emotions have sabotaged common sense.

Loss aversion is an innate flaw. Everyone who experiences emotion is vulnerable to its effects. It's part of a larger psychological phenomenon known as negativity bias, which means that, for the human mind, bad is stronger than good. This is why in marital interactions, it generally takes at least five kind comments to compensate for one critical comment. As Jonathan Haidt points out in his book The Happiness Hypothesis, people believe that a person who's been convicted of murder must perform at least twenty-five acts of "life-saving heroism" in order to make up for his or her one crime. There's no rational reason for us to treat gains and losses or compliments and criticisms so differently. But we do. The only way to avoid loss aversion is to know about the concept.

3

"The credit card is my enemy," Herman Palmer says. Herman is a very friendly guy, with sympathetic eyes and a wide smile that fills his face, but when he starts to talk about credit cards, his demeanor abruptly darkens. He furrows his brow, lowers his voice, and leans forward in his chair. "Every day, I see lots of smart people who have the same problem: Visa and MasterCard. Their problem is all those plastic cards they've got in their wallet." Then he shakes his head in dismay and lets out a resigned sigh.

Herman is a financial counselor in the Bronx. He has spent the last nine years working for GreenPath, a nonprofit organization that helps people deal with their debt problems. His small office is a minimalist affair, with a desk so clean that it looks as if no one has ever used it. The only thing on the desk is a large glass candy jar, but this jar isn't stuffed with M&M's or jelly beans or miniature candy bars. It's filled with the cut-up shards of hundreds of credit cards. The plastic pieces make for a pretty collage—the iridescent security stickers glitter in the light—but Herman doesn't keep the jar around for aesthetic reasons. "I use it as a kind of shock treatment," he says. "I'll ask a client for their cards and just cut them up right in front of them. And then I just add the cards to the jar. I want people to see that they are not alone, that so many people have the exact same problem." Once the jar in his office is completely filled—and that only takes a few months—Herman empties it into the big glass vase in the waiting room. "That's our flower display," he jokes.

According to Herman, the jar of credit cards captures the essence of his job. "I teach people how not to spend money," he says. "And it's damn near impossible to not spend money if you've still got all these cards, which is why I always cut them up." The first time I visited the GreenPath office was a few weeks after Christmas, and the waiting room was full of anxious-looking people trying to pass the time with old issues of celebrity magazines. Every chair was taken. "January is our busiest time of year," Herman says. "People always overspend during the holidays, but they don't realize how much they've overspent until the credit card bills arrive in the mail. That's when they come see us."

For the most part, Herman's clients are from the neighborhood, a working-class area of row houses that were once single-family dwellings but are now apartment buildings, with numerous buzzers and mailboxes grafted onto the front doors. Many of the homes have fallen into disrepair, with peeling siding and graffiti. There aren't any supermarkets nearby, but there are plenty of bodegas and liquor stores. A little farther down the block, there are two pawnshops and three check-cashing operations. Every few minutes, another number 6 subway train rumbles directly overhead, shrieking to a stop near the GreenPath office. It's the last stop on the line.

Nearly half of Herman's clients are single mothers. Many of these women work full-time but still struggle to pay their bills. Herman estimates that his clients spend, on average, around 40 percent of their income on housing, even though the neighborhood has some of the cheapest real estate in New York City. "It's easy to judge people," Herman says. "It's easy to think, 'I would never have gotten into so much debt,' or to think that just because someone needs financial help, then they must be irresponsible. But a lot of the people I see are just trying to make ends meet. The other day I had a mother come in who just broke my heart. She was working two jobs. Her credit card bill was all daycare charges for her kid. What am I supposed to tell her? That her kid can't go to daycare?"

This ability to help his clients without judging them, to understand what they're going through, is what makes Herman such a good financial counselor. (He has an unusually high success rate, with more than 65 percent of his clients completing their debt-elimination plans.) It would be easy for Herman to play the scold, to chastise his clients for letting their spending get out of control. But he does just the opposite. Instead of lecturing his clients, he listens to them. After Herman destroys their credit cards at the initial meeting—he almost always gets out his scissors within the first five minutes—he will spend the next several hours poring over their bills and bank statements, trying to understand what's gone wrong with their finances. Is their rent too expensive? Are they spending too much money on clothes or cell phones or cable television? "I always tell my clients that they are going to leave my office with a practical plan," Herman says. "And charging it to Mr. MasterCard is not a plan."

When Herman talks about the people who have been helped by his financial advice, his face takes on the glow of a proud parent. There's the plumber from Co-op City who lost his job and started paying rent with his credit card. After a few months, his interest rate was above 30 percent. Herman helped him consolidate his debt and get his expenses under control. There's that single mother who couldn't afford daycare. "We helped her find other ways to save money," he says. "We cut her expenses by enough so that she didn't have to charge everything. The trick is to notice whenever you're spending money. All that little stuff? Guess what: it adds up." There's the schoolteacher who racked up debt on ten different credit cards and paid hundreds of dollars every month in late fees alone. It took five years of careful discipline, but now the teacher is debt free. "I know the client is going to be okay when they start telling me about the sweater or CD they really wanted but they didn't buy," Herman says. "That's when I know they are starting to make better decisions."

Most of the people who come to see Herman tell the same basic story. One day, a person gets a credit card offer in the mail. (Credit card companies sent out 5.3 billion solicitations in 2007, which means the average American adult got fifteen offers.) The card seems like such a good deal. In big bold print it advertises a low introductory rate along with something about getting cash back or frequent-flier miles or free movie tickets. And so the person signs up. He fills out the one-page form and then, a few weeks later, gets a new credit card in the mail. At first, he doesn't use it much. Then one day he forgets to get cash, and so he uses the new credit card to pay for food at the supermarket. Or maybe the refrigerator breaks, and he needs a little help buying a new one. For the first few months, he always manages to pay off the full bill. "Almost nobody gets a credit card and says, 'I'm going to use this to buy things I can't afford,'" Herman says. "But it rarely stays like that for long."

According to Herman, the big problem with credit cards—the reason he enjoys cutting them up so much—is that they cause people to make stupid financial choices. They make it harder to resist temptation, so people spend money they don't have. "I've seen it happen to the most intelligent people," Herman says. "I'll look at their credit card bill and I'll see a charge for fifty dollars at a department store. I'll ask them what they bought. They'll say, 'It was a pair of shoes, Herman, but it was on sale.' Or they'll tell me that they bought another pair of jeans but the jeans were fifty percent off. It was such a good deal that it would have been dumb not to buy it. I always laugh when I hear that one. I then have them add up all the interest they are going to pay on those jeans or that pair of shoes. For a lot of these people, it will be around twenty-five percent a month. And you know what? Then it's not such a good deal anymore."

These people aren't in denial. They know they have serious debt problems and that they're paying a lot of interest on their debts. That's why they're visiting a financial adviser. And yet, they still bought the jeans and the pair of shoes on sale. Herman is all too familiar with the problem: "I always ask people, 'Would you have bought the item if you had to pay cash? If you had to go to an ATM and feel the money in your hands and then hand it over?' Most of the time, they think about it for a minute and then they say no."

Herman's observations capture an important reality about credit cards. Paying with plastic fundamentally changes the way we spend money, altering the calculus of our financial decisions. When you buy something with cash, the purchase involves an actual loss—your wallet is literally lighter. Credit cards, however, make the transaction abstract, so that you don't really feel the downside of spending money. Brain-imaging experiments suggest that paying with credit cards actually reduces activity in the insula, a brain region associated with negative feelings. As George Loewenstein, a neuroeconomist at Carnegie Mellon, says, "The nature of credit cards ensures that your brain is anesthetized against the pain of payment." Spending money doesn't feel bad, so you spend more money.

Consider this experiment: Drazen Prelec and Duncan Simester, two business professors at MIT, organized a real-life, sealed-bid auction for tickets to a Boston Celtics game. Half the participants in the auction were informed that they had to pay with cash; the other half were told they had to pay with credit cards. Prelec and Simester then averaged the bids for the two different groups. Lo and behold, the average credit card bid was twice as high as the average cash bid. When people used their Visas and MasterCards, their bids were much more reckless. They no longer felt the need to contain their expenses, and so they spent way beyond their means.

This is what's happened to the American consumer over the past few decades. The statistics are bleak: the average household currently owes more than nine thousand dollars in credit card debt, and the average number of credit cards per person is 8.5. (More than 115 million Americans carry month-to-month balances on their credit cards.) In 2006, consumers spent more than seventeen billion dollars in penalty fees alone on their credit cards. Since 2002, Americans have had a negative savings rate, which means that we've spent more than we've earned. The Federal Reserve recently concluded that this negative savings rate was largely a consequence of credit card debt. We spend so much money on interest payments that we can't save for retirement.

At first glance, this behavior makes no sense. Given the exorbitant interest rates charged by most credit card companies—rates of 25 percent or more are common—a rational consumer would accumulate credit card debt only as a last resort. Paying interest is expensive. And yet, credit card debt is as American as apple pie. "The people who have credit card debt are the same people who drive an extra mile to save two cents on a gallon of gas," Herman says. "They are the same people who clip coupons and comparison shop. Many of these people are normally very good with their money. But then they bring me their credit card bill and they say, 'I don't know what happened. I don't know how I spent all this money.'"

The problem with credit cards is that they take advantage of a dangerous flaw built into the brain. This failing is rooted in our emotions, which tend to overvalue immediate gains (like a new pair of shoes) at the cost of future expenses (high interest rates). Our feelings are thrilled by the prospect of an immediate reward, but they can't really grapple with the long-term fiscal consequences of that decision. The emotional brain just doesn't understand things like interest rates or debt payments or finance charges. As a result, areas like the insula don't react to transactions involving a Visa or MasterCard. Because our impulsivity encounters little resistance, we swipe our cards and buy whatever we want. We'll figure out how to pay for it later.

This sort of shortsighted decision-making isn't dangerous only for people with too many credit cards in their wallets. In recent years, Herman has seen a new financial scourge in the neighborhood: subprime mortgages. "I still remember the first subprime mortgage I dealt with," Herman says. "I remember thinking, 'This is a really bad deal. These people just bought a house that's way too expensive for them, and they don't even know it yet.' And that's when I knew that I'd be seeing a lot of these loans in the future."

The most common type of subprime mortgage that Herman deals with is the 2/28 loan, which comes with a low, fixed-interest rate for the first two years and a much higher, adjustable rate for the next twenty-eight. In other words, the loan works a lot like a credit card: it lets people get homes for virtually nothing up front, then hits the borrowers with high-interest payments at some point in the distant future. By the time the housing market went bust in the summer of 2007, subprime loans like the 2/28 accounted for almost 20 percent of all mortgages. (The percentage in poorer neighborhoods, such as the Bronx, was much higher, with more than 60 percent of all mortgages falling into the subprime category.)

Unfortunately, the loan comes with a steep cost. The structure of the loan ensures that subprime borrowers are five times more likely to default than other borrowers. Once the rates start to rise—and they always do—many people can no longer afford the monthly mortgage payments. By the end of 2007, a whopping 93 percent of completed foreclosures involved adjustable-rate loans that had recently been adjusted.

"When I help people with a mortgage," Herman says, "I never ask them about the home. Because then they just start talking about how pretty it is and how the extra room will be so great for their kids. That's just temptation talking. I make sure we stick to the numbers and that we especially focus on their interest payments in the future, after the rates are adjusted." While 2/28 loans tempt consumers with low initial payments, that temptation turns out to be extremely expensive. In fact, subprime loans even proved tempting for people with credit scores that qualified them for conventional loans that had far better financial terms. During the peak of the housing boom, 55 percent of all 2/28 mortgages were sold to homeowners who could have gotten prime mortgages. Although prime mortgages would have saved them lots of money over the long term, these people just couldn't resist the allure of those low initial payments. Their feelings tricked them into making foolish financial decisions.
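The payment shock built into a 2/28 loan is easy to illustrate with the standard amortization formula. The numbers below are hypothetical (a $200,000 loan, a 5 percent teaser rate resetting to 11 percent), and the sketch simplifies by applying each rate to the full 30-year term rather than re-amortizing the remaining balance:

```python
def monthly_payment(principal, annual_rate, years=30):
    """Standard fixed-rate amortization: principal * r / (1 - (1 + r)^-n)."""
    r = annual_rate / 12          # monthly interest rate
    n = years * 12                # total number of monthly payments
    return principal * r / (1 - (1 + r) ** -n)

teaser = monthly_payment(200_000, 0.05)    # assumed low rate, first 2 years
adjusted = monthly_payment(200_000, 0.11)  # assumed rate after the reset

print(round(teaser, 2), round(adjusted, 2))
```

Even under these simplified assumptions, the monthly payment nearly doubles at the reset, which is why so many borrowers who could comfortably afford the teaser payments defaulted once the adjustable rate kicked in.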

The pervasive reach of credit cards and subprime loans reveals our species' irrationality. Even when people are committed to long-term goals, such as saving for retirement, they are led astray by momentary temptations. Our impulsive emotions make us buy what we can't afford. As Plato might have put it, the horses are pulling the charioteer against his will.

Understanding the circuitry of temptation is one of the practical ambitions of scientists studying decision-making. Jonathan Cohen, a neuroscientist at Princeton University, has made some important progress. He's begun to diagnose the specific brain regions responsible for the attraction to credit cards and subprime loans. One of his recent experiments involved putting a subject in an fMRI machine and making him decide between a small Amazon gift certificate that he could have right away and a slightly larger gift certificate that he'd receive in two to four weeks. Cohen discovered that these two options activated very different neural systems. When a subject contemplated a gift certificate in the future, brain areas associated with rational planning, such as the prefrontal cortex, were more active. These cortical regions urge a person to be patient, to wait a few extra weeks for the bigger gain.

However, when a subject started thinking about getting the gift certificate right away, the brain areas associated with emotion—such as the midbrain dopamine system and nucleus accumbens—were turned on. These are the cells that tell a person to take out a mortgage he can't afford, or run up credit card debt when he should be saving for retirement. All these cells want is a reward, and they want it now.

By manipulating the amount of money on offer in each situation, Cohen and his collaborators could watch this neural tug of war unfold. They saw the fierce argument between reason and feeling as the mind was pulled in contradictory directions. The ultimate decision—whether to save for the future or to indulge in the present—was determined by whichever region showed greater activation. The people who couldn't wait for the bigger gift certificates—and most people couldn't wait—were led astray by their feelings. More emotions meant more impulsivity. (This also helps explain why men who are shown revealing pictures of attractive women, what scientists refer to as "reproductively salient stimuli," become even more impulsive: the photos activate their emotional circuits.) However, subjects who chose to wait and receive the larger Amazon gift certificates later showed increased activity in their prefrontal cortices; they did the math and selected the "rational" option.

This discovery has important implications. For starters, it locates the neural source for many financial errors. When self-control breaks down, and we opt for the rewards we can't afford, it's because the rational brain has lost the neural tug of war. David Laibson, an economist at Harvard and coauthor of the paper on the monetary-reward experiment, notes: "Our emotional brain wants to max out the credit card, order dessert, and smoke a cigarette. When it sees something it wants, it has difficulty waiting to get it." Corporations have learned to take advantage of this limbic impatience. Consider the teaser rates offered in credit card solicitations. In order to entice new customers, lenders typically advertise their low introductory charges. These alluring offers expire within a few short months, leaving customers stuck with lots of debt on credit cards with high interest rates. The bad news is that the emotional brain is routinely duped by these tempting (but financially foolish) advertisements. "I always tell people to read only the fine print," Herman says. "The bigger the print, the less it matters."

Unfortunately, most people don't follow Herman's advice. Lawrence Ausubel, an economist at the University of Maryland, analyzed the responses of consumers to two different credit card promotions used by actual credit card companies. The first card offered a six-month teaser rate of 4.9 percent that was followed by a lifetime at 16 percent. The second card had a slightly higher teaser rate—6.9 percent—but a significantly lower lifetime rate (14 percent). If consumers were rational, they would always choose the card with the lower lifetime rate, since that's the rate that would apply to most of their debts. Of course, this isn't what happens. Ausubel found that the credit offer with the 4.9 percent teaser rate was chosen by consumers almost three times more often than the other. Over the long term, this impatience leads to significantly higher interest payments.
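Ausubel's point can be checked with simple arithmetic. The sketch below assumes a hypothetical constant $1,000 revolving balance with no compounding or repayment (a deliberate simplification, not a real APR model) and totals the interest each card charges over a given horizon:

```python
def total_interest(balance, teaser_rate, teaser_months, lifetime_rate, horizon_months):
    """Simple interest on a constant revolving balance -- no compounding,
    no repayment. An illustrative simplification, not a real APR model."""
    teaser = min(teaser_months, horizon_months)
    later = max(horizon_months - teaser_months, 0)
    return balance * (teaser_rate / 12 * teaser + lifetime_rate / 12 * later)

balance = 1000.0  # hypothetical constant balance
for months in (6, 12, 24):
    card_a = total_interest(balance, 0.049, 6, 0.16, months)  # 4.9% teaser, 16% after
    card_b = total_interest(balance, 0.069, 6, 0.14, months)  # 6.9% teaser, 14% after
    print(f"{months:2d} months: card A ${card_a:.2f}, card B ${card_b:.2f}")
```

Under these assumptions the 4.9 percent card is cheaper only during the teaser window; the two cards break even at about month twelve, and beyond that the "worse-looking" 6.9 percent card costs less every month the debt is carried.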

When people opt for bad credit cards, or choose 2/28 mortgages, or fail to put money in their 401(k)s, they are acting just like the experimental subjects who chose the wrong Amazon gift certificate. Because the emotional parts of the brain reliably undervalue the future—life is short and we want pleasure now—we all end up spending too much money today and delaying saving until tomorrow (and tomorrow and tomorrow). George Loewenstein, the neuroeconomist, thinks that understanding the errors of the emotional brain will help policymakers develop plans that encourage people to make better decisions: "Our emotions are like software programs that evolved to solve important and recurring problems in our distant past," he says. "They are not always well suited to the decisions we make in modern life. It's important to know how our emotions lead us astray so that we can find ways to compensate for these flaws."

Some economists are already working on that. They are using this brain-imaging data to support a new political philosophy known as asymmetric paternalism. That's a fancy name for a simple idea: creating policies and incentives that help people triumph over their irrational impulses and make better, more prudent decisions. Shlomo Benartzi and Richard Thaler, for example, designed a 401(k) that takes our irrationality into account. Their plan, called Save More Tomorrow, neatly sidesteps the limbic system. Instead of asking people if they want to start saving right away—which is the standard pitch for a 401(k)—companies in the Save More Tomorrow program ask their employees if they want to opt into savings plans that begin in a few months. Since this proposal allows people to make decisions about the future without contemplating possible losses in the present, it bypasses their impulsive emotional brains. (This is roughly equivalent to asking a person if he wants a ten-dollar Amazon gift certificate in one year or an eleven-dollar gift certificate in one year and one week. In this case, virtually everyone chooses the rational option, which is the larger amount.) Trial studies of this program show it's a resounding success: after three years, the average savings rate has gone from 3.5 percent to 13.6 percent.

Herman is content with an even simpler solution. "My first piece of advice is always the same," he says. "Cut up the damn cards. Or put them in a block of ice in the freezer. Learn to pay with cash." Herman knows from experience that unless people get rid of their credit cards, they won't be able to stay on fiscally sound spending plans: "I've seen people who have more debt than you can believe, and they'll still make irresponsible shopping decisions if they can charge it." It's not easy for the brain to choose a long-term gain over an immediate reward—such a decision takes cognitive effort—which is why getting rid of anything that makes the choice harder (such as credit cards) is so important. "Everybody knows about temptation," Herman says. "Everybody wants that new pair of shoes and the big house. But sometimes you have to say no to yourself." He tries to quote a famous song by the Rolling Stones, but he can't quite remember the lyrics. The message of the chorus is simple: you can't always get what you want, but sometimes not getting what you want is just what you need.