The Folly of Fools: The Logic of Deceit and Self-Deception in Human Life - Robert Trivers (2011)

Chapter 1. The Evolutionary Logic of Self-Deception


In the early 1970s, I busied myself trying to construct social theory based on natural selection. I wanted to understand the evolution of our basic social relationships—parent/offspring, male/female, relative/friend, in-group member/out-group member, whatever. Natural selection, in turn, was the key to understanding evolution, and the only theory that answered the question: what is a trait designed to achieve? Natural selection refers to the fact that in every species, some individuals leave more surviving offspring than do others, so that the genetic traits of the reproductively successful tend to become more frequent over time. Since this process knits together genes associated with high reproductive success (RS = number of surviving offspring), all living creatures are expected to be organized accordingly, that is, to attempt to maximize personal RS. Because the replicating units are actually genes, this also means that our genes are expected to promote their own propagation.
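The core logic, that traits associated with higher RS become more frequent over generations, can be illustrated with a minimal simulation. This is only a sketch, assuming a haploid population, a single variant with a 10 percent survival advantage, and illustrative numbers throughout:

```python
import random

def selection_generations(p0=0.05, fitness_a=1.1, fitness_b=1.0,
                          pop_size=10_000, generations=100, seed=1):
    """Track the frequency of variant A, whose carriers leave 10 percent
    more surviving offspring (higher RS) than carriers of variant B."""
    random.seed(seed)
    p = p0
    for _ in range(generations):
        # Expected frequency after selection, then binomial sampling,
        # which stands in for the chance element in who actually survives.
        w_bar = p * fitness_a + (1 - p) * fitness_b
        expected = p * fitness_a / w_bar
        p = sum(random.random() < expected for _ in range(pop_size)) / pop_size
    return p
```

Run over a hundred generations, the advantageous variant rises from 5 percent toward fixation; with equal fitnesses it merely drifts near its starting frequency.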

When applied to social behavior, natural selection predicts a mixture of conflicting emotions and behavior. Contrary to widespread beliefs of the time (and even sometimes now), parent/offspring relations are not expected to be free of conflict, not even in the womb. At the same time, reciprocal relations are easily exploited by cheaters, that is, non-reciprocators, so that a sense of fairness may naturally evolve to regulate such relations in a protective manner. Finally, a coherent and unbiased theory for the evolution of sex differences can be built on the concept of relative parental investment—how much time and effort each parent puts into creating the offspring—as well as an understanding of selection acting on their relative numbers (the sex ratio). This work gives us a deeper view of the meaning of being a male or a female.

The general system of logic worked perfectly well for most subjects I encountered, but one problem stood out. At the heart of our mental lives, there seemed to be a striking contradiction—we seek out information and then act to destroy it. On the one hand, our sense organs have evolved to give us a marvelously detailed and accurate view of the outside world—we see the world in color and 3-D, in motion, texture, nonrandomness, embedded patterns, and a great variety of other features. Likewise for hearing and smell. Together our sensory systems are organized to give us a detailed and accurate view of reality, exactly as we would expect if truth about the outside world helps us to navigate it more effectively. But once this information arrives in our brains, it is often distorted and biased to our conscious minds. We deny the truth to ourselves. We project onto others traits that are in fact true of ourselves—and then attack them! We repress painful memories, create completely false ones, rationalize immoral behavior, act repeatedly to boost positive self-opinion, and show a suite of ego-defense mechanisms. Why?

Surely these biases are expected to have negative effects on our biological welfare. Why degrade and destroy the truth? Why alter information after arrival so as to reach a conscious falsehood? Why should natural selection have favored our marvelous organs of perception, on the one hand, only to have us systematically distort the information gathered, on the other? In short, why practice self-deception?

During a brainstorm on parent-offspring conflict in 1972, it occurred to me that deception of others might provide exactly the force to drive deception of self. The key moment occurred when I realized that parent-offspring conflict extended beyond how much parental investment is delivered to the behavior of the offspring itself. Once I saw conflict over the offspring’s personality, it was easy to imagine parental deceit and self-deception molding offspring identity for parental benefit. Likewise, one could imagine parents not just practicing self-deception but also imposing it—that is, inducing it in the offspring—to the offspring’s detriment but to parental advantage. After all, the parent is in the position of advantage—larger, stronger, in control of the resources at issue, and more practiced in the arts of self-deception.

Applied more broadly, the general argument is that we deceive ourselves the better to deceive others. To fool others, we may be tempted to reorganize information internally in all sorts of improbable ways and to do so largely unconsciously. From the simple premise that the primary function of self-deception is offensive—measured as the ability to fool others—we can build up a theory and science of self-deception.

In our own species, deceit and self-deception are two sides of the same coin. If by deception we mean only consciously propagated deception—outright lies—then we miss the much larger category of unconscious deception, including active self-deception. On the other hand, if we look at self-deception and fail to see its roots in deceiving others, we miss its major function. We may be tempted to rationalize self-deception as being defensive in purpose when actually it is usually offensive. Here we will treat deceit and self-deception as a unitary subject, each feeding into the other.


In this book we take an evolutionary approach to the topic. What is the biological advantage to the practitioner of self-deception, where advantage is measured as positive effects on survival and reproduction? How does self-deception help us survive and reproduce—or, slightly more accurately, how does it help our genes survive and reproduce? Put differently, how does natural selection favor mechanisms of self-deception? We shall see that we have a large set of such mechanisms and that they may have important costs. Where is the benefit? How do such mechanisms increase individual reproductive and genetic success?

Although the biological approach defines “advantage” in terms of survival and reproduction, the psychological approach often defines “advantage” as feeling better, or being happier. Self-deception occurs because we all want to feel good, and self-deception can help us do so. There is some truth to this, as we shall see, but not much. The main biological objection is this: Even if being happier is associated with higher survival and reproduction, as expected, why should we use such a dubious—and potentially costly—mechanism as self-deception to regulate our happiness? Lying to ourselves has costs. We are basing conscious activity on falsehoods, and in many situations this can turn around and bite us, as we shall see many, many times in this book. Whether during airplane crashes, the planning of stupid offensive wars, personal romantic disasters, family disputes, whatever, we shall see time and again that self-deception brings with it the expected costs of being alienated from reality, although, alas, there is a tendency for other people to suffer disproportionately the costs of our self-deception, while the benefits, such as they are, go to ourselves. So how does self-deception pay for itself biologically? How does it actually improve survival and reproduction?

The central claim of this book is that self-deception evolves in the service of deception—the better to fool others. Sometimes it also benefits deception by saving on cognitive load during the act, and at times it also provides an easy defense against accusations of deception (namely, I was unconscious of my actions). In the first case, the self-deceived fails to give off the cues that go with consciously mediated deception, thus escaping detection. In the second, the actual process of deception is rendered cognitively less expensive by keeping part of the truth in the unconscious. That is, the brain can act more efficiently when it is unaware of the ongoing contradiction. And in the third case, the deception, when detected, is more easily defended against—that is, rationalized—to others as being unconsciously propagated. In some cases, self-deception may give a direct personal advantage by at least temporarily elevating the organism into a more productive state, but most of the time such elevation occurs without self-deception.

In short, this book will attempt to describe a science of self-deception that is actually built on preexisting science—in this case, biology. The book will showcase what seem to be some of the most important features of the subject. The field is in its infancy, and surely many mistakes will be made here, but if the underlying logic is sound, and is linked by evidence and logic to the rest of biology, then corrections should come very quickly and we may rapidly grow a mature science that this book seeks only to outline.

The dynamics of deception and its detection have been studied in a broad range of other species (see Chapter 2), with the advantage that we can see things in others that we can’t easily see in ourselves. This enterprise also greatly extends our range of evidence and leads to a few general principles of some considerable value. Deceiver and deceived are trapped in a coevolutionary struggle that continually improves adaptations on both sides. One such adaptation is intelligence itself. The evidence is clear and overwhelming that both the detection of deception and often its propagation have been major forces favoring the evolution of intelligence. It is perhaps ironic that dishonesty has often been the file against which intellectual tools for truth have been sharpened.

Regarding underlying mechanisms, some interesting work in neurophysiology shows that the conscious mind is more of an observer after the fact, while behavior itself is usually unconsciously initiated (see Chapter 3). Knocking out activity in deception-related areas of the brain improves the quality of deception, while suppression of memory can be achieved consciously by inhibiting brain activity in relevant areas. The classic experiment demonstrating human self-deception shows that we often unconsciously recognize our own voices while consciously failing to do so, and this tendency can be manipulated. An important concept is that of imposed self-deception, in which we act out self-deceptions others have imposed on us. The possibility that self-deception evolves as a purely defensive device to make us feel better is addressed and rejected, with some latitude for self-deception that directly benefits self (without fooling others). The placebo effect provides an interesting example.

Our logic also applies with special force to family and sexual interactions (see Chapters 4 and 5), each involving both conflict and cooperation over reproduction, life’s key aim. Family interactions can select for a divided self, in which our maternal half is in conflict with the paternal half, leading to a kind of “selves deception” between the two halves. Sexual relations are likewise fraught with conflict—and deceit and self-deception—from courtship to long-term life partnerships.

And there is an intimate association between our immune system and our psyches, such that self-deception is often associated with major immune effects, all of which must be calculated if we are to understand the full biological effects of our mental lives (see Chapter 6). There is a whole world of social psychology that shows how our minds bias information, from initial avoidance, to false encoding, memory, and logic, to incorrect statements to others—from one end to the other (see Chapter 7). Key mechanisms include denial, projection, and perpetual efforts to reduce cognitive dissonance.

The analysis of self-deception illuminates daily life, whether the evidence is embedded in personal experience or unconscious and uncovered only through careful study (see Chapter 8). One example from everyday life that has an entire chapter devoted to it is airplane and spacecraft crashes—they permit the cost of self-deception to be studied intensively under almost controlled conditions (see Chapter 9).

Self-deception is intimately tied to false historical narratives, lies we tell ourselves about our past, usually in the service of self-forgiveness and aggrandizement (see Chapter 10). Self-deception plays a large role in the launching of misguided wars (see Chapter 11) and has important interactions with religion, which acts as both an antidote to self-deception and an accelerant (see Chapter 12). We are hardly surprised to note that nonreligious systems of thought—from biology to economics to psychology—are affected by self-deception according to the rule that the more social a discipline, the more its development is retarded by self-deception (see Chapter 13). Finally, as individuals, we can choose whether to fight our own self-deceptions or to indulge them. I choose to oppose my own—with very limited success so far (see Chapter 14).


Deception is a very deep feature of life. It occurs at all levels—from gene to cell to individual to group—and it seems to be achieved by any and all means available. Deception tends to hide from view and is difficult to study, with self-deception being even worse, hiding itself more deeply in our own unconscious minds. Sometimes the subject must be ferreted out before it can be inspected, and we often lack key pieces of evidence, given the complexity of the subterfuges and our ignorance of the internal physiological mechanisms of self-deception.

When I say that deception occurs at all levels of life, I mean that viruses practice it, as do bacteria, plants, insects, and a wide range of other animals. It is everywhere. Even within our genomes, deception flourishes as selfish genetic elements use deceptive molecular techniques to over-reproduce at the expense of other genes. Deception infects all the fundamental relationships in life: parasite and host, predator and prey, plant and animal, male and female, neighbor and neighbor, parent and offspring, and even the relationship of an organism to itself.

Viruses and bacteria often actively deceive to gain entry into their hosts, for example, by mimicking body parts so as not to be recognized as foreign. Or, as in HIV, by changing coat proteins so often as to make mounting an enduring defense almost impossible. Predators gain from being invisible to their prey or resembling items attractive to them—for example, a fish that dangles a part of itself like a worm to attract other fish, which it eats—while prey gain from being invisible to their predators or mimicking items noxious to the predator, for example, poisonous species or a species that preys on its own predator.

Deception within species is expected in almost all relationships, and deception possesses special powers. It always takes the lead in life, while detection of deception plays catch-up. As has been said regarding rumors, the lie is halfway around the world before the truth puts its boots on. When a new deception shows up in nature, it starts rare in a world that often lacks a proper defense. As it increases in frequency, it selects for such defenses in the victim, so that eventually its spread will be halted by the appearance and spread of countermoves, but new defenses can always be bypassed and new tricks invented.
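The arms-race dynamic sketched above, a trick spreading while rare and then being checked as countermoves spread, can be caricatured in a toy replicator model. All payoff numbers here are illustrative assumptions, not measurements:

```python
def coevolve(steps=200, deceiver=0.01, defended=0.01, rate=0.1):
    """Toy coevolution: deceivers gain against naive victims and fail
    against defended ones; the defense carries a fixed cost that pays
    off only once deceivers are common."""
    history = []
    for _ in range(steps):
        # Payoff to deceiving falls as defended victims become common.
        deceiver_gain = (1 - defended) * 1.0 - defended * 0.5
        # Defense pays only when deceivers exceed its fixed cost.
        defense_gain = deceiver * 1.0 - 0.1
        # Standard replicator updates keep both frequencies in (0, 1).
        deceiver += rate * deceiver * (1 - deceiver) * deceiver_gain
        defended += rate * defended * (1 - defended) * defense_gain
        history.append((deceiver, defended))
    return history
```

In this cartoon, deception takes the lead: its frequency climbs steeply while defenses are still too rare to pay for themselves, and only later do defenses spread and erode the deceivers' advantage.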

Truth—or, at least, truth detection—has been pushed back steadily over time by the propagation of deception. It always amazes me to hear some economists say that the costs of deceptive excesses in our economy (including white-collar crime) will naturally be checked by market forces. Why should the human species be immune to the general rule that where natural selection for deception is strong, deception can be selected that extracts a substantial net cost (in survival and reproduction) every generation? Certainly there is no collective force against this deception, only the relatively slow generation and evolution of counterstrategies. These lines were written in 2006, two years before the financial collapse that resulted from such practices and beliefs. I know nothing about economics and—from evolutionary logic—could not have predicted a thing about the collapse of 2008, but I have disagreed for thirty years with an alleged science called economics that has resolutely failed to ground itself in underlying knowledge, at a cost to all of us (see Chapter 13).

As for the notion that deception is naturally constrained to be of modest general cost, consider the case of stick insects (or Phasmatodea), a group that has given itself over to imitating either sticks (three thousand species) or leaves (thirty species). These forms have existed for at least fifty million years and achieve a remarkably precise resemblance to their models. In those forms resembling sticks, there is apparently strong evolutionary pressure to produce a long, thin (sticklike) body, even if doing so forces the individual to forgo the benefits of bilateral symmetry. Thus, to fit the internal organs into a diminishing space, one of each pair of organs has often been sacrificed, leaving only one kidney, one ovary, one testis, and so on. This shows that selection for successful deception has been powerful enough not only to remold the creature’s external shape but to remold its internal organs as well—even when this is otherwise disadvantageous to the larger creature, as loss of symmetry must often be. Likewise, as we shall see in the next chapter, selection can evolve a male fish that lives its entire adult life pretending to be a female and hooks up with territory-holding males in order to steal paternity of eggs laid in their territories by real females.


What exactly is self-deception? Some philosophers have imagined that self-deception is a contradiction in terms, impossible at the outset. How can the self deceive the self? Does that not require that the self knows what it does not know (simultaneously holding p and not-p)? This contradiction is easily sidestepped by defining the self as the conscious mind, so that self-deception occurs when the conscious mind is kept in the dark. True and false information may be simultaneously stored, only with the truth stored in the unconscious mind and falsehood in the conscious. Sometimes this involves activities of the conscious mind itself, such as active memory suppression, but usually the processes themselves are unconscious yet act to bias what we are conscious of. Most animals also have a conscious mind (not usually self-conscious), in the sense of a light being turned on (when awake) that allows integrated ongoing concentration on the outside world via their sense organs.

So the key to defining self-deception is that true information is preferentially excluded from consciousness and, if held at all, is held in varying degrees of unconsciousness. If the mind acts quickly enough, no version of the truth need be stored. The counterintuitive fact that needs to be explained is that the false information is put into the conscious mind. What is the point of this? One would think that if we had to store true and false versions of the same event simultaneously, we would store the true version in the conscious mind, the better to enjoy the benefits of consciousness (whatever they may be), while the false information would be kept safely out of sight somewhere in the basement. The hypothesis of this book is that this entire counterintuitive arrangement exists for the benefit of manipulating others. We hide reality from our conscious minds the better to hide it from onlookers. We may or may not store a copy of that information in self, but we certainly act to exclude it from others.


If the main function of self-deception is to make deception more difficult to detect, we are naturally led to ask how humans detect consciously propagated deception. What cues do we use to do so? When interactions are anonymous or infrequent, behavioral cues cannot be read against a background of known behavior, so more general attributes of lying must be used. Three have been emphasized:

Nervousness: Because of the negative consequences of being detected, including aggression from others, and possibly because of guilt, people are expected to be more nervous when lying.

Control: In response to concern over appearing nervous (or concentrating too hard), people may exert control, trying to suppress behavior, with possible detectable side effects such as overacting, overcontrol, a planned and rehearsed impression, or displacement activities. More to the point, tensing ourselves up almost inevitably increases the pitch of our voices. When asked to invent a painful reaction or suppress a real one, for example in response to cold, children and adults are more successful at suppressing than at inventing—when inventing, they tend to overact.

Cognitive load: Lying can be cognitively demanding. You must suppress the truth and construct a falsehood that is plausible on its face and does not contradict anything known by the listener, nor likely to be known. You must tell it in a convincing way and you must remember the story. This usually takes time and concentration, both of which may give off secondary cues and reduce performance on simultaneous tasks.

Cognitive load often appears to be the critical variable among the three, with a minor role for control and very little for nervousness. At least, this seems to be true in real criminal investigations as well as experimental situations designed to mimic them. Absent well-rehearsed lies, people who are lying have to think too hard, and this causes several effects, some of which are opposite to those of nervousness.

Consider, for example, blinking. When nervous, we blink our eyes more often, but we blink less under increasing cognitive load (for example, while solving arithmetic problems). Recent studies of deception suggest that we blink less when deceiving—that is, cognitive load rules. Nervousness makes us fidget more, but cognitive load has the opposite effect. Again, contra usual expectation, people often fidget less in deceptive situations. And consistent with cognitive load effects, men use fewer hand gestures while deceiving and both sexes often employ longer pauses when speaking deceptively. An absurd example of the latter occurred the other day on my property in Jamaica when I questioned a young man just arriving on a motorcycle, intent (in my opinion) on either extorting money or robbing me. What was his name, I wanted to know. “Steve,” he said. “And what is your last name?” Pause. “It is not supposed to take a long time to remember your own last name.” Quick as you can say “Jones,” he said, “Jones.” So it was “Steve Jones”—not an entirely unlikely pair of names in Jamaica—but less believable on its face than his actual name, which turned out to be Omar Clarke. The point is that cognitive load gave him away at once. The most recent work shows that there is by no means always a delay prior to lying. It depends on the kind of lie. Denial is apt to be quicker than the truth, and so are well-rehearsed lies.

Efforts at controlling oneself can also reveal deception. A nice example is pitch of voice. Deceivers tend to have higher-pitched voices. This is a very general finding and is a natural consequence of stress or of any effort to suppress behavior by becoming more rigid. Tensing up the body inevitably tends to raise the pitch of voice, and this tensing will naturally increase the closer the liar comes to the key word. For example, someone denying a sexual relationship with “Sherri” may hear her own voice shoot up upon mention of the key person’s name: “You think I am there with SHERri.” Well, I had been leaning toward that theory, but now I had a fresh piece of evidence.

Another effect of suppression is the production of displacement activities. As classically described in other animals, these are irrelevant activities often seen when two opposing motivations are simultaneously aroused. Since neither impulse can express itself, the blocked energy easily activates irrelevant behavior, such as a twitch. For this reason, displacement activities in primates reliably indicate stress. For example, I once tried to slip a minor lie by a female friend at a bar and saw my left arm twitch involuntarily. Since we had by then been dating for some time, her eyes shot at once to the twitching arm. A few months later, the situation happened again, only with the roles reversed. If this had been a tennis match, the referee would have said on each occasion, “Advantage, your opponent.”

Nervousness is almost universally cited as a factor associated with deception, both by those trying to detect it and by those trying to avoid detection, yet surprisingly it is one of the weaker predictors of deception in scientific work. This is partly because many experiments attach no ill consequences to having one’s deception detected, so they fail to make people nervous. But in real-life situations (for example, criminal investigations), being suspected of lying can make you nervous whether or not you are lying. Perhaps more important, because we are conscious of our nervousness as a giveaway, mechanisms for suppressing it may be almost as well developed as the nervousness itself, especially in those experienced in lying. And as we saw earlier, the effects of the cognitive load involved in lying are often opposite to those of nervousness.

The point about cognitive load (and pitch of voice) is that there is no escape. If suppressing your nervousness increases pitch of voice, then trying to suppress that effect may only increase pitch further. If it is cognitively expensive to lie, there is no obvious way to reduce the expense, other than to increase unconscious control. Mechanisms of denial and repression may serve to reduce immediate expense, but with ramifying costs later on.

Separately, it is worth pointing out that cognitive load has important effects across a broad range of psychological processes, according to the rule that the greater the cognitive load, the more likely the unconscious processes will be revealed. For example, under cognitive load, people will more often blurt out something they are trying to suppress and will more often express biased opinions they are otherwise hiding. In short, cognitive load does more than slow down your responses—in a whole host of ways, it tends to reveal unconscious processes, which predominate whenever conscious control is reduced by the load.

Verbal details of lies can also be revealing. Excellent work, aided by computer analysis, has demonstrated several common verbal features of lies. We cut down on the use of “I” and “me” and increase other pronouns, as if disowning our lie. We cut down on qualifiers, such as “although.” This streamlines the lie, lowering both our immediate cognitive load and later need to remember. A truth teller might say, “Although it was raining, I still walked to the office”; a liar would say, “I walked to the office.” Negative terms are more common, perhaps because of guilt or because lies more frequently involve denial and negation.

It is difficult to measure the frequency with which lies are detected in everyday life. Interviews with people in the United States show that they believe their lies are detected 20 percent of the time and suspect that another 20 percent may have been detected. Of course, the 60 percent of lies they feel are successful may include some detections in which the detector hides his or her knowledge of the deception.


How biologically deep is the subject we are discussing? Many people imagine that self-deception is, almost by definition, a human phenomenon, the “self” suggesting the presence of language. But there is no reason to suppose that self-deception does not run far deeper in evolutionary history, since it does not require words. Consider self-confidence, a personal variable that others can measure. It can be inflated to deceive them, with self-deception making the act more plausible. This feature probably extends far back in our animal past.

In nature, two animals square off in a physical conflict. Each is assessing its opponent’s self-confidence along with its own—variables expected to predict the outcome some of the time. Biased information flow within the individual can facilitate false self-confidence. Those who believe their self-enhancement are probably more likely to get their opponent to back down than those who know they are only posing. Thus, nonverbal self-deception can be selected in aggressive and competitive situations, the better to fool antagonists. Much the same could be said for male/female courtship. A male’s false self-confidence may give him a boost some of the time. A biased mental representation can be produced, by assumption, without language. Note, of course, that self-deception tends to work only with plausible limits to self-inflation.
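The logic of such contests can be caricatured in a toy model; every number in it is an assumption, chosen only to show why believing one's own bluff could pay. A conscious bluffer risks leaking a detectable "tell," while a self-deceived bluffer gives off none:

```python
import random

def contest_win_rates(n=100_000, boost=0.3, tell_detect=0.6, seed=0):
    """Each contestant displays a confidence level against a random rival;
    the higher display wins the backdown. Bluffers add a boost to their
    true strength, but conscious bluffers are caught (and lose) with
    probability tell_detect; self-deceived bluffers leak no tell."""
    random.seed(seed)
    rates = {}
    for kind in ("honest", "conscious_bluffer", "self_deceived"):
        wins = 0
        for _ in range(n):
            strength = random.random()
            display = strength + (0 if kind == "honest" else boost)
            caught = (kind == "conscious_bluffer"
                      and random.random() < tell_detect)
            if not caught and display > random.random():
                wins += 1
        rates[kind] = wins / n
    return rates
```

Under these made-up numbers the self-deceived bluffer outperforms both the honest displayer and the conscious bluffer, who is undone by the tell. This is nothing more than a cartoon of the selection pressure described above, not a claim about real animals.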

The above is meant to demonstrate that in at least two widespread contexts—aggressive conflict and courtship—selection for deception may easily favor self-deception even when no language is involved. There are undoubtedly many other such contexts, for example, parent/offspring. On top of that, as we shall see, very clever recent work demonstrates in monkeys forms of self-deception that are well-known in humans: a consistency bias, for example, as well as implicit in-group favoritism, both being shown by the same kinds of experiments that reveal them in humans. As we shall see, men are more prone to overconfidence than are women, just as expected, and in rational situations such as stock trading, where fooling others is rarely involved, men do correspondingly worse.

Self-confidence is an internal variable and thus especially prone to deception. I can inflate my apparent size by muscling up, but this is fairly obvious to observers, and increasing my apparent symmetry, another important variable, is very difficult to achieve. But pretending to be more confident than I am is more easily achieved and more strongly selects for self-deception, especially when self-confidence may be as important as apparent size in predicting aggressive outcomes. Thus, I believe that overconfidence is one of the oldest and most dangerous forms of self-deception—both in our personal lives and in global decisions, such as going to war.

On the other hand, language certainly greatly expanded the opportunities for deceit and self-deception in our own lineage. If one great virtue of language is its ability to make true statements about events distant in space and time, then surely one of its social drawbacks is its ability to make false statements about events distant in space and time. These are so much less easily contradicted than statements about the immediate world. Once you have language, you have an explicit theory of self and of social relationships ready to communicate to others. Every new true assertion is matched by an even greater number of possible false ones.

A very disturbing feature of overconfidence is that it often appears to be poorly associated with knowledge—that is, the more ignorant the individual, the more confident he or she may be. This is true of the public when asked questions of general knowledge. Sometimes this phenomenon varies with age and status, so that senior physicians, for example, are both more likely to be wrong and more confident they are right, a potentially lethal combination, especially among surgeons. Another case with tragic consequences concerns eyewitness testimony—witnesses who are mistaken in their identifications are often more confident that they are right, and this confidence in turn sways jurors. It may be that a rational approach to the world is nuanced and gray, capable of accommodating contradictions, all of which leads to hesitancy and a lack of certainty. An easy shortcut is to combine ignorance with straight-out endorsement of it—no signs of rational inquiry but, more important, no signs of self-doubt or contradiction.


We begin with simple cases of self-inflation and derogation of others. We consider the effects of in-group feelings, a sense of power, and the illusion of control. Then we imagine false social theories, false internal narratives, and unconscious modules as additional sources of self-deception.

Self-Inflation Is the Rule in Life

Animal self-inflation routinely occurs in aggressive situations (size, confidence, color) as well as in courtship (same variables). Self-inflation is also the dominant style in human psychological life, adaptive self-diminution appearing in both animals and humans as an occasional strategy (see Chapter 8). Much of this self-inflation is performed in the service of what one psychologist aptly called “beneffectance”—appearing to be both beneficial and effective to others. Subtle linguistic features may easily be involved. When describing a positive group effect, we adopt an active voice, but when the effect is negative, we unconsciously shift to a passive voice: this happened and then that happened and then costs rained down on all of us. Perhaps a classic in the genre was the man in San Francisco in 1977 who ran his car into a pole and claimed afterward, as recorded by the police: “The telephone pole was approaching. I was attempting to swerve out of the way, when it struck my front end.” Perfectly legitimate, but it shifts the blame to the telephone pole. And self-bias extends in every direction. If you question BMW owners on why they own that brand of car, they will tell you it had nothing to do with trying to influence others but will see others as owning one for exactly that reason.

Self-inflation results in people routinely putting themselves in the top half of positive distributions and the lower half of negative ones. Of US high school students, 80 percent place themselves in the top half of students in leadership ability. This is not possible. But for self-deception, you can hardly beat academics. In one survey, 94 percent placed themselves in the top half of their profession. I plead guilty. I could be tied down to a bed in a back ward of some hospital and still believe I am outperforming half my colleagues—and this is not just a comment on my colleagues.
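The impossibility is a matter of simple arithmetic: with distinct scores, exactly half of any group falls in its top half, whatever the underlying distribution of ability. A minimal sketch of that point (illustrative only; the 80 percent figure is the survey result quoted above):

```python
import random

# With distinct scores, exactly half of any group ranks in its top half,
# no matter how ability is distributed. Self-reports say otherwise.
random.seed(0)
n = 10_000
scores = random.sample(range(1_000_000), n)  # n distinct "ability" scores
cutoff = sorted(scores)[n // 2]              # boundary of the top half
top_half = sum(s >= cutoff for s in scores) / n
print(f"fraction actually in the top half: {top_half:.2f}")   # 0.50
print("fraction claiming the top half (survey): 0.80")        # the self-inflation gap
```

The gap between 0.50 and 0.80 is the bias itself, independent of how good the students (or academics) actually are.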

When we say we are in the top 70 percent of people for good looks, this may be only our mouths talking. What about our deeper view? A recent methodology gives a striking result. With the help of a computer, individual photos were morphed either 20 percent toward attractive faces (the average of fifteen faces regarded as attractive out of a sample of sixty) or 20 percent toward unattractive ones (people with cranial-facial syndrome, which produces a twisted face). Among other effects, when a person tries to quickly locate his or her real face, the 20 percent positive face, or the 20 percent negative one, each embedded in a background of eleven faces of other people, he or she is quickest to spot the positive face (1.86 seconds), slower for the real face (2.08 seconds), and slower still for the ugly one (2.16 seconds). The beauty is that there has not been the usual verbal filter—what do you think of yourself?—only a measure of speed of perception. When people are shown a full array of photos of themselves, from 50 percent more attractive to 50 percent less attractive, they choose the 20 percent better-looking photo as the one they like the most and think they most resemble. This is an important, general result: self-deception is bounded—30 percent better looking is implausible, while 10 percent better fails to gain the full advantage.

I hardly need the above result to convince myself, because if I am in a big city, I experience the effect almost every week. I am walking down the street with a younger, attractive woman, trying to amuse her enough that she will permit me to remain nearby. Then I see an old man on the other side of her, white hair, ugly, face falling apart, walking poorly, indeed shambling, yet keeping perfect pace with us—he is, in fact, my reflection in the store windows we are passing. Real me is seen as ugly me by self-deceived me.

Is the tendency toward self-inflation really universal in humans? Some cultures, such as in Japan and China, often value modesty, so that if anything, people might be expected to compete to show lack of self-inflation. Certainly in some domains modesty rules, but in general it seems that one can still detect tendencies toward self-inflation, including self over other in terms of good and bad. Likewise, as in other cultures, inflation often applies to friends, who are seen as better than average (though less strongly than self in some cultures and more so in others).

By the way, recent work has located an area of the brain where this kind of self-inflation may occur. Prior work has shown that a region called the medial prefrontal cortex (MPFC) seems often to be involved in processing self-related information. Even false sensations of self are recorded there, and the region is broadly involved in deceiving others. One can suppress neural activity in this region (by applying a magnetic force to the skull where the brain activity takes place), deleting an individual’s tendencies toward self-enhancement (while suppression in other regions has no effect).

An extreme form of self-adulation is found among so-called narcissists. Though people in general overrate themselves on positive dimensions, narcissists think of themselves as special and unique, entitled to more positive outcomes in life than others. Their self-image is strong on dominance and power (but not on caring or morality). Thus, they seem especially oriented toward high status and will seek out people of perceived status apparently for this reason. Though people in general are overconfident regarding the truth of their assertions, narcissists are especially so. Because they are overconfident, narcissists in the laboratory are more likely to accept bets based on false knowledge and hence lose more money than are less narcissistic people. They are persistent in their delusions as well. They predict high performance in advance, guess they have done well after the fact when they have not, and continue to predict high future performance despite learning about past failure—a virtuoso performance indeed. Calling someone a narcissist is not a compliment—it suggests someone whose system of self-enhancement is out of control, to the individual’s disadvantage.

Derogation of Others Is Closely Linked

In one sense, derogation of others is the mirror image of self-inflation; either way, you look relatively better. But there is an important difference. For self-inflation, you need merely change the image of yourself to achieve the desired effect, but for derogation of others, you may need to derogate an entire group. Exactly when would we expect this to be advantageous to you? Perhaps especially when your own image has been lowered—suddenly it becomes valuable to deflect attention onto some disliked group—so that by comparison, you do not look as bad as they do.

This is precisely what social psychology appears to show—derogation of others appears more often as a defensive strategy that people adopt when threatened. Contrast two sets of college students who have been told (at random) that they scored high or low on an IQ test. Only those scoring low later choose to denigrate a Jewish woman (but not a non-Jewish woman) on a variety of traits. Apparently association with intellectual achievement is sufficient reason to denigrate the woman if one’s own intellectual powers are in doubt. Likewise, the same “low scorers” (as they are told they are) are more likely to complete “duh” and “dan” as “dumb” and “dangerous” when subliminally primed with a black face. So suppose there is some evidence (in fact fictitious) that I am stupid. I apparently lash out by denigrating members of allegedly intelligent groups (against which there may be other biases) while calling attention to negative stereotypes of allegedly less gifted ones. Incidentally, the derogation does make me feel better afterward, as measured by an interview, so the act may fool me as well.

As we shall see later (Chapter 11), derogation of others—including racial, ethnic, and class prejudices—can be especially dangerous when contemplating hostile activity, such as warfare.

In-Group/Out-Group Associations Among Most Prominent

Few distinctions trigger quicker or stronger psychological responses in our species than in-group and out-group—almost as much as, if not sometimes more than, for self and other. Just as you are on average better than others, so is your group—just as others are worse, so are out-groups. Such groups, in and out, are pathetically easy to form. You need not stoke Sunni or Catholic fundamentalism to get people to feel the right way; just make some wear blue shirts and others red and within a half-hour you will induce in-group and out-group feelings based on shirt color.

Once we define an individual as belonging to an out-group, a series of mental operations is induced that, often quite unconsciously, serves to degrade our image of the person, compared with an in-group member. The words “us” and “them” have strong unconscious effects on our thinking. Even nonsense syllables (such as “yaf,” “laj,” and “wuhz”), when paired with “us,” “we,” and “ours,” are preferred over similar syllables paired with “they,” “them,” and “theirs.” And these mechanisms can be primed to apply to artificial groups, experimentally created—those with different-colored shirts, for example. We easily generalize bad traits in an out-group member while reserving generalization for good traits performed by an in-group member. For example, if an out-group member steps on my toes, I am more likely to say, “He is an inconsiderate person,” though with an in-group member I will describe the behavior exactly: “He stepped on my toes.” In contrast, an out-group member acting nicely is described specifically—“she gave me directions to the train station”—while an in-group member is described as being “a helpful person.” Similar mental operations serve to derogate others compared to self. Even minor positive social traits, such as a smile, are imputed unconsciously more often to in-group members than to out-group ones.

This bias begins early in life, among infants and young children. They divide others into groups based on ethnicity, attractiveness, native language, and sex. By age three, they prefer to play with in-group members and also begin to display explicit negative verbal attitudes toward out-group members. They also share with adults a strong tendency to prefer groups to which they have been randomly assigned, to believe that their own group is superior to others, and to begin to treat out-group members in a harmful fashion.

Recent work shows a similar mental architecture in monkeys regarding in-groups and out-groups. When a test is performed on a monkey in which it responds visually to matched facial pictures of in-group and out-group members, corrected for degree of experience with them, there is a clear tendency to view the out-group member longer—a measure of concern and hostility. Likewise, a monkey will attach an out-group orientation to an object an out-group member is looking at and vice versa for an in-group member. Finally, male monkeys (but not female) more readily associate out-group members with pictures of spiders but in-group members with pictures of fruits. The beauty of this work is that the monkeys were migrating in and out of different groups at various times, so one could control exactly for degree of familiarity. In-group members, for example, tend to be more familiar, but independent of familiarity, they are preferred over out-group members. That males more readily associate out-group members with negative stimuli, and in-group with positive, is consistent with work on humans in which men typically are relatively more prejudiced against out-group than in-group members.

The Biases of Power

It has been said that power tends to corrupt and absolute power, absolutely. This usually refers to the fact that power permits the execution of ever more selfish strategies toward which one is then “corrupted.” But psychologists have shown that power corrupts our mental processes almost at once. When a feeling of power is induced in people, they are less likely to take others’ viewpoint and more likely to center their thinking on themselves. The result is a reduced ability to comprehend how others see, think, and feel. Power, among other things, induces blindness toward others.

The basic methodology is to induce a temporary state of mind via a so-called prime, which can be conscious or unconscious and as short as a word or considerably more detailed, as in this case. The power prime consists of asking some people to write for five minutes about a situation in which they felt powerful, supplemented by having the subjects apportion candy among a group, while the low-power prime group writes about the opposite situation and is allowed only to say the amount of candy they hope to receive.

This modest prime produced the following striking results. When the subjects were asked to snap their right-hand fingers five times in succession and quickly write the letter E on their foreheads, an unconscious bias was revealed. Those who had been primed to feel powerless were three times as likely to write the E so that others could read it, compared to those primed to feel powerful. This effect was equally strong for the two sexes. The basic shift in focus from other to self with power was confirmed in additional work. When compared with those with a neutral prime, those with the power prime were less able to discriminate among common human facial expressions associated with fear, anger, sadness, and happiness. Again, the sexes responded similarly to the power prime, but in general women are better at making the emotional discriminations, and men are more likely to be overconfident. In short, powerful men suffer multiple deficits in their ability to apprehend the world of others correctly, due to their power and their sex. And since, at the national level, it is powerful men who usually decide for war, they have an in-built bias in the wrong direction, less oriented toward others, less inclined to value their viewpoint, with, alas, often tragic effects all the way around (see Chapter 11).

There must be a thousand examples of power inducing male blindness, but why not look at Winston Churchill? He experienced highs and lows in his life that were often nearly absolute. One moment he was the prime minister of the UK during World War II—one of the most powerful prime ministers ever—and the next he was an ex–prime minister with almost no political power at all. Similar reverses were associated with World War I. At the heights of his power, he was described as dictatorial, arrogant, and intolerant, the stuff of which tyrants are made; at low power, he was seen as introspective and humble.

Moral Superiority

Few variables are as important in our lives as our perceived moral status. Even more than attractiveness and competence, degree of morality is a variable of considerable importance in determining our value to others—thus it is easily subject to deceit and self-deception. Moral hypocrisy is a deep part of our nature: the tendency to judge others more harshly for the same moral infraction than we judge ourselves—or to do so for members of other groups compared to members of our own group. For example, I am very forgiving where my own actions are concerned. I will forgive myself in a heartbeat—and toss in some compassionate humor in the bargain—for a crime that I would roast anybody else for.

Social psychologists have shown these effects with an interesting twist. When a person is placed under cognitive load (by having to memorize a string of numbers while making a moral evaluation), the individual does not express the usual bias toward self. But when the same evaluation is made absent cognitive load, a strong bias emerges in favor of seeing oneself acting more fairly than another individual doing the identical action. This suggests that built deeply in us is a mechanism that tries to make universally just evaluations, but that after the fact, “higher” faculties paint the matter in our favor. Why might it be advantageous for our psyches to be organized this way? The possession of an unbiased internal observer ought to give benefits in policing our own behavior, since only if we recognize our behavior correctly can we decide who is at fault in conflict with others.

The Illusion of Control

Humans (and many other animals) need predictability and control. Experiments show that occasionally administering electrical shocks at random creates much more anxiety (profuse sweating, high heart rate) than regular and predictable punishment. Certainty of risk is easier to bear than uncertainty. Controlling events gives greater certainty. If you can control, to some degree, your frequency of being shocked, you feel better than if you have less control over less frequent shocks. Similar effects are well known for other animals, such as rats and pigeons.

But there is also something called an illusion of control, in which we believe we have greater ability to affect outcomes than we actually do. For the stock market, we have no ability to affect its outcome by any of our actions, so any notion that we do must be an illusion. This was measured directly on actual stockbrokers. Scientists set up a computer screen with a line moving across it more or less like the stock market average, up and down—jagged—initially with a general bias downward but then recovering to go into positive territory, all while a subject sits in front of the screen, able to press a computer mouse, and told that pressing it “may” affect the progress of the line, up or down. In fact, the mouse is not connected to anything. Afterward, people are asked how much they thought they controlled the line’s movement, a measure of their “illusion of control.”
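Because the mouse is wired to nothing, the subject's clicks and the line's movement are statistically independent by construction. A hypothetical simulation of such a setup (not the researchers' actual code) makes the point:

```python
import random

# Hypothetical sketch of the disconnected-mouse experiment: the "market"
# line is a random walk of up/down ticks, the subject's clicks are
# recorded, and since the mouse controls nothing, clicks and movement
# are unrelated by construction.
random.seed(1)
steps = 1_000
line = [random.choice([-1, 1]) for _ in range(steps)]    # up/down ticks
clicks = [random.choice([0, 1]) for _ in range(steps)]   # pressed or not

# Correlation between clicking and the tick: near zero by construction.
mean_l = sum(line) / steps
mean_c = sum(clicks) / steps
cov = sum((l - mean_l) * (c - mean_c) for l, c in zip(line, clicks)) / steps
var_l = sum((l - mean_l) ** 2 for l in line) / steps
var_c = sum((c - mean_c) ** 2 for c in clicks) / steps
corr = cov / (var_l * var_c) ** 0.5
print(f"click/movement correlation: {corr:.3f}")  # close to zero
```

Any control a subject reports above zero is therefore pure illusion, which is what makes the reported number a clean measure of the bias.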

A very interesting finding emerged when those taking the tests were stockbrokers (105 men and 2 women) whose firms provided data both on internal evaluation and on salaries paid. In both cases, those with a higher illusion of control did worse. They were evaluated by their superiors as being less productive and, more important, they earned less money. Cause and effect is not certain, of course. But if the direction of effect were such that poor performers responded to their own failure by asserting greater control over external events, then they would be blaming themselves more for failure than success, contrary to the well-documented human bias to rationalize away one’s failures. The alternative scenario then seems much more likely—that imagining one has greater control over events than one actually has leads to poorer performance: being a worse stockbroker. Note the absence of a social dimension here. One has no control over the movement of markets and scarcely much knowledge. There seems little possibility to fool your superiors along these lines when they can measure your success easily and directly. In other situations, though, such an illusion may well give some social benefits—or even individual ones, as in prompting greater effort toward achieving actual control.

It is interesting to note that lacking control increases something called illusory pattern recognition. That is, when individuals are induced to feel a lack of control, they tend to see meaningful patterns in random data, as if responding to their unfortunate lack of control by generating (false) coherence in data that would then give them greater control.

The Construction of Biased Social Theory

We all have social theories, that is, theories regarding our immediate social reality. We have a theory of our marriages. Husband and wife may agree, for example, that one party is a long-suffering altruist while the other is hopelessly selfish, but disagree over which is which. We each have a theory regarding our employment. Are we an exploited worker, underpaid and underappreciated for value given—and therefore fully justified in minimizing output while stealing everything that is not nailed down? We usually have a theory regarding our larger society as well. Are the wealthy unfairly increasing their share of resources at the expense of the rest of us (as has surely been happening) or are the wealthy living under an onerous system of taxation and regulation? Does democracy permit us to reassert our power at regular intervals or is it largely a sham exercise controlled by wealthy interests? Is the judicial system regularly biased against our kinds of people (African Americans, the poor, individuals versus corporations)? And so on. The capacity for these kinds of theories presumably evolved not only to help understand the world and to detect cheating and unfairness but also to persuade self and others of false reality, the better to benefit ourselves.

The unconscious importance of biased social theory is revealed most vividly perhaps when an argument breaks out. Human arguments feel so effortless because by the time the arguing starts, the work has already been done. The argument may appear to burst forth spontaneously, with little or no preview, yet as it rolls along, two whole landscapes of information lie already organized, waiting only for the lightning of anger to reveal them. These landscapes have been organized with the help of unconscious forces designed to create biased social theory and, when needed, biased evidence to support them.

Social theory inevitably embraces a complex set of facts, which may be only partially remembered and poorly organized, the better to construct a consistent, self-serving body of social theory. Contradictions may be far afield and difficult to detect. When Republicans in the US House of Representatives bemoaned what the Founding Fathers would have thought had they known a future president (Clinton) would have sex with an intern, the black American comedian Chris Rock replied that they were having sex not with their interns but with their slaves. This of course is an important function of humor—to expose and deflate hidden deceit and self-deception (see Chapter 8).

False Personal Narratives

We continually create false personal narratives. By enhancing ourselves and derogating others, we automatically create biased histories. We were more moral, more attractive, more “beneffective” to others than in fact we were. Recent evidence suggests that forty- to sixty-year-olds naturally push memories of negative moral actions roughly ten years deeper into their past than memories of positive ones. Likewise, there is a similar but not so pronounced bias regarding nonmoral actions that are positive or negative. An older self acted badly; a recent self acted better. I am conscious of this in my own life. When saying something personal, whether negative or positive, I displace it farther in the past, as if I am not revealing anything personal about my current self, but this is especially prominent for negative information—it was a former self acting that way.

When people are asked to supply autobiographical accounts of being angered (victim) or angering someone else (perpetrator), a series of sharp differences emerges. The perpetrator usually describes angering someone else as meaningful and comprehensible, while victims tend to depict such an event as arbitrary, unnecessary, or incomprehensible. Victims often provide a long-term narrative, especially one emphasizing continuing harm and grievance, while perpetrators describe an arbitrary, isolated event with no lasting implications. One effect of this asymmetry between victim and perpetrator is that when the victim suppresses anger at a provocation, only to respond after an accumulation of slights, the perpetrator sees only the final, precipitating event and easily views the victim’s angry response as an unwarranted overreaction.

There is also something called false internal narratives. An individual’s perception of his or her own ongoing motivation may be biased to conceal from others the true motivation. Consciously, a series of reasons may unfold to accompany actions so that when they are challenged, a convincing alternative explanation is at once available, complete with an internal scenario—“but I wasn’t thinking that at all; I was thinking . . . ”

Unconscious Modules Devoted to Deception

Over the years, I have discovered that I am an unconscious petty thief. I steal small objects from you while in your presence. I steal pens and pencils, lighters and matches, and other useful objects that are easy to pocket. I am completely unconscious of this while it is going on (as are you, most of the time) even though I have been doing it for more than forty years now. Perhaps because the trait is so unconscious, it appears to have a life of its own and often seems to act directly against my own narrow interests. I steal chalk from myself while lecturing and am left with no chalk with which to lecture (nor do I have a blackboard at home). I steal pens and pencils from my office, only to offload them at home—leaving me none the next day at the office—and so on. Recently I stole a Jamaican principal’s entire set of school keys from the desk between us. No use to me, high cost to him.

In summary, there appears to be a little unconscious module in me devoted to petty thievery, sufficiently isolated to avoid interfering with ongoing activity (such as talking). I think of a little organism in me looking out for the matches, the ideal moment to seize them, the rhythm of the actual robbery, and so on. Of course, this organism will study the behavior of my victim but it will also devote time to my own behavior, in order best to integrate the thievery while not giving off any clues. Noteworthy features of this little module in my own life are that the behavior has changed little over my lifetime, and that increasing consciousness of the behavior after the fact has done little or nothing to increase consciousness prior to, during, or immediately after the behavior. The module also appears to misfire more often the older I get. Incidentally, the only time I can remember getting caught is by my brother, born a year after me—we were raised as twins. We each had an ability to read deception in the other that others in the family could not match. Once when we were both in our late forties, I began to pocket his pen, but he grabbed my hand halfway to my pocket and the pen was his again.

I think I never pilfer from someone’s office when it is empty. I will see a choice pen and my hand moving toward it but will say, “Robert, that would be stealing,” and stop. Perhaps if I steal from you in front of your face, I believe you have given implicit approval. When I stole the principal’s keys, I was simultaneously handing him some minor repayment for a service performed and thinking I might be paying too much. Perhaps I said to myself, “Well this is for you, so this must be for me,” and he went along with the show.

How many of these unconscious modules operate in our lives? The only way I know about this one is that my pockets fill up with contraband, and I get occasional questions from friends. Stealing ideas will not leave much evidence and is very common in academia. I once wrote a paper that borrowed heavily from a well-known book, a fact I had forgotten by the time I finished the paper. Only when I reread my copy of the book did I see where the ideas had come from—these sections were heavily underlined, with many marginal notations.

It also seems certain that unconscious ploys to manipulate others in specific ways must be common. Specialized parts of ourselves look out for special opportunities in others. The value of this is precisely that two or more activities can go on simultaneously, with little or no interference. If an independent unconscious module studies for opportunities to steal or lie, it need not interfere (except slightly) with other, ongoing mental activities. We really have no idea how common this kind of activity may be.


In summary, the hallmark of self-deception in the service of deceit is the denial of deception, the unconscious running of selfish and deceitful ploys, the creation of a public persona as an altruist and a person “beneffective” in the lives of others, the creation of self-serving social theories and biased internal narratives of ongoing behavior, as well as false historical narratives of past behavior that hide true intention and causality. The symptom is a biased system of information flow, with the conscious mind devoted (in part) to constructing a false image and at the same time unaware of contravening behavior and evidence.

Of course, it must usually be advantageous for the truth to be registered somewhere, so that mechanisms of self-deception are expected often to reside side-by-side with mechanisms for the correct apprehension of reality. The mind must be constructed in a very complex manner, repeatedly split into public and private portions, with complicated interactions between them.

The general cost of self-deception is the misapprehension of reality, especially social, and an inefficient, fragmented mental system. As we shall learn, there are also important immune costs to self-deception, and there is something called imposed self-deception, in which an organism works unconsciously to further the interests of the organism inducing the self-deception, imposing costs on all sides: the worst of all possible worlds. At the same time, as we shall also see in Chapter 3, there is sufficient slack in the system for people to sometimes deceive themselves for direct advantage (even immunological). Before we turn to that, we will review the subject of deception in nature. There is an enormous literature on this subject and a few principles of genuine importance.