The End of Average: How We Succeed in a World That Values Sameness - Todd Rose (2016)



Are you an extrovert or an introvert? This deceptively simple question plunges us into one of the oldest and most contentious debates in psychology: the nature of your personality. On one side of the debate are the trait psychologists, who argue that our behavior is determined by well-defined personality traits, such as introversion and extroversion. These psychologists date their scientific origins all the way back to Francis Galton, who argued that human temperament and character were “durable realities, and persistent factors of our conduct.”1

Situation psychologists, on the other hand, claim the environment drives our personality far more than personal traits. They believe that culture and immediate circumstances determine how we behave, arguing, for example, that violent movies are likely to make people aggressive, regardless of innate tendencies.2 The situation psychologists trace their origins to an equally impressive founder, Adolphe Quetelet, who famously claimed that “society prepares the crime and the guilty are only the instrument by which it is executed.”3

The famous obedience study by Yale psychologist Stanley Milgram is a classic situationist experiment.4 In this study participants were told to deliver electrical shocks (ranging from 15V to a potentially fatal 450V) to a person located in another room every time the individual gave an incorrect answer to a question. Unbeknownst to the participants, the people in the other room were actors and were not actually receiving electrical shocks. Milgram wanted to know: How far could people be pushed to harm another person if given orders by an authority figure? The results were alarming: 65 percent of people administered the full 450V shock even when the person in the other room begged them not to, cited heart problems, or simply stopped responding.5 According to situationists, the results of this study proved that a strong situation influences the behavior of most people, even compelling them to acts of cruelty.

Throughout the twentieth century, the trait and situation theorists battled it out in the halls and laboratories of academia, but by the 1980s the trait psychologists emerged as the undisputed victors.6 While the situation psychologists were able to predict, on average, how most people would behave in a situation, they could never predict how any particular individual was going to behave. They could, for instance, predict that a majority of people would deliver an electrical shock to an innocent stranger if commanded to by an authority figure, but not whether Mary Smith from Cincinnati was more likely to do so than Abigail Jones from Tallahassee.

In contrast, the trait theorists did a better job of predicting the behavior of any given individual—on average, at least. They also produced something far more useful to business: personality tests. Today, twenty-five hundred different kinds of trait-based assessments are administered each year to employees.7 For example, eighty-nine of the Fortune 100 companies, thousands of universities, and hundreds of government agencies use the Myers-Briggs Type Indicator (MBTI) test, which assesses four dimensions of personality and categorizes people into sixteen distinct types.8 Other employers, meanwhile, rely on the Enneagram personality test to evaluate applicants, a test that assigns people one of nine numbered personality types (“Type 8,” for instance, is a “Challenger”).9 These and other tests are all part of a half-billion-dollar industry that is dedicated exclusively to measuring and classifying our personality traits.

But perhaps the greatest reason for the success of trait theory is that it seems to jibe with our private sense of ourselves—and others. When confronted with the Myers-Briggs, for instance, we tend to instinctively map our personality onto its structure, quickly deciding if we are introverts or extroverts, thinkers or feelers, judging or perceiving types. Similarly, if asked to describe the personality of our best friend—or our worst enemy—we would most likely offer a list of their prominent traits. They are helpful, optimistic, and impulsive, we might conclude, or perhaps aggressive, cynical, and selfish. Similarly, if I asked you to point out a few introverted colleagues, I suspect you would have little difficulty providing names.

Tests that score us on a set of traits are popular because they satisfy our deep-seated conviction that we can get to the heart of a person’s “true” identity by knowing those traits that define the essence of that person’s personality. We tend to believe that, deep down in the bedrock of a person’s soul, someone is essentially wired to be friendly or unfriendly, lazy or industrious, introverted or extroverted, and that these defining characteristics will shine through no matter what the circumstances or task. This belief is known as essentialist thinking.10

Essentialist thinking is both a consequence and a cause of typing: if we know someone’s personality traits, we believe we can classify them as a particular type. And if we know someone belongs to a particular type, we believe we can form conclusions about their personality and behavior. That’s what happened to me in seventh grade, after I started a spitball fight in English class. The incident (rightly) earned me a trip to the school counselor’s office, and since it was not my first visit, I was required to complete an aggression questionnaire that determined I ranked near the 70th percentile. My parents were called to the school where the counselor informed them that, in his opinion, I was an “aggressive child” and patiently laid out the evidence: my spitballs, a fight earlier in the year, and the most damning evidence of all—the questionnaire results.

The counselor believed that aggressiveness was something essential about my character, a defining feature of who I was, and he understandably presumed this knowledge allowed him to make predictions about me. He recommended that I see a psychologist, warning that aggressive children usually struggle in school and are often not suited for the pressures of college. He also informed my parents that I would struggle with authority figures and would therefore have trouble holding down a job if my counseling was not effective. This is, of course, the reason we rely on essentialist thinking to size people up: knowing someone’s traits seems to grant us the ability to predict how they will perform in school, on the job, or even (as dating websites insist) as a romantic partner.11

But here’s the problem: when it comes to predicting the behavior of individuals—as opposed to predicting the average behavior of a group of people—traits actually do a poor job. In fact, correlations between personality traits and behaviors that should be related—such as aggression and getting into fights, or extroversion and going to parties—are rarely stronger than 0.30.12 Just how weak is that? According to the mathematics of correlation, it means that your personality traits explain only about 9 percent of the variation in your behavior. Nine percent! There are similarly weak correlations between trait-based personality scores and academic achievement, professional accomplishments, and romantic success.13
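The “nine percent” figure follows from a standard piece of statistics: the share of variance one variable accounts for in another is the square of their correlation (the coefficient of determination, r²). A minimal sketch of the arithmetic:

```python
# The fraction of variance in behavior accounted for by a trait score is
# r**2, the coefficient of determination.
def variance_explained(r: float) -> float:
    """Fraction of variance accounted for by a correlation of r."""
    return r ** 2

# A trait-behavior correlation of 0.30 accounts for only about 9% of
# the variance in behavior.
print(f"{variance_explained(0.30):.0%}")  # → 9%
```

The same arithmetic explains why a correlation has to be quite large before it is useful for predicting a single individual: even r = 0.50 leaves three-quarters of the variation unexplained.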

If our personality and behavior are not explained by a collection of enduring traits, then how do we explain our personalities? After all, our behavior is not random—and it doesn’t just depend on the situation alone. The reason trait theory, and the essentialist thinking that supports it, does such a poor job explaining human behavior is because it completely ignores the second principle of individuality: the context principle.


University of Washington professor Yuichi Shoda is one of the top researchers in child development, and one of my favorite scientists in all psychology.14 Shoda began conducting personality research as a Stanford graduate student in the 1980s, at the height of the academic conflict between the trait theorists and the situation theorists. But though his research thrust him into the middle of the personality debate, he never sided with either camp. Early on he intuited that both approaches were incomplete and, ultimately, misguided.15

Shoda approached human personality analytically and systematically, throwing out old assumptions. In doing so, he became convinced that the perennial conflict between trait and situationist theory was holding the field back because both approaches failed to account for what he saw as the true complexity of the individual. Shoda thought there was a third way to think about personality, not in terms of traits or situations, but in terms of the ways in which traits and situations interacted. This was no compromise position—if he was right, it would mean that both sides of the venerable personality debate were wrong.16

To persuade other scientists of the legitimacy of his theory, he knew he would need a very convincing study, one that gathered a large amount of behavioral data on individuals across a variety of natural settings. It did not seem feasible to study adults with that kind of comprehensiveness, since it would almost certainly require monitoring them all day long, including at their jobs. Instead, he decided to study children at a residential summer camp program in New Hampshire known as Wediko Children’s Services.17

The kids at Wediko ranged in age from six to thirteen and were mostly from low-income families in the Boston area. Shoda followed 84 children (60 boys and 24 girls) around the camp during every hour of camp activity, for six weeks, documenting their behavior in every location at Wediko except for the bathrooms. To accomplish this enormous undertaking, Shoda relied on a team of seventy-seven adult camp counselors who recorded more than 14,000 hours of observation, an average of 167 hours for each child. The camp counselors also filled out subjective ratings for every child at the end of every hour.18

At the end of the summer, Shoda painstakingly sifted through this massive bundle of data by first analyzing each individual child’s behavior, and then looking for collective patterns. The results were plain and unmistakable—and a direct blow to essentialist thinking: each child exhibited different personalities in different situations.19

Now, in some sense, this is no big surprise, and you might be quick to counter, “Of course we behave differently in different settings!” But think about trait models of personality for a moment. The Myers-Briggs, for example, definitely does not say that our traits fundamentally change depending on the setting; as a matter of fact, it says the opposite: that dispositions like whether we are introverted or extroverted influence our behavior no matter the situation. Trait-based personality tests assume that we can be either extroverts or introverts … but not both. Yet, Shoda discovered that every child really was both.20

A girl might be extroverted in the cafeteria, but introverted on the playground. A boy might be extroverted on the playground, but introverted in math class. And it was not the situation alone that was the determining factor: if you picked two girls, one might be introverted in the cafeteria and extroverted in the classroom, while the other might be extroverted in the cafeteria and introverted in the classroom. The way someone behaved always depended on both the individual and the situation. There was no such thing as a person’s “essential nature.” Sure, you could say someone was more introverted or extroverted on average—this was, in fact, exactly what trait psychology amounted to. But if you relied on averages, then you missed out on all the important details of a person’s behavior.

Shoda’s results directly contradicted the basic tenets of trait theory. Assessing personality on average may have been good enough for academics trying to draw broad conclusions about groups of people, but it is not good enough if you are looking to hire the employee best suited for the job or to deliver the most effective counseling to a student, and it is not nearly good enough for making decisions about you. Defining yourself as “generous” or “stingy” does not do justice to the fact that you donate money to a struggling non-profit organization with a sense of mission, but not to your well-endowed alma mater. However, Shoda’s results also repudiated situation theory, since his data demonstrated that any given situation affected each person differently. Not surprisingly, when personality psychologists learned about Shoda’s results, many reacted the same way psychometricians did when they first heard about Peter Molenaar’s ergodic switch: by accusing Shoda of promoting anarchy.

Shoda seemed to be suggesting that nothing about people’s personalities was consistent, that their behaviors were a constant whirlwind, shifting randomly from place to place. What were personality theorists supposed to model if traits were no longer stable? But Shoda wasn’t undermining the concept of personality—rather, by placing the person and context together, he was giving it life. Shoda demonstrated that, in fact, there is something consistent about our identity—it just wasn’t the kind of consistency anyone expected: we are consistent within a given context. According to Shoda’s results (as well as a great deal of subsequent research), if you are conscientious and neurotic while driving today, it’s a pretty safe bet you will be conscientious and neurotic while driving tomorrow. At the same time, what makes you uniquely you is that you may not be conscientious and neurotic when you are playing Beatles cover songs with your band in the context of your local pub.

Shoda’s research embodies the second principle of individuality, the context principle, which asserts that individual behavior cannot be explained or predicted apart from a particular situation, and the influence of a situation cannot be specified without reference to the individual experiencing it.21 In other words, behavior is not determined by traits or the situation, but emerges out of the unique interaction between the two. If you want to understand a person, descriptions of their average tendencies or “essential nature” are sure to lead you astray. Instead, you need a new way of thinking that focuses on a person’s context-specific behavioral signatures.


Shoda summarized his trailblazing findings in his aptly titled book Persons in Context: Building a Science of the Individual.22 In it, he provides an alternative to essentialist thinking he calls “if-then signatures.”23 If you want to understand a coworker named Jack, for example, it is not particularly useful to say “Jack is extroverted.” Instead, Shoda suggests a different characterization: IF Jack is in the office, THEN he is very extroverted. IF Jack is in a large group of strangers, THEN he is mildly extroverted. IF Jack is stressed, THEN he is very introverted.
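One way to see what an if-then signature buys you is to treat it as a lookup from situation to behavior, rather than a single trait score. The sketch below is purely illustrative—“Jack” and the context labels come from the example above, not from Shoda’s data:

```python
# An if-then signature maps contexts to expected behavior, instead of
# collapsing a person into one context-free trait score.
jack_signature = {
    "office": "very extroverted",
    "large group of strangers": "mildly extroverted",
    "stressed": "very introverted",
}

def predicted_behavior(signature: dict, context: str) -> str:
    # For a context we have never observed, admit ignorance rather than
    # fall back on an assumed "essential" trait.
    return signature.get(context, "unknown")

print(predicted_behavior(jack_signature, "stressed"))  # → very introverted
print(predicted_behavior(jack_signature, "pub gig"))   # → unknown
```

The design point is the fallback: a trait model would answer “extroverted” for every context, while the signature only makes claims about situations in which the person has actually been observed.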

An example from Shoda’s study illustrates the practical value of knowing someone’s if-then signatures. When evaluated using standard aggression questionnaires, two boys at Wediko exhibited almost identical levels of aggression, which, interpreted through the lens of essentialist thinking, would lead you to assume that their future outlooks were similar and that they required similar forms of intervention. Yet Shoda’s data revealed a hidden distinction—a distinction that makes all the difference for understanding these children. One of the boys was aggressive around his peers, but docile around adults. The other boy was only aggressive around adults, but docile around his peers. The aggressiveness of each boy was markedly different, and yet these crucial differences were erased using a trait-based score. Aggression was not the “essence” of each boy’s personality—rather, there were situations where each boy was aggressive, and situations where he was not. There was a real cost to ignoring contexts and simply tagging each boy with the same averagarian label.

Consider the following figure, which depicts the if-then aggressiveness signatures of two boys modeled on participants in Shoda’s study.




When I first read Shoda’s study, I thought back to my experience when my school designated me an “aggressive child.” I recalled that when my grandmother heard this verdict she refused to believe it, telling my parents, “He’s always so nice at my house!” This was not grandmotherly obliviousness. I really was nice when I was around her. My aggressiveness was triggered by very specific contexts, such as when I was being bullied. In the class where I got in trouble for shooting spitballs there were three bigger kids who liked to push me around. I tried to avoid them outside of class, but within class I often reacted to their presence by becoming the class clown, since I thought if I could make them laugh, they would be more likely to ignore me. It usually worked, though it also earned me a trip to the counselor’s office.

If the school administrators (who I genuinely believe cared about me) had attempted to understand the context of my behavior, perhaps they could have helped me, instead of labeling me as aggressive, instead of consigning me to the troubled realm of the “problem child.” If they had tried to gain insights into why I was misbehaving in that context perhaps they could have intervened—by talking to the teacher or moving me to a new class—instead of presuming they understood something essential about my character.

Later, when I managed to attend Weber State University, I used my knowledge of my if-then signatures to change the way I approached my classes. One invaluable thing I did from the beginning was to avoid classes where I knew other students from my high school. I knew that particular context would lead me to behave like the class clown, and I knew I would never be successful in college as the class clown.

Similarly, I knew that I responded well to certain teaching styles. I especially liked teachers who challenged students to think for themselves and argue over ideas, while I tended to get frustrated and disengaged from teachers who felt that the facts were known and it was our job to sit there and digest them. So at the beginning of each semester I signed up for six courses and attended at least one session of each. If there was a kid I already knew, or if the teacher’s style was a bad fit, then I simply dropped the class.

Knowing how I behaved in certain contexts allowed me to make better decisions as a college student and beyond.


It’s not hard to accept our if-then signatures when it comes to personality—to accept that we might be simultaneously aggressive with some people and nice and quiet with others, or that our introversion or extroversion is specific to whatever situation we find ourselves in. But what about honesty? Loyalty? Kindness? Aren’t these inherent to our character? Or is character, too, a changeable, contextual quality?

For a long time, it was believed that people’s character is burned into their nature. If we learn that our neighbor’s son was caught shoplifting candy at the local convenience store, we instinctively presume he is going to steal other things, too. We certainly would not leave him alone in our home. We might even be inclined to believe that he has some defect of moral fiber that will inevitably drive him to not only commit further acts of thievery, but will most likely lead him to behave in other unscrupulous ways, such as cheating at school and lying to adults.

This view, as it turns out, is wrong. Character is no different from any other aspect of our individuality. If you have always assumed that each person is simply honest or dishonest, with nothing in between, the idea that each of these important qualities is characterized by a highly individualized if-then signature can seem provocative. And yet, the knowledge that character is contextual is nothing new.

One of the earliest large-scale scientific investigations of character was conducted in the 1920s by a psychologist and ordained minister named Hugh Hartshorne.24 Those were the heady days when schools across America were getting standardized, and a heated debate arose about whether and how schools should teach character.25 As president of the Religious Education Association, Hartshorne personally believed that religious education was the best means for inculcating moral values in young people. But, as a scientist, he also knew that before he could advocate any particular approach, he first needed to conduct research to clarify the nature of character.

Hartshorne’s team examined 8,150 public school students and 2,715 private school students between the ages of eight and sixteen years. Each student was placed in twenty-nine different experimental contexts that included four different situations (school, home, party, and athletic contest) and three possible acts of deception (lying, cheating, or stealing). Each context was manipulated to have two conditions. In the first condition (the monitored condition), there was no way for students to behave dishonestly. For instance, while taking a test at school they were watched closely by the teacher, who then graded their answers. In the second condition (the unmonitored condition), students were led to believe that any deception they committed would not be detected. For instance, after taking a test at school, they were given the opportunity to grade their own test alone in a room, but Hartshorne inserted a hidden carbon sheet beneath the test to detect whether students changed answers for a better score. For any given context, the difference between a student’s behavior in the monitored and unmonitored conditions provided a measure of the student’s honesty in that context.26
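Hartshorne’s measure, as described above, is a per-context difference score: the gap between a student’s result when deception is impossible and when it seems undetectable. A minimal sketch of that bookkeeping—all the scores and contexts below are invented for illustration, not Hartshorne’s data:

```python
# Honesty in a given context is measured by comparing the monitored score
# (no opportunity to cheat) with the unmonitored, self-reported score.
# A large gap suggests the score was inflated when no one was watching.
def deception_gap(monitored: float, unmonitored: float) -> float:
    """Positive values mean the unmonitored score was inflated."""
    return unmonitored - monitored

# Hypothetical (monitored, unmonitored) scores for one student:
contexts = {
    "school test": (62, 74),
    "party game":  (80, 80),
    "home chores": (55, 70),
}

# The result is a per-context honesty signature, not a single trait score.
signature = {ctx: deception_gap(m, u) for ctx, (m, u) in contexts.items()}
print(signature)  # → {'school test': 12, 'party game': 0, 'home chores': 15}
```

Averaging the three gaps into one “honesty score” would hide exactly the structure Hartshorne found: the same student can be scrupulous in one setting and a cheat in another.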

When he began the study, Hartshorne viewed honesty through the prism of essentialist thinking, expecting that each individual student would be either virtuous or unvirtuous. But that was not what he found at all. Instead, students showed little consistency in their virtue. A child who cheated in scoring her own test might be honest when keeping score at a party game. A child who cheated on a test by copying another student’s answers might not cheat when he scored his own test. A child who stole money at home might not steal money at school. Honesty, it turned out, was contextual.27




To get a sense of what Hartshorne found, take a look at the if-then signatures for two eighth-grade students from his study. Each student had a similar average honesty score. The student on the right, with one exception, consistently showed the same level of honesty, regardless of opportunities to cheat. Hartshorne stressed that this student was a true rarity: out of the 10,865 students in his study, she was the most consistent performer by far, with an honesty profile that is much flatter than anyone else’s.28 The student on the left, meanwhile, possesses a decidedly different if-then signature. That student’s behavior varied wildly across contexts, all the way from the most meticulous honesty to the most egregious cheating. Yet if you take an essentialist view of character, you would conclude there is no difference between these students—they are equally honest, on average. The context principle, however, shows us that such a view is wrong because it ignores the individuality of each student.

When the public learned of Hartshorne’s results, there was widespread shock and outrage. “There has been no more disconcerting theory for parents and teachers generally than the doctrine that moral behavior is specific and conditioned to a large degree by the external situation,” Hartshorne declared in response. “If Johnny is honest at home and you should remark that he cheats in his school examinations, his mother is more apt to be incredulous. However repugnant to popular opinion, the doctrine of specificity would seem to be well established … honesty, charity, cooperation, inhibition, and persistence are particular habits rather than general traits.”29

Things have not changed very much, and today parents and teachers still want to believe that moral fiber is a personal trait and not dependent on the situation. Take self-control. Parents are bombarded with studies and books that claim that self-control is the key to our children having a successful life.30 One of the most famous studies cited to support the importance of self-control—and arguably the most famous psychology study of our generation—is the so-called “marshmallow study.”

The general framework for the marshmallow study has been replicated many times.31 In the most common version of the study, an adult presents a child, usually between three and five years of age, with a marshmallow and a choice. They can eat the marshmallow immediately, or wait fifteen minutes and receive a second marshmallow. The adult leaves the room. The length of time the child lasts without eating the marshmallow is taken as a single measure of their self-control, from low to high.

The marshmallow study was invented more than forty years ago by Walter Mischel, a psychologist then at Stanford and later at Columbia University.32 The popular influence of the study really exploded, however, when Mischel and our friend Yuichi Shoda followed up with the participants from the original studies years later and found that, on average, the participants who exhibited the highest self-control as children tended to be the ones who were better socially adjusted, and had achieved greater academic success as adolescents.33

This set off nothing less than a self-control craze that stretched across science, parenting, and education. Neuroscientists sought out the “self-control” structures in the brain that enabled kids to resist the temptation of eating marshmallows,34 child psychologists developed programs that parents could use to increase self-control in their sons and daughters,35 and educators rushed to promote new forms of character education believed to help boost self-control.36 Pundits and the media suggested that weak-willed children who could not wait patiently for additional marshmallows were at grave risk for failure in life.37 Of course, the entire marshmallow-fueled furor was based on the implicit assumption that self-control was an essentialist trait.

“It was very ironic that everyone used the study to support the trait perspective and promote character education,” Shoda told me. “Because Walter [Mischel] was fighting against that his whole career. In actuality, we were trying to show that kids can enhance their control over situational pressures through if-then strategies.”38

The context principle reminds us that self-control does not exist apart from a particular situation, and one person who recognized that context was missing from the popular accounts of the marshmallow test was a scientist named Celeste Kidd.39 Now an Assistant Professor of Brain and Cognitive Sciences at the University of Rochester, Kidd was working as a volunteer at a homeless shelter when she first heard about the marshmallow studies. “There were many kids staying at the shelter,” she told me. “If a child got a toy or candy there was always the real risk that another kid would simply grab it, so the safest and smartest thing to do was to either keep it hidden or eat it as quickly as you could. So when I came across the marshmallow studies my immediate reaction was that every kid who stayed in the shelter would eat the marshmallow right away.”40

Kidd conducted her own version of the marshmallow study, with a crucial twist: she placed one group of children in a “reliable” situation and another group in an “unreliable” situation. Before the marshmallow test began, the kids in the unreliable situation encountered an adult who did not keep his word—for example, during an art project the adult promised the child that if she waited for a little while he would bring her a new set of art supplies to replace her container of broken and well-worn crayons. After a few minutes, he returned empty-handed. The kids in the reliable group, meanwhile, encountered an adult who delivered the new supplies exactly as promised.41

The kids in the reliable situation behaved pretty much like the kids in previous marshmallow studies: a few kids gave in to temptation quickly, but about two-thirds of them managed to wait all fifteen minutes—the maximum time. Things were quite different for the kids in the unreliable situation. Half of them devoured the marshmallow within the first minute after the adult departed. Only one child lasted long enough to get a second marshmallow.42 Self-control feels like some kind of essential trait, but Kidd helped show that it, too, is contextual.


The popularity of the marshmallow test, and of its conclusion that self-control is the key to success, shows that the one domain where society remains most bound to essentialist thinking is in our attitude toward ability, talent, and potential. We imagine that these are essential qualities—that individuals either possess them or they don’t, and that while circumstances might exert some minor influence over something like talent, they don’t determine or create it.

Nowhere is that reflected more than in how we hire employees. When it comes to finding the best person for the job, all the systems of our business world are set up to ignore the context, and it starts with the most essentialist hiring tool of them all: the job description. A typical job description for a director of marketing position might include a “key qualifications” or “required skills” section that looks something like this:

✵Must have 10 or more years of progressive marketing and sales management experience.

✵Bachelor’s degree is required; master’s degree is preferred.

✵Must possess exceptional communication, strategy, and leadership skills.

✵Must be a pro at multichannel marketing and managing affiliate programs.

Hundreds of thousands of businesses every week post similarly drafted job descriptions to attract candidates for open positions. Recruiters list the experience, skills, and credentials an employer is looking for, then filter out applicants who don’t meet these criteria and select the best candidate among those who are left. At first glance, this seems like common sense: candidates either have certain skills or abilities, or they don’t; either you are a “good communicator” or you are not; either you are “a pro” at something like multichannel marketing, or you are not. Of course, the reason it’s so difficult to see what’s wrong with the approach is because we’ve been duped into essentialist thinking.

Instead of focusing on the “essence” of the employee, the context principle suggests that a better starting point is to focus on the work we need the employee to perform, and the context in which that work will occur. One person who has pioneered exactly such an approach is Lou Adler, founder of the Lou Adler Group, and one of the most influential recruiting and hiring consultants around.43

Before switching to a career in recruiting, Adler designed missiles and guidance systems for an aerospace manufacturer. As a result, he approached the practice of finding and selecting employees with the mindset of an engineer. “One day it just hit me: Once you see how performance depends on context, and how recruiting should be focused on matching individuals to optimal contexts, it just seems like common sense,” Adler explained to me. “But it turned out to be really hard to get companies to implement common sense.”44

Inspired by his context-focused vision for the workplace, Adler developed a new way to recruit and hire employees that he calls “performance-based hiring.” Instead of describing the person they want, Adler tells employers to first describe the job they want done.45 “Companies always say they want a good communicator. That’s one of the most common skills you see on a job description,” Adler explained to me. “But there’s no such thing as an all-around ‘good communicator.’ There are many different kinds of communication skills you might need in a particular job, and there’s no such thing as someone who’s good at all of them.” For a customer service rep, good communication is asking questions to understand a customer’s problem. For an accountant, it might be explaining to a senior executive how a shortfall in sales affects earnings. For an account executive, it might be leading a full-day presentation to a buying committee. Adler’s revelation was that all these contextual details for the performance of “good communication” really mattered.46

The Adler Group has helped more than ten thousand hiring managers at businesses ranging from start-ups to Fortune 500 companies switch to performance-based hiring.47 One client who raves about the impact that performance-based hiring has had on his firm is twenty-five-year-old wunderkind Callum Negus-Fancey, the founder of London-based Let’s Go Holdings.48 The company quickly made a name for itself as “brand advocacy specialists” for media and tech companies and, as a result, Let’s Go experienced very rapid growth in its first three years.49

“At first, we really didn’t know what we were doing when it came to hiring, so we just used the traditional job description approach,” Callum told me. “We needed someone to run a marketing team, and we hired someone who matched our generic job description. He had a lot of impressive experience, but his experience was in big corporations, and when he started working for us, a fast-moving start-up, he simply didn’t fit in at all. It was a disaster.”50

That’s when Callum heard about performance-based hiring and asked Adler to help him find a new human resources manager. “Adler showed us that what really mattered was selecting someone with success performing in similar contexts to the ones at Let’s Go,” Callum told me. In this case, Adler’s model ended up identifying a very counterintuitive prospect: a pharmacist from Belgium. “Thierry Thielens wasn’t British, and he had never done anything in human resources before,” Callum recalled. At first, Callum was skeptical, but Adler explained that the pharmacist’s previous performance and the conditions he had worked under (such as quickly learning to manage fast-changing staff through a series of new situations), were almost identical to what they needed him to do at Let’s Go. So Callum hired him. “Today, he’s one of the most important people in the company,” Callum told me, “and we would have never considered him if we simply looked at job descriptions.”51

The human resources industry was born out of Taylorism, with personnel departments tasked with looking for average employees to fill average jobs. Essentialist thinking was fundamental to the mindset from the beginning, and in many ways remains so today. “Companies always lament there’s a shortage of talent, that there’s a skills gap,” Adler told me. “But really there’s just a thinking gap. If you spend the effort thinking through the contextual details of the job, you’re going to be rewarded.”52 Companies that apply the context principle—companies that attempt to match the if-then signatures of candidates with the performance profiles of the positions they are trying to fill—will end up with more successful, loyal, and motivated employees. For our part, we will have the chance to enjoy a career that matches who we really are.

But a better career match is not the only thing that the context principle opens up for us. It also presents us with a better map for understanding ourselves as well as other people and their talents, abilities, and potential. And this deeper understanding of who we are and how we interact with those around us is at the heart of our personal and professional success.


The context principle challenges us to think about ourselves and others in a way that is counter to how we’ve been taught to think about personality most of our lives. It’s natural that many people might resist letting go of the idea that, deep down, we must surely possess some kind of enduring, essential traits. Most of us believe that, when it comes right down to it, we are optimists at heart—or cynics. That we are nice—or rude. That we are honest—or dishonest. The idea that who we are changes according to the circumstances we find ourselves in—even if those changes are unique to our own self—seems to violate the fundamental tenet of identity: to us, our personality feels stable and steadfast.

We feel this way because our brain is exquisitely sensitive to context and automatically adapts to the situation we find ourselves in. When we are extroverted at a friend’s party, our brain instinctively compares our behavior to experiences in similar contexts and concludes that we are acting as expected: we are an extrovert, or at least we are at parties. At work, on the other hand, we might consider ourselves introverted, since our brain remembers that we usually behave in a low-key manner around our coworkers. If our personality feels stable and steadfast, it’s because it is stable and steadfast—within a given context. Astrologers figured this out long ago, which is why horoscopes often seem persuasive—if the astrologer informs us that Leos are sometimes shy, well, we are all shy sometimes. It just depends on the context.

Other people’s personalities seem stable to us, however, for a different reason: we tend to interact with most people within a narrow range of contexts. We might know a colleague solely at work, for example—not at home with his family. Or we go out shopping and drinking with a friend on weekends, but never see her in the boardroom. We spend time with our children at home, but rarely see them at school or with their friends. Another reason people’s behavior feels trait-like is that we ourselves are part of their context. Our boss might think we are timid when, in fact, we are only timid around her; at the same time, we might think our boss is overbearing and arrogant, even though she might only behave that way around us. We simply do not see the diversity of contexts in the lives of our acquaintances or even those closest to us and, as a result, we make judgments about who they are based on limited information.

Breaking free from essentialist thinking, and becoming aware of contextual if-then signatures, can give us an incredible advantage in our personal and professional lives. On a personal level, it helps us more easily recognize the situations where we shine, which allows us to make better decisions. For example, you may excel as part of a collaborative team but struggle in a context that is more isolated and individualistic. When offered a big promotion that requires you to work independently from home 90 percent of the time, you might decide to decline because you recognize that, whatever its benefits, the job does not fit your if-then signature. Conversely, the context principle also helps us identify situational factors that might lead us to behave in negative or self-sabotaging ways, and change or avoid those factors.

In many ways it’s not hard to develop awareness of the contexts where we ourselves are successful and the contexts where we struggle. The hard part is thinking about other people’s if-then signatures. Essentialist thinking still pervades every aspect of our social lives, and it is hard to resist the pull of false certainty. That’s the challenge for all of us—and where the context principle may offer its biggest benefits. Each time we find ourselves thinking someone is neurotic, aggressive, or aloof, we should remember that we are only seeing them in one particular context.

Understanding the if-then signatures of others is especially important when we find ourselves entrusted with helping them succeed—as their manager, parent, counselor, teacher, and so on. When we are acting in that capacity, the context principle allows us to respond more productively whenever we see our child, employee, student, or client engaging in negative behaviors we want to change. Instead of asking why they are behaving that way, we can reframe the question in terms of context and ask ourselves, “Why are they behaving that way in that context?” When we see behaviors we think are bad, we can refrain from responding until we first find a context where the behavior does not appear (for example, I was aggressive in art class, but not with my grandmother). Or we can follow Celeste Kidd’s lead—she told me that any time she finds herself judging someone based on behaviors that strike her as insensitive or irrational, she stops herself, takes a step back, and tries to imagine a set of circumstances that would make the behavior rational and sensible. Most of the time, she realizes that she was projecting her own context onto the other person instead of appreciating theirs.

Even when we are not entrusted to help others succeed, remembering that we see the people we interact with—like a coworker or boss—in only a single context can help us be more compassionate and understanding. If we could see that “difficult” coworker in all her contexts, we might find her to be a devoted friend outside the office, a caring sister, a loving aunt to her nieces. It then becomes harder to judge that coworker, to reduce her to a singular unflattering personality trait and in the process strip her of what makes her human—her complexity. Remembering that there is more to a person than the context that brings the two of us together in a given moment opens the door to treating others with a deeper understanding and respect than essentialist thinking ever allows. And that understanding and respect are the foundation of the positive relationships that are most likely to lead to our success and happiness.