The End of Average: How We Succeed in a World That Values Sameness - Todd Rose (2016)

Part I. THE AGE OF AVERAGE

Individual talent is too sporadic and unpredictable to be allowed any important part in the organization of society. Social systems which endure are built on the average person who can be trained to occupy any position adequately if not brilliantly.

—STUART CHASE, THE PROPER STUDY OF MANKIND

Chapter 1. THE INVENTION OF THE AVERAGE

In 2002, UC Santa Barbara neuroscientist Michael Miller conducted a study of verbal memory. One by one, sixteen participants lay down in an fMRI brain scanner and were shown a set of words. After a rest period, a second series of words was presented and they pressed a button whenever they recognized a word from the first series. As each participant decided whether he had seen a particular word before, the machine scanned his brain and created a digital “map” of his brain’s activity. When Miller finished his experiment, he reported his findings the same way every neuroscientist does: by averaging together all the individual brain maps from his subjects to create a map of the Average Brain.1 Miller’s expectation was that this average map would reveal the neural circuits involved in verbal memory in the typical human brain.

Whenever you read about some new neuroscience discovery accompanied by a blob-splotched cross section of a brain—here are the regions that light up when you feel love; here are the regions that light up when you feel fear—it’s a near certainty that you are looking at a map of an Average Brain. As a graduate student, I was also taught the method of producing and analyzing the Average Brain (referred to as the “random effects model” in the jargon of science2) when I was trained in brain imaging at Massachusetts General Hospital. The driving assumption of this method is that the Average Brain represents the normal, typical brain, while each individual brain represents a variant of this normal brain—an assumption that mirrors the one that motivated the Norma look-alike contest. This premise leads neuroscientists to exclude left-handed people from their studies (since it is presumed the brains of left-handed people are different from normal brains) or sometimes even to throw out individuals whose brain activity deviates too far from average, since researchers worry these outliers might cloud their view of the Average Brain.
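To make the averaging step concrete, here is a minimal sketch in Python, assuming, purely for illustration, that each participant’s activation map is stored as a three-dimensional array of voxel values; the dimensions and random data below are hypothetical stand-ins, and a real random effects analysis involves far more statistical machinery than a plain mean.

    import numpy as np

    # Hypothetical setup: sixteen participants, each with an activation map of
    # made-up dimensions. Real fMRI data would be loaded from scan files
    # rather than generated at random.
    n_subjects, shape = 16, (64, 64, 40)
    individual_maps = np.random.rand(n_subjects, *shape)

    # The "Average Brain" map is simply the voxel-by-voxel mean across participants.
    average_brain = individual_maps.mean(axis=0)

    # Comparing each person to the group, as Miller did, amounts to asking how
    # far each individual map sits from that average map.
    distances = [float(np.linalg.norm(m - average_brain)) for m in individual_maps]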

There would have been nothing strange about Miller reporting the findings of his study by publishing his map of the Average Brain. What was strange was the fact that when Miller sat down to analyze his results, something made him decide to look more carefully at the individual maps of his research participants’ brains. Even though Miller was investigating a well-studied mental task using the standard method of brain research—and even though there was nothing unusual about his participants’ Average Brain—he glanced over a few of the individual maps. “It was pretty startling,” Miller told me. “Maybe if you scrunched up your eyes real tight, a couple of the individual maps looked like the average map. But most didn’t look like the average map at all.”3

Other people before Miller had noticed that individual brains often failed to resemble the Average Brain, but since everyone else shrugged off this awkward fact, they usually ignored it, too—just as scientists and physicians long ignored the fact that no real woman looked like Norma. But now Miller did something that might seem perfectly obvious, yet few had ever bothered to attempt: he systematically compared each of the sixteen individual brain maps from his verbal memory experiment to the map of the Average Brain. What he found astonished him. Not only was each person’s brain different from the average, but they were all different from one another.

[Image: MEMORY ACTIVITY IN THE BRAIN. Credit: Mike Dicks, DeScience Limited.]

Some people’s brains were mostly activated on the left, others on the right. Some brains were mostly activated in the front, others in the back. Some looked like a map of Indonesia with long, thick archipelagos of activation; others were almost entirely blank. But there was no avoiding the most salient fact: nobody’s brain looked like the Average Brain. Miller’s results paralleled those obtained by Gilbert Daniels during his investigation of hands, except this time the organ being studied wasn’t a limb—it was the very cradle of thought, feeling, and personality.

Miller was bewildered. According to the key assumption behind the method of the Average Brain, most people’s brains should be fairly close to average. Neuroscientists certainly expected that some brains should be similar to the average. But hardly any of the brains in Miller’s study even remotely resembled the Average Brain. Miller feared that perhaps there had been some kind of technical error in his equipment, so he brought many of the same participants back a couple months later and once again scanned their brains as they performed the same word memory task. The results were nearly identical: each person’s new brain map was pretty close to his original brain map—and each individual brain map remained quite different from the map of the Average Brain.

“That convinced me that the individual patterns we were seeing was not random noise but something systematic about the way each individual performed the task, that each person’s memory system consisted of a unique neural pattern,” Miller explained to me. “But what was most surprising was that these differences in patterns were not subtle, they were extensive.”4

The “extensive” differences that Miller found in people’s brains aren’t limited to verbal memory. They’ve also been found in studies of everything from face perception and mental imagery to procedural learning and emotion.5 The implications are hard to ignore: if you build a theory about thought, perception, or personality based on the Average Brain, then you have likely built a theory that applies to no one. The guiding assumption of decades of neuroscience research is unfounded. There is no such thing as an Average Brain.

When Miller published his counterintuitive findings, at first they elicited skepticism. Some scientists suggested the findings might be due to problems with his software algorithms, or simply bad luck in his choice of subjects—maybe too many of his participants were “outliers.” The most common response from Miller’s colleagues, however, was not criticism, but bored dismissal. “Other people had noticed what I noticed before in their own work; they just shrugged it off,” Miller told me. “People were saying, ‘Everybody already knows this, it’s no big deal. That’s why you use the average, it takes all these individual differences into account. You don’t need to bother pointing out all this variability, because it doesn’t matter.’”6

But Miller was convinced it did matter. He knew this was not some academic debate, but a problem with practical consequences. “I’ve been approached by people working on the neuroscience of law,” Miller says. “They’re trying to make inferences they can use in a court of law about people’s psychiatric condition and mental states. They want to use brain scans to decide if someone should go to jail or not, so it most definitely matters if there’s a systematic difference between the individual brain and the ‘average’ brain.”7

Miller is not the only scientist to confront a field-shaking dilemma involving the use of averages. Every discipline that studies human beings has long relied on the same core method of research: put a group of people into some experimental condition, determine their average response to the condition, then use this average to formulate a general conclusion about all people. Biologists embraced a theory of the average cell, oncologists advocated treatments for the average cancer, and geneticists sought to identify the average genome. Following the theories and methods of science, our schools continue to evaluate individual students by comparing them to the average student, and businesses evaluate individual job applicants and employees by comparing them to the average applicant and average employee. But if there is no such thing as an average body and there is no such thing as an average brain, this leads us to a crucial question: How did our society come to place such unquestioning faith in the idea of the average person?

The untold story of how our scientists, schools, and businesses all came to embrace the misguided notion of the “Average Man” begins in 1819, at the graduation of the most important scientist you have never heard of, a young Belgian by the name of Adolphe Quetelet.

THE MATHEMATICS OF SOCIETY

Quetelet (“Kettle-Lay”) was born in 1796. At age twenty-three he received the first doctorate in mathematics ever awarded by the University of Ghent. Smart and hungry for recognition, he wanted to make a name for himself like one of his heroes, Sir Isaac Newton. Quetelet marveled at the way Newton uncovered hidden laws governing the operation of the universe, extracting orderly principles out of the chaos of matter and time. Quetelet felt that his best chance for a similar achievement was in astronomy, the leading scientific discipline of his time.8

In the early nineteenth century, the most prominent scientific minds turned their attention to the heavens, and the greatest symbol of a nation’s scientific status was the possession of a telescopic observatory. Belgium, however, did not have one. In 1823, Quetelet somehow managed to convince the Dutch government that ruled Belgium to shell out the exorbitant sum needed to build an observatory in Brussels, and very soon Quetelet was appointed to its top position, director of the observatory.9 As the lengthy construction proceeded, Quetelet embarked on a series of visits to observatories throughout Europe to learn the latest observational methods. It seemed he had perfectly positioned himself to make an enviable run at scientific acclaim—but then, in 1830, just as he was wrapping up his tour of Europe, Quetelet received bad news: Belgium had plunged into revolution. The Brussels observatory was occupied by rebel troops.10

Quetelet had no idea how long the revolution would last, or whether the new government would support the completion of the observatory—or if it would even allow him to continue as Belgium’s “Astronomer Royal.” It would prove to be a turning point in his life—and in the way society conceived of individuals.11

Previously, Quetelet never cared much about politics or the complexities of interpersonal dynamics. He was solely focused on astronomy. He believed he could keep his distance from any social commotion, which he viewed as irrelevant to his lofty scientific endeavors. But when revolution erupted in his own backyard—in his own observatory—human social behavior suddenly became very personal. Quetelet found himself longing for a stable government that passed sensible laws and policies that would prevent the sort of social chaos that had derailed his career plans—and which seemed to keep leading to upheaval all around Europe. There was just one glaring problem: modern society seemed utterly unpredictable. Human behavior did not appear to follow any discernible rules . . . just like the universe had seemed so indecipherable before Isaac Newton.12

As he contemplated the revolution that had put an end to his professional ambitions, Quetelet was struck with inspiration: Might it be possible to develop a science for managing society? He had spent his life learning how to identify hidden patterns in the mysterious whirl of the celestial heavens. Couldn’t he use the same science to find hidden patterns in the apparent chaos of social behavior? Quetelet set himself a new goal. He would apply the methods of astronomy to the study of people. He would become the Isaac Newton of social physics.13

Fortunately for Quetelet, his decision to study social behavior came during a propitious moment in history. Europe was awash in the first tidal wave of “big data” in history, what one historian calls “an avalanche of printed numbers.”14 As nations started developing large-scale bureaucracies and militaries in the early nineteenth century, they began tabulating and publishing huge amounts of data about their citizenry, such as the number of births and deaths each month, the number of criminals incarcerated each year, and the number of incidences of disease in each city.15 This was the very inception of modern data collection, but nobody knew how to usefully interpret this hodgepodge of data. Most scientists of the time believed that human data was far too messy to analyze—until Quetelet decided to apply the mathematics of astronomy.

Quetelet knew that one common task for any eighteenth-century astronomer was to measure the speed of celestial objects. This task was accomplished by recording the length of time it took an object such as a planet, comet, or star to pass between two parallel lines etched onto the telescope glass. For example, if an astronomer wanted to calculate the speed of Saturn and make predictions about where it would appear in the future, he would start his pocket watch when he observed Saturn touch the first line, then stop the watch when it touched the second line.16

Astronomers quickly discovered that this technique suffered from one major problem: if ten astronomers each attempted to measure the speed of the same object, they often obtained ten different measurements. If multiple observations resulted in multiple outcomes, how could scientists decide which one to use? Eventually, astronomers adopted an ingenious solution that was originally known as the “method of averages”17: all the individual measurements were combined into a single “average measurement,” which, according to the advocates of the method, more accurately estimated the true value of the measurement in question than any single observation.18
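In modern notation, the method of averages is nothing more than the arithmetic mean. As a worked illustration (the symbols here are a present-day restatement, not how the astronomers themselves wrote it): if ten astronomers record timings $t_1, t_2, \ldots, t_{10}$ for the same transit of Saturn, the combined estimate is

$$\bar{t} = \frac{t_1 + t_2 + \cdots + t_{10}}{10},$$

and the same recipe applies to any number of observations: add them up and divide by the count.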

When Quetelet ventured to establish a social science, his most pivotal decision was borrowing astronomy’s method of averages and applying it to people. His decision would lead to a revolution in the way society thought of the individual.

THE AVERAGE MAN

In the early 1840s, Quetelet analyzed a data set published in an Edinburgh medical journal that listed the chest circumference, in inches, of 5,738 Scottish soldiers. This was one of the most important if uncelebrated studies of human beings in the annals of science. Quetelet added together each of the measurements, then divided the sum by the total number of soldiers. The result came out to just over thirty-nine and three-quarters inches—the average chest circumference of a Scottish soldier. This number represented one of the very first times a scientist had calculated the average of any human feature.19 But it was not Quetelet’s arithmetic that was history making, it was his answer to a rather simple-seeming question: What, precisely, did this average actually mean?

If you spend a few moments thinking about it, it’s not actually clear what the significance of “average size” is. Is it a rough guide to the size of normal human beings? An estimate of the size of a randomly selected person? Or is there some kind of deeper fundamental meaning behind the number? Quetelet’s own interpretation—the first scientific interpretation of a human average—was, not surprisingly, conceived out of concepts from astronomical observation.

Astronomers believed that every individual measurement of a celestial object (such as one scientist’s measurement of the speed of Saturn) always contained some amount of error, yet the total amount of aggregate error across a group of individual measurements (such as many different scientists’ measurements of the speed of Saturn, or many different measurements by a single scientist) could be minimized by using the average measurement.20 In fact, a celebrated proof by the famous mathematician Carl Gauss appeared to demonstrate that an average measurement was as close to a measurement’s true value (such as the true speed of Saturn) as you could ever hope to get.21 Quetelet applied the same thinking to his interpretation of human averages: he declared that the individual person was synonymous with error, while the average person represented the true human being.22
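A modern way to make Gauss’s point concrete (and it is a simplification of his actual argument, offered only to show the logic): among all candidate values $c$ for the quantity being measured, the arithmetic mean $\bar{x}$ of the observations $x_1, \ldots, x_n$ is the single value that minimizes the total squared deviation,

$$\bar{x} \;=\; \arg\min_{c} \sum_{i=1}^{n} (x_i - c)^2,$$

which is why the average came to be treated as the best available estimate of a measurement’s true value.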

After Quetelet calculated the average chest circumference of Scottish soldiers, he concluded that each individual soldier’s chest size represented an instance of naturally occurring “error,” whereas the average chest size represented the size of the “true” soldier—a perfectly formed soldier free from any physical blemishes or disruptions, as Nature intended a soldier to be.23 To justify this peculiar interpretation, Quetelet offered an explanatory metaphor known as the “Statue of the Gladiator.”

Quetelet invites us to imagine a statue of a gladiator. Suppose that sculptors make 1,000 copies of the statue. Quetelet claims that every one of these hand-carved copies will always feature some mistakes and flaws that will render it different from the original. Yet, according to Quetelet, if you took the average of all 1,000 copies, this “average statue” would be nearly identical to the original statue. In the same manner, contended Quetelet in a striking leap of logic, if you averaged together 1,000 different soldiers, you would end up with a very close approximation of the One True Soldier, existing in some Platonic realm, of which each living, breathing soldier was an imperfect representation.24

Quetelet followed the same line of reasoning with regard to humanity as a whole, claiming that every one of us is a flawed copy of some kind of cosmic template for human beings. Quetelet dubbed this template the “Average Man.”25 Today, of course, we often consider someone described as “average” to be inferior or lacking—as mediocre. But for Quetelet, the Average Man was perfection itself, an ideal that Nature aspired to, free from Error with a capital “E.” He declared that the greatest men in history were closest to the Average Man of their place and time.26

Eager to unmask the secret face of the Average Man, Quetelet began to compute the average of every human attribute he could get data on. He calculated average stature, average weight, and average complexion. He calculated the average age couples got married and the average age people died. He calculated average annual births, average number of people in poverty, average annual incidents of crime, average types of crimes, the average amount of education, and even average annual suicide rates. He invented the Quetelet Index—today known as the body mass index (BMI)—and calculated men’s and women’s average BMIs to identify average health. Each of these average values, claimed Quetelet, represented the hidden qualities of the One True Human, the Average Man.
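For reference, the Quetelet Index that survives today as BMI is simply weight divided by the square of height,

$$\mathrm{BMI} = \frac{\text{weight in kilograms}}{(\text{height in meters})^2},$$

so, for example, a person weighing 70 kilograms and standing 1.75 meters tall has a BMI of roughly 22.9.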

As much as Quetelet admired the Average Man, he held an equal amount of antipathy toward those unfortunate individuals who deviated from the average. “Everything differing from the Average Man’s proportions and condition, would constitute deformity and disease,” Quetelet asserted. “Everything found dissimilar, not only as regarded proportion or form, but as exceeding the observed limits, would constitute a Monstrosity.”27 There’s little question that Quetelet would have lauded the statue of Norma. “If an individual at any given epoch of society possessed all the qualities of the Average Man,” pronounced Quetelet, “he would represent all that is great, good, or beautiful.”28

Though today we don’t think an average person is perfection, we do presume that an average person is a prototypical representative of a group—a type. There is a powerful tendency in the human mind to simplify the way we think about people by imagining that all members of a group—such as “lawyers,” “the homeless,” or “Mexicans”—act according to a set of shared characteristics, and Quetelet’s research endowed this impulse with a scientific justification that quickly became a cornerstone of the social sciences. Ever since Quetelet introduced the idea of the Average Man, scientists have delineated the characteristics of a seemingly endless number of types, such as “Type-A personalities,” “neurotic types,” “micro-managers,” and “leader types,” arguing that you could make useful predictions about any given individual member of a group simply by knowing the traits of the average member—the group’s type.

Since Quetelet’s new science of the Average Man seemed to impose welcome order on the accelerating jumble of human statistics while simultaneously validating people’s natural urge to stereotype others, it’s little wonder his ideas spread like wildfire. Governments adopted Quetelet’s social physics as a basis for understanding their citizens and crafting social policy. His ideas helped focus political attention on the middle class, since they were perceived to be closest to a nation’s average citizen and, according to Queteletian reasoning, the truest type of Belgian, Frenchman, Englishman, Dutchman, or Prussian. In 1846, Quetelet organized the first census for the Belgian government, which became the gold standard for all modern censuses; Quetelet even consulted with James A. Garfield, then a member of the U.S. Congress, about ways to improve the American census.29

Quetelet also influenced the American military. During the American Civil War, President Abraham Lincoln decided that the Union Army needed more information about its soldiers in order to provide a more efficient distribution of resources, so he authorized the largest anthropometric study in the history of the world up to that time. Every Union soldier was measured physically, medically, and morally, and then—in explicit obedience to Quetelet’s new science—averages were calculated and reported. This mammoth study formed the basis for the American military’s long-standing philosophy of standardized design.30

You and I take averages for granted. They form part of the everyday babble and hum of our daily media. As I write this, today’s New York Times reports the average amount of student debt, the average number of viewers of prime-time television, and the average salary of physicians. But each time Quetelet unveiled a new average, the public boggled. For example, Quetelet showed that the average rate of suicide was relatively stable from year to year.31 While this would hardly be startling news for us, in the 1830s suicide seemed to be a highly irrational private decision that could not possibly conform to any deeper pattern. Instead, Quetelet showed that suicides occurred with reliable and consistent regularity—and not only that, he claimed that the stability of the occurrences indicated that everyone possesses an average propensity toward suicide. The Average Man, attested Quetelet, was suicidal to an average extent.32

Scholars and thinkers in every field hailed Quetelet as a genius for uncovering the hidden laws governing society. Florence Nightingale adopted his ideas in nursing, declaring that the Average Man was “God’s Will.” Karl Marx adopted Quetelet’s ideas to develop his economic theory of Communism, announcing that the Average Man proved the existence of historical determinism. The physicist James Maxwell was inspired by Quetelet’s mathematics to formulate the classical theory of gas mechanics. The physician John Snow used Quetelet’s ideas to fight cholera in London, marking the start of the field of public health. Wilhelm Wundt, the father of experimental psychology, read Quetelet and proclaimed, “It can be stated without exaggeration that more psychology can be learned from statistical averages than from all philosophers, except Aristotle.”33

Quetelet’s invention of the Average Man marked the beginning of the Age of Average. It represented the moment when the average became normal, the individual became error, and stereotypes were validated with the imprint of science. These assumptions would eventually lead the Air Force to design cockpits to fit the average pilot and my instructors at Mass General Hospital to teach me how to interpret maps of the Average Brain. It would prompt generations of parents to worry if their child did not develop according to the average milestones, and cause almost every one of us to feel anxiety when our health, social life, or career deviated too far from the average.

But Quetelet is only half the story of how the Age of Average came about. The other half centers on Sir Francis Galton, a giant of a figure who started out as one of Quetelet’s most devout disciples but eventually became his most distinguished detractor.34

THE EMINENT AND THE IMBECILE

In 1851, the Great Exhibition—sometimes called the first World’s Fair—was held in London. Exhibitors from every nation showcased their most interesting products, technologies, and inventions. The British people fully expected the event would demonstrate to the world their country’s superiority. But as they strolled through the exhibits, it quickly became apparent that their hopes were not being fulfilled. The most impressive exhibits were not British, but American. Entrepreneurs from across the Atlantic touted industrial marvels that surpassed anything the British had on offer, including Samuel Colt’s revolver, Isaac Singer’s sewing machine, and Cyrus McCormick’s mechanical reaper.35 Many Englishmen began to worry their country was falling behind the rest of the world—and one man who was especially concerned was Francis Galton. He was sure he knew the precise cause of the United Kingdom’s abrupt downturn: the growing status of the lower classes.36

Galton, whose family had made its fortune in banking and gun manufacturing, was a member of the wealthy merchant class. He believed in the innate superiority of his family and other members of the upper class, and to his mind, the growing democratization of society was polluting the greatness of the British Empire.37 He was confident that the way to restore Britain’s tarnished glory was to reestablish the lapsing authority of the superior social strata—and he believed Quetelet’s math explained why.

A mathematician by training, Galton viewed the elder Belgian as brilliant, calling him “the greatest authority on vital and social statistics.”38 Galton concurred with Quetelet that the average represented the scientific foundation for understanding people. In fact, Galton agreed with almost all of Quetelet’s ideas, save one: the idea that the Average Man represented Nature’s ideal. Nothing could be further from the truth, claimed Galton. For him, to be average was to be mediocre, crude, and undistinguished—like the lower classes who were now voting for representatives in the House of Commons.39 Galton would have scoffed at the idea that women should try to fashion themselves after Norma. No, if women wanted a model to emulate, Galton believed there was none better than Her Majesty the Queen.

Galton believed that it was the imperative of humankind to attempt to improve on the average as much as possible, and he cited his cousin Charles Darwin’s research to support this claim, writing, “What nature does blindly, slowly, and ruthlessly, man may do providently, quickly, and kindly.”40 Though Quetelet considered that excessive deviation from the average constituted “monstrosity,” Galton believed the Belgian’s view was only half right. Luminaries who were far above average—like Galton and Queen Victoria and Isaac Newton—were assuredly not monstrosities, but instead formed a distinct class that Galton dubbed “the Eminent.” Those who were far below average Galton termed “the Imbecile.”41

Thus, Galton rejected Quetelet’s conviction that individuals who deviated from the average represented “error.” At the same time, he agreed with Quetelet’s concept of types, since he believed that the Eminent, the Imbecile, and the Mediocre each comprised a separate type of human being. Put simply, Galton wanted to preserve Quetelet’s idea that the average member of a group represented that group’s type, but reject Quetelet’s idea that an individual’s deviation from average represented error. How did he resolve this apparent paradox? Through an act of moral and mathematical jujitsu: he redefined “error” as “rank.”42

Quetelet might say that it did not really matter whether you were 50 percent faster than the average person or 50 percent slower—in either case, you were an equal deviation from the average, embodying equal error and equal distance from perfection. Galton would have disagreed. He said that a person who was 50 percent faster than average was clearly superior to someone 50 percent slower. They were not equal: the faster person represented an individual of higher rank.

Galton carved up humankind into fourteen distinct classes, ranging from the “Imbeciles” in the lowest rank through the “Mediocre” in the middle ranks all the way up to the most “Eminent” members of the highest rank. This was a monumental shift in the meaning of average, transforming the notion of normality into mediocrity. But Galton didn’t stop there. He was so confident that the Eminent represented a separate category of human being that he claimed a person’s rank was consistent across all qualities and dimensions—mental, physical, and moral.43 According to Galton, if your intelligence was Eminent, your physical health would most likely be Eminent, too, as well as your courage and honesty. Similarly, if your math skills lingered in the lowest ranks, your verbal skills would probably also slouch far below average, not to mention your beauty and self-discipline. “As statistics have shown, the best qualities are largely correlated,” wrote Galton in 1909.44 “The youths who became judges, bishops, statesmen, and leaders of progress in England could have furnished formidable athletic teams in their times.”

If Galton’s conception of rank were true it would support his contention that the Eminent offered the best hope for restoring Britain’s lapsed glory, since it would mean the Eminent as a class were eminent in all things. To help him prove the existence of rank, Galton developed new statistical methods, including correlation, a technique that allowed him to assess the relationship of rank across different qualities.
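In its now-standard form (formalized later by Karl Pearson, building on Galton’s idea, and shown here only to make the concept concrete), the correlation coefficient between two qualities measured as $x_1, \ldots, x_n$ and $y_1, \ldots, y_n$ is

$$r = \frac{\sum_{i=1}^{n}(x_i - \bar{x})(y_i - \bar{y})}{\sqrt{\sum_{i=1}^{n}(x_i - \bar{x})^2}\,\sqrt{\sum_{i=1}^{n}(y_i - \bar{y})^2}},$$

a number between $-1$ and $+1$. Galton’s own early calculations differed in detail, but the question they answered is the same one posed above: does high rank on one measure go with high rank on another?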

All of his statistical inventions were predicated on what Galton called the “law of deviation from the average”: the idea that what mattered most about an individual was how much better or worse they were than the average. To our twenty-first-century minds, it has come to seem so natural and obvious that talented people are “above average” while incompetent folks are “below average” that it seems simplistic to attribute the origins of this idea to one person. And yet, it was Galton who almost single-handedly supplanted Quetelet’s conviction that human worth could be measured by how close a person was to the average with the notion that worth was better measured by how far a person was from the average. Just as Quetelet’s ideas about types took the intellectual world by storm in the 1840s, so did Galton’s idea about rank in the 1890s, and by the early 1900s, the notion that people could be sorted into distinct bins of ability from low to high had infiltrated virtually all the social and behavioral sciences.

The Age of Average—a cultural era stretching from Quetelet’s invention of social physics in the 1840s until today—can be characterized by two assumptions unconsciously shared by almost every member of society: Quetelet’s idea of the average man and Galton’s idea of rank. We have all come to believe, like Quetelet, that the average is a reliable index of normality, particularly when it comes to physical health, mental health, personality, and economic status. We have also come to believe that an individual’s rank on narrow metrics of achievement can be used to judge their talent. These two ideas serve as the organizing principles behind our current system of education, the vast majority of hiring practices, and most employee performance evaluation systems worldwide.

Though Quetelet’s influence on the way we think about individuals remains deeply embedded in our institutions, for most of us, it is Galton’s legacy that clenches itself around our personal life in a more vivid and intimate manner. We all feel the pressure to strive to rise as far above average as possible. Much of the time, we don’t even think about what, exactly, we’re trying so hard to be above average at, because the why is so clear: we can only achieve success in the Age of Average if others do not view us as mediocre or—disaster!—as below average.

THE RISE OF THE AVERAGARIANS

By the dawn of the twentieth century, a majority of social scientists and policymakers were making decisions about people based on the average.45 This development did not merely consist of the adoption of new statistical techniques. It marked a seismic change in how we conceived of the relationship between the individual and society. Typing and ranking both rely on a comparison of the individual to a group average. Thus, both Quetelet and Galton claimed, explicitly and ardently, that any particular person could only be understood by comparison to the group, and therefore, from the perspective of the new social sciences, the individual was almost entirely irrelevant.

“In speaking of the individual it must be understood that we are not attempting to speak of this or that man in particular; we must turn to the general impression that remains after having considered a great number of people,” Quetelet wrote in 1835. “Removing his individuality we will eliminate all that is accidental.”46 Similarly, the first issue of Biometrika, an academic journal that Galton founded in 1901, proclaimed, “It is almost impossible to study any type of life without being impressed by the small importance of the individual.”47 It might seem that there is some fundamental difference between saying a person scored in the 90th percentile and saying that a person is an introverted type, but both ultimately require a comparison to an average score. These two approaches merely reflect an alternate interpretation of the same underlying mathematics—but share the same core conviction: individuality doesn’t matter.

When the average was first introduced into society, many educated Victorians recognized right away that something vital was threatened by this strange new approach to understanding people, and some warned, rather prophetically, of the perils of ignoring individuality. In an 1864 essay, a well-known British poet named William Cyples acknowledged the ostensible triumphs of a new generation of average-wielding scientists and bureaucrats, before endowing them with a moniker as distinctive as it was disparaging: averagarians. This term is so useful and apt that I employ it to describe anyone—scientists, educators, managers—who uses averages to understand individuals.

In his essay, Cyples worried about what the future would look like if the averagarians took over: “These averagarians usually give the statistics of murders, suicides, and (unhappy connection!) marriages, as proof of the periodic uniformity of events. . . . We should seem rather to be human units than men. . . . We endure or achieve in the degree of a percentage; fate is not so much a personal ordainment as an allotment made on us in statistical groups. . . . A protest may be safely entered against this modern superstition of arithmetic which, if acquiesced in, would seem to threaten mankind with a later and worse blight than any it has yet suffered—that not so much of a fixed destiny, as of a fate expressive in decimal fractions, not falling upon us personally, but in averages.”48

It wasn’t just poets who were concerned about the growing influence of the averagarians. Physicians, too, were staunchly opposed to the use of the average to evaluate individuals under their care. “You can tell your patient that, of every hundred such cases, eighty are cured . . . but that will scarcely move him. What he wants to know is whether he is numbered among those who are cured,” wrote Claude Bernard, the French doctor regarded as the father of experimental medicine, in 1865.49 “Physicians have nothing to do with what is called the law of large numbers, a law which, according to a great mathematician’s expression, is always true in general and false in particular.”50

Yet society failed to listen to these early protests, and today we reflexively judge every individual we meet in comparison to the average—including ourselves. When the media reports the number of close friends the average citizen possesses (8.6 in the United States), or the number of romantic partners the average person kisses in a lifetime (15 for women, 16 for men), or the number of fights over money the average couple instigates each month (3 in the United States)—it is the rare person who doesn’t automatically weigh her own life against these figures. If we have claimed more than our fair share of kisses, we may even feel a surge of pride; if we have fallen short, we may feel self-pity or shame.51

Typing and ranking have come to seem so elementary, natural, and right that we are no longer conscious of the fact that every such judgment always erases the individuality of the person being judged. A century and a half after Quetelet—exactly as the poets and physicians of the nineteenth century feared—we have all become averagarians.