The End of Average: How We Succeed in a World That Values Sameness - Todd Rose (2016)

Part I. THE AGE OF AVERAGE

Chapter 2. HOW OUR WORLD BECAME STANDARDIZED

After I dropped out of high school, I worked briefly for a large aluminum stamping plant in Clearfield, Utah. It served as my introduction to the world of work. On my first day, I was handed a little card that explained in precise detail exactly how I was supposed to do my job, even dictating the preferred motions of my arms and feet. I picked up a raw block of aluminum from a pile and carried it over to the blistering hot stamping machine. I ran the block through the machine, which squeezed out an L- or S-shaped beam like a Play-Doh Fun Factory. I stacked the beam on a pallet and punched a button recording the fact that I had completed one unit (a portion of my pay was based on the number of units I stamped), then I raced back to the pile and started all over again.

My two most enduring memories of the job are the endless repetition of punching and running, punching and running, punching and running … and the piercing metallic clanging of the factory bell announcing the start and end of my shift. The experience was dehumanizing. As an aluminum plant employee, my individuality did not matter at all. Instead, just as the British poet Cyples had warned, I was a “human unit”—a mere statistic, an Average Worker. This was no coincidence: the entire workplace was designed according to the tenets of averagarianism, the supposition that individuals can be evaluated, sorted, and managed by comparing them to the average.

Since averagarianism originally grew out of the work of two European scientists who were trying to use mathematics to solve complex social problems, it could have remained an esoteric philosophy of interest only to scholars or intellectuals. But you and I were born into a world where the notion of the average colors every aspect of our lives from birth to death, infiltrating our most confidential judgments of self-worth. How, precisely, did averagarianism go from an abstract ivory tower conjecture to the pre-eminent organizational doctrine of businesses and schools across the world? The answer to this question largely centers on a single man named Frederick Winslow Taylor.

One economist has written that Taylor “probably had a greater effect on the private and public lives of the men and women of the twentieth century than any other single individual.”1 Born to a wealthy Pennsylvania family in 1856, the teenage Taylor spent two years studying in Prussia, one of the first countries to re-organize its schools and military around Quetelet’s ideas, and the place where Taylor was most likely first exposed to the ideas of averagarianism that would eventually form the philosophical foundation of his work.2

After Taylor returned home to attend Phillips Exeter Academy, a college prep school, his family expected him to follow in his father’s footsteps and study law at Harvard. Instead, he became an apprentice machinist at a Philadelphia pump-manufacturing company. When I first read about Taylor’s youthful career decision, I thought I recognized a kindred spirit, imagining Taylor as a troubled adolescent who couldn’t find his way through school or life. I was mistaken. Taylor’s decision to work for a pump manufacturer is better compared to an ambitious Mark Zuckerberg dropping out of Harvard to found Facebook.

In the 1880s, America was transitioning from an agrarian economy to an industrial one. Newly laid railways linked cities in webs of iron, immigrants flooded into the country so fast that you could walk through entire neighborhoods without hearing a word of English, and cities expanded so rapidly that between 1870 and 1900 the population of Chicago increased sixfold. These social disruptions were accompanied by significant economic changes, and the biggest ones were taking place inside the giant new edifices of manufacturing: factories. Taylor skipped Harvard to work at Enterprise Hydraulic Works at the dawn of electrified factories, a time when making things, assembling things, and building things roused the same sense of world-conquering opportunity found today in Silicon Valley.3

Taylor hoped to make a name for himself in this thrilling new world of industry, an ambition much aided by the fact that the pump company was owned by family friends. Taylor had to do a bare minimum of heavy labor, so he was largely free to observe and contemplate the details of the factory’s operations. When he finished his apprenticeship he became a machine-shop laborer at Midvale Steelworks, another friend-owned business, where he was put on the fast track to promotions. After six promotions in six years, he was appointed chief engineer for the entire company.4

During these six years, Taylor contemplated the problems of the new era of factory production. There were plenty. The early decades of the Second Industrial Revolution were characterized by runaway inflation, plunging wages, and frequent financial panics. When Taylor first started at Midvale, the country was in the midst of the worst depression of his generation. Workers rarely stayed in one place, and factories had turnover rates ranging from 100 percent to 1,500 percent in a single year.5 Nobody really understood what was causing all the new economic problems of the factory age, but by the time he became chief engineer, Taylor was convinced that he knew their true source: inefficiency.6

The new electrified factories, Taylor asserted, wasted outrageous amounts of labor. All of this waste was the result of the way factories organized their workers, which, to Taylor’s mind, was clumsy, inept, and—most importantly—unscientific. Seven decades earlier, at the dawn of the Industrial Revolution, the first large-scale industries of textile manufacturing, iron making, and steam power created massive social upheaval, prompting Adolphe Quetelet to try to solve these new social problems through a science of society. Quetelet became the Isaac Newton of social physics. Now, in the 1890s, Taylor looked at a new era of economic upheaval and declared that the problems of the factory age could only be solved through a science of work. In other words, Taylor set out to become the Adolphe Quetelet of industrial organization.

He believed that he could systematically eliminate inefficiency from business by adopting the core precept of averagarianism, the idea that individuality did not matter. “In the past the man was first,” announced Taylor, “in the future the system must be first.”7

THE SYSTEM MUST BE FIRST

Before Taylor set out to develop a new science of work, companies usually hired the most talented workers available, regardless of their particular skill set, and then let these star employees reorganize a company’s processes according to what they believed would help them be most productive. Taylor insisted this was completely backward. A business should not conform its system to fit individual employees, no matter how special they were perceived to be. Instead, businesses should hire Average Men who fit the system. “An organization composed of individuals of mediocre ability, working in accordance with policies, plans, and procedures discovered by analysis of the fundamental facts of their situation, will in the long run prove more successful and stable than an organization of geniuses each led by inspiration,” affirmed Taylor.8

Starting in the 1890s, Taylor began sharing a new vision for industrial organization that he suggested would minimize inefficiency in the same way that the method of averages was presumed to minimize error. His vision was grounded on one key averagarian concept: standardization.9 Though Quetelet was the first scientist to champion standardization in government bureaucracies and scientific data collection, Taylor said that his own inspiration for standardizing human labor came from one of his math teachers at Phillips Exeter Academy.10

The teacher often assigned Taylor and his classmates a series of math problems, instructing each boy to snap his fingers and raise his hand when he completed the problems. The teacher used a stopwatch to time his students, then calculated how long it took the average boy to finish. Then, when the teacher created homework assignments, he used this average time to calculate exactly how many problems he needed to include in the assignment so that it would take the average boy exactly two hours to complete.

Taylor realized his teacher’s method of standardizing homework could also be used to standardize any industrial process.11 His earliest attempts at standardization took place at Midvale Steelworks. First, Taylor looked for ways to improve the speed of any given task in the factory, such as shoveling coal into the furnace. Once a task was optimized to Taylor’s satisfaction, he measured the average time it took workers to complete the task. He also determined the average physical motions that workers used to perform that task. For example, he determined that the optimal amount of coal to shovel in a single swing was 21 pounds. Taylor then standardized the entire industrial process around these averages so that the way to perform each task became fixed and inviolable (in the case of coal shoveling, he insisted that special shovels optimized to carry 21 pounds always be used), and workers were not permitted to deviate from these standards—just as I was required to stamp aluminum in a precisely prescribed way.

According to Taylor, there was always “one best way” to accomplish any given process—and only one way, the standardized way.12 For Taylor, there was nothing worse than a worker trying to do things his own way. “There is a rock upon which many an ingenious man has stranded, that of indulging his inventive faculty,” warned Taylor in a 1918 magazine article. “It is thoroughly illegitimate for the average man to start out to make a radically new machine, or method, or process to replace one which is already successful.”13 American factories embraced Taylor’s principles of standardization and were soon posting work rules, printing books of standard operating procedures, and issuing job instruction cards, all laying out the requisite way to get things done. The worker, once celebrated as a creative craftsman, was relegated to the role of automaton.14

Today, standardization is implemented in modern enterprises in a form virtually unchanged from Taylor’s earliest proposals, a form I experienced firsthand at the aluminum stamping plant. Since it was my first real full-time job, I thought its dehumanizing grind was unique to one particular company in Utah. I was quickly disabused of that notion. Two years later, I was hired as a customer service rep for a major credit card company, sitting in a comfy swivel chair in an air-conditioned office. It seemed like it would be much different from my factory job. It wasn’t. My role, once again, was completely shaped by Taylor’s principles of standardization.

I was given a detailed script to use on calls and instructed not to veer from this script in any way. Since following the script correctly meant that a customer service call would last an average length of time, I was evaluated on the duration of each and every call. If a call exceeded the average time, my screen began flashing red. Instead of focusing on the quality of the call, I focused on making sure I hit the disconnect button as fast as possible. The computer updated my average time after each call and showed how I compared to the group average—and shared my average with my supervisor, too. If my average exceeded the group average by too much, my supervisor paid me a visit, which he did, several times. If my average had remained high, he could have fired me—though I quit before that could actually happen.

Over the next few years I worked in retail, restaurants, sales, and factories, and in each and every organization my job was standardized according to Taylor’s belief that “the system must be first.” Each time, I was a cog in a machine, with no opportunity to express individual initiative or take individual accountability. Each time, I was expected to conform to the average as closely as possible—or to be like everyone else, only better. What was worse, when I complained about how these jobs failed to take my own personality into account, leaving me feeling helpless and bored, I was often accused of being lazy or irresponsible. In a standardized system, individuality does not matter, and that was exactly what Taylor intended.

THE BIRTH OF THE MANAGER

Standardization left one crucial question unanswered: Who should create the standards that governed a business? Certainly not the worker, insisted Taylor. He argued that businesses should take away all planning, control, and decision making from the workers and hand it over to a new class of “planners” who would be responsible for overseeing the workers and determining the one best way to standardize an organization’s processes. Taylor adopted a recently invented term to describe this new role: “the manager.”15

While the notion of managers may seem like a fairly obvious idea to our modern minds, it ran counter to the conventional wisdom of nineteenth-century business. Before Taylor, companies viewed “nonproductive” employees who sat at a desk without doing physical labor as an unnecessary expense. It didn’t seem to make any sense to hire someone to plan a job who couldn’t actually do the job. But Taylor insisted this view was all wrong. Factories needed brains to direct the hands.16 They needed planners to figure out the one best way to set up the stamping machines, the one best way to stamp the aluminum, and the one best way to hire, schedule, pay, and fire workers. It was Taylor’s singular vision that shaped our modern sense of the manager as an executive decision-maker.

Taylor also established the fundamental division of business roles that quickly came to define our modern workplace: the managers in charge of running the show, and the employees who actually did the work. In Taylor’s time, these employees were primarily factory workers, but today they include roles as varied as administrative assistants, phlebotomists, air traffic controllers, electrical engineers, and pharmaceutical researchers. At a 1906 lecture, Taylor explained how he saw the relationship between workers and managers: “In our scheme, we do not ask for the initiative of our men. We do not want any initiative. All we want of them is to obey the orders we give them, do what we say, and do it quick.”17 In 1918, Taylor doubled down on these ideas, dishing out similar advice to aspiring mechanical engineers: “Every day, year in and year out, each man should ask himself, over and over again, two questions: First, ‘What is the name of the man I am now working for?’ and … ‘What does this man want me to do?’ The most important idea should be that of serving the man who is over you his way, not yours.”18

Taylor laid out his ideas of standardization and management in his 1911 book The Principles of Scientific Management.19 The book became a national and international business bestseller and was translated into a dozen languages.20 Almost immediately after the book’s publication, scientific management—often simply called “Taylorism”—swept across the world’s industries.

Business owners restructured their enterprises by creating departments and subdepartments, each headed by a Taylorist manager, making the organizational chart (“org chart”) a new focal point. Personnel and human resources departments were established and tasked with finding and hiring employees and assigning them to jobs. Taylorism inaugurated planning departments, efficiency experts, industrial-organizational psychology, and time-study engineering. (A single Westinghouse plant in 1929 had a time-study staff of a hundred and twenty who set the standards for more than a hundred thousand industrial processes each month.)21

Since thinking and planning were now cleanly separated from making and doing, businesses developed an insatiable appetite for experts to tell them the best way to do all that thinking and planning. The management consulting industry was born to satisfy this appetite, and Frederick Taylor became the world’s first management consultant. His opinion was so highly sought after that he sometimes charged the modern equivalent of $2.5 million for his advice.

All these management consultants, planning departments, and efficiency experts relied on the mathematics of the average to conduct their analyses. Managers believed that the science of Quetelet and Galton justified treating each worker like a cell on a spreadsheet, as a number in a column, an interchangeable Average Man. It was not very difficult to convince managers that individuality did not matter, since it made their job easier and more secure. After all, if you make decisions about people using types and ranks, you will not be right every time—but you will tend to be right on average, and that was good enough for large organizations with many standardized processes and roles. On those occasions where managers did make a wrong decision about an employee, they could simply blame the employee for not fitting into the system.

United States Rubber Company, International Harvester Company, and General Motors were all early adopters of the principles of scientific management. Taylorism was also applied to bricklaying, canning, food processing, dyeing, bookbinding, publishing, lithography, and wire weaving, and then to dentistry, banking, and the manufacturing of hotel furniture. In France, Renault applied Taylorism to automaking and Michelin applied it to the manufacture of tires. President Franklin Roosevelt’s system of national planning was explicitly modeled on Taylorism. By 1927, scientific management had already become so widely adopted that a League of Nations report called it “a characteristic feature of American civilization.”22

Even though Taylorism was often equated with American capitalism, its appeal crossed borders and ideologies. In Soviet Russia, Lenin heralded scientific management as the key to jump-starting Russian factories and organizing five-year industrial plans, and by the start of World War II, Frederick Taylor was as famous in the Soviet Union as Franklin Roosevelt. Mussolini and Hitler added their names to Lenin and Stalin as ardent supporters of Taylorism, adopting it for their war industries.23

Meanwhile, collectivist cultures in Asia applied scientific management even more ruthlessly than their Western counterparts, with companies like Mitsubishi and Toshiba completely remaking themselves according to the principles of standardization and worker-manager separation. When Taylor’s son visited Japan in 1961, Toshiba executives begged him for a pencil, a picture, anything that had been touched by his father.24

Today, scientific management remains the most dominant philosophy of business organization in every industrialized country.25 No company likes to admit it, of course, since in many circles Taylorism has acquired the same disreputable connotation as racism or sexism. But many of the largest and most successful corporations on Earth are still organized around the idea that the individuality of the employee does not matter.

All of this leads to a profound question that transcends Taylorism: If you have a society predicated upon the separation of system-conforming workers from system-defining managers, how does society decide who gets to be a worker and who gets to be a manager?

FACTORIES OF EDUCATION

As Taylorism began to transform American industry at the dawn of the twentieth century, factories began to develop an insatiable need for semi-skilled workers who possessed a high school education. But there was a problem. Not only did the country lack universal high school education, there were hardly any high schools at all. In the year 1900, roughly 6 percent of the American population graduated from high school. Just 2 percent graduated from college.26 At the same time, there was a massive influx of children of immigrants and factory workers, particularly in the cities, threatening to increase the number of uneducated youth still further. It soon became apparent to everyone that the American educational system needed a major overhaul.

The question that occupied the earliest education reformers was what the mission of the new school system should be. A group of educators with a humanist perspective argued that the proper goal of education was to provide students with the freedom to discover their own talents and interests by offering an environment that would allow them to learn and develop at their own pace. Some humanists even suggested that there should be no required courses, and that schools should offer more courses than any student could possibly take.27 But when it came time to establish a nationwide, compulsory high school system, the humanist model was passed over in favor of a very different vision of education—a Taylorist vision.

It was never a fair fight. On one side stood the humanists, a coterie of tweed-coated academics at cushy, exclusive northeastern colleges. They were opposed by a broad coalition of pragmatic industrialists and ambitious psychologists steeped in the values of standardization and hierarchical management. These educational Taylorists pointed out that while it was nice to think about humanistic ideals like educational self-determination, at a time when many public schools had a hundred kids in a single classroom, half unable to speak English, many living in poverty, educators did not have the luxury of giving young people the freedom to be whatever they wanted to be.28

The educational Taylorists declared that the new mission of education should be to prepare mass numbers of students to work in the newly Taylorized economy. Following Taylor’s maxim that a system of average workers was more efficient than a system of geniuses, educational Taylorists argued that schools should provide a standard education for an average student instead of trying to foster greatness. By way of example, John D. Rockefeller funded an organization known as the General Education Board, which published a 1912 essay describing its Taylorist vision of schools: “We shall not try to make these people or any of their children into philosophers or men of learning or of science. We are not to raise up from among them authors, orators, poets, or men of letters. We shall not search for embryo great artists, painters, musicians … nor lawyers, doctors, preachers, politicians, statesmen, of whom we have ample supply… . The task that we set before ourselves is very simple as well as very beautiful … we will organize our children into a little community and teach them to do in a perfect way the things their fathers and mothers are doing in an imperfect way.”29

To organize and teach children to become workers who could perform industrial tasks in “a perfect way,” the Taylorists set out to remake the architecture of the entire educational system to conform to the central tenet of scientific management: standardize everything around the average. Schools around the country adopted the “Gary Plan,” named after the industrialized Indiana city where it originated: students were divided into groups by age (not by performance, interest, or aptitude) and these groups of students rotated through different classes, each lasting a standardized period of time. School bells were introduced to emulate factory bells, in order to mentally prepare children for their future careers.30

The Taylorist educational reformers also introduced a new professional role into education: the curriculum planner. Modeled after scientific management, these planners created a fixed, inviolable curriculum that dictated everything that happened in school, including what and how students were taught, what textbooks should contain, and how students were graded. As standardization spread through schools nationwide, school boards rapidly adopted a top-down hierarchical management that replicated the management structure of Taylorism, assigning executive planning roles to principals, superintendents, and district superintendents.

By 1920, most American schools were organized according to the Taylorist vision of education, treating each student as an average student and aiming to provide each one with the same standardized education, regardless of their background, abilities, or interests. In 1924, the American journalist H. L. Mencken summarized the state of the educational system: “The aim of public education is not to spread enlightenment at all; it is simply to reduce as many individuals as possible to the same safe level, to breed and train a standardized citizenry, to put down dissent and originality. That is its aim in the United States … and that is its aim everywhere else.”31

American schools, in other words, were staunchly Queteletian, their curriculum and classrooms designed to serve the Average Student and create Average Workers. Even so, one man felt that the educational Taylorists had not taken averagarianism far enough. In an eerie parallel, just as Galton had once embraced Quetelet’s ideas about the Average Man before retrofitting the elder Belgian’s ideas so that Galton could use them to separate society’s superior citizens from its inferior ones, Edward Thorndike embraced Taylor’s ideas about standardization before refitting the elder American’s ideas so that Thorndike could use them to separate school’s superior students from its inferior ones.

THE GIFTED AND THE USELESS

Thorndike was one of the most prolific and influential psychologists of all time.32 He published more than four hundred articles, and sold millions of textbooks.33 His mentor at Harvard, William James, described Thorndike as a “freak of nature” for his workaholic productivity. He helped invent the fields of educational psychology and educational psychometrics as he pursued his most influential achievement of all: establishing the mission of schools, colleges, and universities in the Age of Average.

Thorndike fully supported the Taylorization of schools. In fact, Thorndike played a leading role in the country’s largest training program for school superintendents, preparing them for their roles as scientific managers in the standardized educational system.34 But Thorndike believed that Taylorists were making a mistake when they argued that the goal of education was to provide every student with the same average education to prepare them for the same average jobs. Thorndike believed that schools should instead sort young people according to their ability so they could efficiently be appointed to their proper station in life, whether manager or worker, eminent leader or disposable outcast—and so that educational resources could be allocated accordingly. Thorndike’s guiding axiom was “Quality is more important than equality,” by which he meant that it was more important to identify superior students and shower them with support than it was to provide every student with the same educational opportunities.

Thorndike was an enthusiastic advocate of the ideas of Francis Galton, whom he revered as “an eminently fair scientific man.”35 He agreed with Galton’s notion of rank, the theory that if a person was talented at one thing, he was likely to be talented at most other things, too. He justified this conviction using his own biological theory of learning: Thorndike believed that some people were simply born with brains that learned quickly, and these fast-learning individuals would not only be successful at school, they would be successful in life. On the other hand, some people were born with slow brains; these poor souls were destined to fare poorly at school and would struggle all life long.

Thorndike believed that schools should clear a path for talented students to proceed to college, and then onward into jobs where their superior abilities could be put to use leading the country. The bulk of students, whose talents Thorndike assumed would hover around the average, could go straight from high school graduation—or even earlier—into their jobs as Taylorist workers in the industrial economy. As to the slow-learning students, well … Thorndike thought we should probably stop spending resources on them as soon as possible.36

So how, precisely, should schools go about ranking students? Thorndike answered this question in his book ironically titled Individuality, where he redefined individuality according to the Galtonian definition: that a person’s uniqueness and value stemmed from his deviation from the average.37 Thorndike agreed that every aspect of the educational system should be standardized around the average, not only because this would ensure standardized outcomes, as the Taylorists believed, but because it made it easier to measure each student’s deviation from the average—and thus made it easier to determine who was superior and who was inferior.

To help establish his desired system of student ranking, Thorndike created standardized tests for handwriting, spelling, arithmetic, English comprehension, drawing, and reading, all of which were quickly adopted by schools across the country.38 He wrote textbooks for arithmetic, vocabulary, and spelling that were all standardized around the average student of a particular age, a practice still used in our school systems today. He designed entrance exams for private schools and elite colleges; he even fashioned an entrance exam for law school.39 Thorndike’s ideas gave birth to the notion of gifted students, honors students, special needs students, and educational tracks. He supported the use of grades as a convenient metric for ranking students’ overall talent and believed that colleges should admit those students with the best GPAs and highest standardized test scores since (according to Galton’s idea of rank) he believed they were not only the most likely to succeed in college, but most likely to succeed in whatever profession they chose.

For Thorndike, the purpose of schools was not to educate all students to the same level, but to sort them according to their innate level of talent. It is deeply ironic that one of the most influential people in the history of education believed that education could do little to change a student’s abilities and was therefore limited to identifying those students born with a superior brain—and those born with an inferior one.

Like so many other students, I felt the full weight of Thorndikian rankings on my aspirations for the future. In high school I took a standardized college aptitude test that is widely used as an admissions criterion by most American universities. Thorndike would have loved the test because not only does it report your ranking, the test uses this ranking to predict how you will perform at different colleges, should you choose to attend. I’ve tried to forget everything about my test results, but memory traces still endure like the painful residue of a traumatic experience. My score placed me in the area that Galton would have termed “Mediocrity,” and the test informed me that, based on this score, the probability of me getting a B or higher at Weber State University, an open-enrollment school in Ogden, Utah, was a disheartening 40 percent. But that was still better than the odds of me getting a B or higher at my top choice, Brigham Young University: a mere 20 percent.

I remember reading these predictions and feeling pretty hopeless about my life. After all, these percentages, arranged in tidy columns, were endowed with the sober authority of mathematics: I felt like this single test had weighed my entire worth as a person and found me wanting. I initially thought I might one day be an engineer or a neurologist, but no—what a silly fantasy that was. Instead, the test solemnly announced that I better get used to being average.

Today, Thorndike’s rank-obsessed educational labyrinth traps everyone within its walls—and not just the students. Teachers are evaluated at the end of each school year by administrators, and the resulting rankings are used to determine promotions, penalties, and tenure. Schools and universities are themselves ranked by various publications, such as U.S. News and World Report, which give great weight to the average test scores and GPAs of their students, and these rankings determine where potential students will apply and what they’re willing to pay. Businesses base their hiring decisions on applicants’ grades and the ranking of their alma mater; these businesses are themselves sometimes ranked based on how many of their employees have advanced degrees and attended famous colleges. The educational systems of entire countries are ranked based on their national performance on international standardized tests such as the PISA (Programme for International Student Assessment) exam.40

Our twenty-first-century educational system operates exactly as Thorndike intended: from our earliest grades, we are sorted according to how we perform on a standardized educational curriculum designed for the average student, with rewards and opportunities doled out to those who exceed the average, and constraints and condescension heaped upon those who lag behind. Contemporary pundits, politicians, and activists continually suggest that our educational system is broken, when in reality the opposite is true. Over the past century, we have perfected our educational system so that it runs like a well-oiled Taylorist machine, squeezing out every possible drop of efficiency in the service of the goal its architecture was originally designed to fulfill: efficiently ranking students in order to assign them to their proper place in society.

A WORLD OF TYPE AND RANK

In the span of roughly fifty years—from the 1890s to the 1940s—virtually all our social institutions came to assess each of us in terms of our relationship to the average. During this transformative stretch, businesses, schools, and the government all gradually adopted the guiding conviction that the system was more important than the individual, offering opportunities to each of us according to our type or rank. Today, the Age of Average continues unabated. In the second decade of the twenty-first century, we are each evaluated according to how closely we approximate the average—or how far we are able to exceed it.

I’m not going to pretend that the Taylorization of our workplace and the implementation of standardization and rankings in our schools was some kind of disaster. It wasn’t. When society embraced averagarianism, businesses prospered and consumers got more affordable products. Taylorism increased wages across society as a whole and probably lifted more people out of poverty than any other single economic development in the past century. Requiring college applicants and job seekers to take standardized tests reduced nepotism and cronyism, and students from less privileged backgrounds attained unprecedented access to opportunities for a better life. Though it is easy to disparage Thorndike’s elitist belief that society should divert resources toward superior students and away from inferior ones, he also believed that wealth and inherited privilege should play no part in determining a student’s opportunities (on the other hand, he attributed different levels of mental talent to different ethnicities). Thorndike helped establish a classroom environment that made Americans out of millions of immigrants and raised the number of Americans with a high school diploma from 6 percent to 81 percent.41 Overall, the universal implementation of averagarian systems across American society undoubtedly contributed to a relatively stable and prosperous democracy.

Yet, averagarianism did cost us something. Just like the Norma Look-Alike competition, society compels each of us to conform to certain narrow expectations in order to succeed in school, our career, and in life. We all strive to be like everyone else—or, even more accurately, we all strive to be like everyone else, only better. Gifted students are designated as gifted because they took the same standardized tests as everyone else, but performed better. Top job candidates are desirable because they have the same kinds of credentials as everyone else, only better. We have lost the dignity of our individuality. Our uniqueness has become a burden, an obstacle, or a regrettable distraction on the road to success.

We live in a world where businesses, schools, and politicians all insist that individuals really do matter, when everything is quite clearly set up so the system always matters more than you. Employees work for companies where they feel they are being treated like cogs in the machine. Students get test results or grades that make them feel as if they will never attain their dreams. In our jobs and in school we are told there is one right way to get things done, and if we pursue an alternate course, we are often told that we are misguided, naive, or just plain wrong. Too often, conformity to the system is prioritized over excellence.

Yet we want to be recognized for our individuality. We want to live in a society where we can truly be ourselves—where we can learn, develop, and pursue opportunities on our own terms according to our own nature, instead of needing to conform ourselves to an artificial norm.42 This desire prompts the billion-dollar question that drives this book: How can a society predicated on the conviction that individuals can only be evaluated in reference to the average ever create the conditions for understanding and harnessing individuality?