The End of Average: How We Succeed in a World That Values Sameness - Todd Rose (2016)

Part III. THE AGE OF INDIVIDUALS

Chapter 8. REPLACING THE AVERAGE IN HIGHER EDUCATION

When I started college in Ogden, Utah, I was desperate for a way out of a life of hardship and welfare. I needed a path to a better career that would allow me to provide for my wife and two sons, and this pathway needed to fit my incredibly tight financial constraints. Enrolling at Weber State University was the first step on this path, but nothing about my education came easy. During my first two years at college, I took all of my courses at night so I could continue working a full-time job during the day. Even so, my meager wages cooking bagels and selling electronics were never enough to cover all of my family’s needs. Each month, without fail, we had to pick one bill not to pay. My wife sold as much blood plasma as she was legally allowed. I borrowed diapers from neighbors. We stole toilet paper from public restrooms.

My story is not so different from countless other families who endure hardship so that they, or their children, can graduate from college. The calculation behind these sacrifices is both rational and practical: we believe, correctly, that higher education is the single most important gateway to opportunity in our society. We are willing to do just about anything to obtain a diploma because we expect that it will give us or our children the best possible chance for a good job, a good income, a good neighborhood, a good life.

For anyone who looks at the value of a college degree in these pragmatic terms—and you can count me as one of them—the implicit purpose of higher education is to prepare students for their self-chosen careers at an affordable price. Perhaps you think higher education should have other goals, too, like promoting critical thinking, instilling an appreciation for the arts, or simply exposing students to new ideas. I agree that there are several other worthwhile goals that have a place in the mission statement—but I believe they must all be secondary to the primary goal of career preparation. In college, I learned critical thinking and social values and a lot of other wonderful things that made me a better person. But at the end of all those difficult years, if I had not obtained a good job that suited me, I would have considered the experience a failure.

If we agree with this practical goal for higher education, we cannot help but conclude that our current system is falling short.1 Too many graduates cannot find a job in their field (31 percent, in one recent study by CareerBuilder);2 too many employers cannot fill good-paying jobs (35 percent, according to the Manpower Group);3 and too many employers report that the graduates they do hire are not equipped for their jobs.4 And I doubt I have to try very hard to convince you that costs are out of control, but here is one telling fact: the cost of a college degree has risen 538 percent since 1985.5 To put that into perspective, during that same period, medical costs rose by 286 percent.6 Americans now have 1.1 trillion dollars in student loan debt,7 more than all American credit card debt combined. I still owe a sizeable amount of money (enough to buy a very nice house in many parts of the U.S.) in student loans, a debt that hovers like a storm cloud over my financial future.

It is easy to imagine that it is the universities’ fault we are all in this situation. It’s not—or at least, no more than capitalism is at fault for some companies treating workers like statistics.

Like so much of the business world, the educational model of our system of higher education (and, just as important, its business model) is based on Taylorism.8 Our contemporary universities are caretakers of an averagarian system they inherited that enforces the conviction that the system is more important than the individual and compels the standardization of all educational processes. The shortcomings of our system—its costs and, most important, the gap between what graduates learn and their ability to get a job—are due to a deeply entrenched averagarian architecture that was established long ago.

THE SAME, ONLY BETTER

Regardless of what colleges and universities believe their mission is today—whether it’s to encourage problem solving and critical thinking or challenge students’ viewpoints or some other worthy humanistic goal—our existing system of higher education was designed a century ago very explicitly to sort students by ranking them based on their performance in a standardized curriculum. High school students with the best grades and test scores go to the best colleges, and then the college students with the best grades get the best jobs, as well as admission into the best professional schools. The system is the educational equivalent of the Norma look-alike competition, since its relentless focus on one-dimensional rankings compels every student to do exactly the same things the average student does. Be the same as everyone else, only better.

Even before they enter college, the system pressures students to conform: if they want to get admitted to a good college, students need to take the same classes and tests, and pursue the same extracurricular activities, as everyone else—but do better than everyone else. Once in college, students have to take the same courses as everyone in their major, in the same amount of time, to be ranked against the average, and to earn at the end of four years an undifferentiated diploma—all at a huge financial cost to them and their parents.

Judy Muir is a college admissions consultant based in Houston, and she understands this problem of conformity better than anyone.9 She has dedicated her life to helping high school students get into college and succeed there, and for my money she’s the best at what she does. She consults for the children of celebrities, presidents, and wealthy Europeans and Middle Easterners, though most of her clients are middle-class teens. She also does more than her share of pro bono consultations for underprivileged youngsters. Muir helps parents and teenagers make sense of the complex and daunting process of applying to college. But if you sit down with Muir, it doesn’t take long before she vents her abiding frustration.

“The process is set up to ignore everything about the individuality of the student; it’s all about average, average, average, select, select, select, leading teens to sublimate their own identity in the pursuit of the façade they think admissions officers want,” Judy told me. “This is what the system has done to people, this runaway system that compares everyone against an average. Kids try to doctor their essay, they take internships they don’t believe in. Overseas they cheat on their SATs. One of the most common questions I get is how many hours of community service do I need to do to get into this or that college. What I always tell them is that the only path to a life of excellence is by understanding and developing your own unique individuality. Instead, too many parents and kids focus on hiding their individuality instead of developing it, all because they are trying to stand out on the exact same things that everyone else is trying to stand out on.”10

Bill Fitzsimmons, the Dean of Admissions and Financial Aid at Harvard, agrees, telling me, “Getting into college is usually a game of averages, except people are mortgaging their homes to play the game of averages. You’re trading in your uniqueness to be like everyone else, in the hope that you can be a little bit better at the thing that everybody else is also trying to be. But if you’re just playing the averages, then, on average, it won’t work.”11

So why are we all so willing to continue to play the game of averages when we know how flawed one-dimensional rankings of talent actually are? There is no scientific evidence that a sixteen-year-old’s performance on a standardized test, or how many churches a seventeen-year-old helped build in Costa Rica, is meaningfully connected with becoming a Supreme Court justice or founding a successful start-up or discovering a cure for cancer. But as long as everyone else is playing the game of averages—and as long as universities and employers continue to play the game—there is a real cost for any student who chooses not to play.

So at every turn, students and their families make all kinds of sacrifices, taking on a staggering amount of debt, doing their best to conform themselves to a narrow and ruthless system based on a nineteenth-century notion of ranking—to receive a diploma that is no longer even a reliable guarantor of a job. The promise of our averagarian system of higher education keeps going down, while the costs imposed by the system keep going up.

If the architecture of higher education is based upon the false premise that students can be sorted by their rank—that a standardized, institution-centered system is necessary in order to efficiently separate the talented students from the untalented ones—then no matter how great the triumphs this system might produce, it is still guaranteed to produce some failures that we simply cannot tolerate as a society. Addressing these failures will require more than doubling down on the status quo: it will require committing to valuing the individual over the system, and changing the basic architecture of higher education so that the individual student truly comes first.

This might seem like an idea that sounds good in theory but is impossible to implement in practice. It turns out, however, that the path to an individualized system of higher education, while not simple or easy, is reasonably straightforward and practical, and it is already being followed with great success in colleges and universities around the world.

To transform the averagarian architecture of our existing system into a system that values the individual student requires that we adopt these three key concepts:

✵Grant credentials, not diplomas

✵Replace grades with competency

✵Let students determine their educational pathways

These concepts offer a blueprint for establishing an educational system that is consistent with the principles of individuality, and that will help all students choose and get trained for a career.

GRANT CREDENTIALS, NOT DIPLOMAS

Our current system of undergraduate education is standardized around one defining educational element—the four-year degree or diploma. For centuries the diploma and all the traditions surrounding its attainment—graduation ceremony, caps and gowns—have signaled to the community a student’s achievement of a milestone, an educational rite of passage.

The problem is that the requirements for a bachelor’s degree are, to a large extent, arbitrary: no matter what subject you might pursue in college, the degree almost always requires the same four years. Whether you major in German literature or business administration or molecular biology—in each case, a bachelor’s degree takes nearly the same total number of credit hours (what is known in the education field as “seat time”) stretched over the same billable number of semesters.12 It doesn’t matter how difficult your chosen subject is, how fast or slow you learn, whether you go to a small private college or a sprawling public university, or whether you have mastered the necessary skills for your intended career: as long as you log the necessary hours of seat time (and do not fail a class), you will get a diploma. This, advocates of the four-year degree argue, results in a kind of “equality” of the degree across different fields.

Using the diploma as the basic unit of education introduces some obvious shortcomings into the system. If you finish all four years of seat time for a bachelor’s degree in mechanical engineering and pass all of your courses—except for a single class in the humanities—you won’t get a diploma. (You would still have to pay for four years of tuition, though.) It does not matter how well prepared you might be for a job as a mechanical engineer: if you do not complete every requirement set by the university, you don’t get a diploma. Conversely, you could fulfill all the requirements for a computer science degree from an Ivy League college—and still not be equipped for a job as a computer programmer.13

There is a logical alternative to diplomas as the basic unit of educational achievement: credentials.14 Credentialing is an approach to education that emphasizes awarding credit for the smallest meaningful unit of learning. For example, you might earn a credential for Java programming for websites, the history of World War I, pastry baking, or the climatology of Asia. Some credentials can be obtained after a few classes or even one class, whereas some may take a year or longer. Credentialing offers a more flexible and finer-grained level of certification of your skills, abilities, and knowledge.

Credentials can be combined (“stacked”) to create more advanced credentials. For example, let’s say you want to become a video game designer. Instead of pursuing a bachelor’s degree in computer science, you might get credentials in programming theory, mobile device programming, computer animation, and graphic design. The completion of all four of these credentials might qualify you for a combined “mobile device-based video game design” credential. Similarly, if you want to be an astrophysicist who studies dark matter, you would proceed through a wide range of credentials in math, physics, astronomy, and research methods that would ultimately qualify you for a “dark matter astrophysics” credential.

While the idea of credentials may seem a bit radical, the reality is that credentialing has been an important part of skill-based education for a long time. For example, MIT already offers several credentialing programs (they call them “certificates”), including credentials in areas like supply chain management, managing complex technical projects, and big data (to name just a few).15

Virginia, meanwhile, has a large-scale state-sponsored program offering credentials in several industries, including information technology, cybersecurity, advanced manufacturing, energy, and health care.16 The jobs that credentialed graduates obtain in these industries pay well and offer long-term career opportunities. The program requires approximately two to three weeks of full-time training in a simulated work environment and costs a total of $250 for each credential (the remaining costs are shouldered by industry partners, who get employees trained in the skills they need). So far, 93 percent of credentialed graduates from the program have obtained jobs. According to Governor Terry McAuliffe, the program has a goal of delivering nearly half a million credentials by 2030.17

There’s nothing special about the particular fields targeted in the Virginia credentialing initiative—they were merely those fields with a known shortage of qualified job candidates—and there is no reason that credentialing cannot be extended to include everything taught in higher education, from French drama to quantum physics to cinematography.

Another recent educational development promises to make credentialing even more viable. Massive Open Online Courses, commonly referred to as MOOCs, are online courses offered by universities that do not require students to first be admitted into the university in order to enroll. Over the past decade, hundreds of universities have begun to offer MOOCs on every topic from Asian art to zoology. Much of the focus on MOOCs has been on their capacity to deliver online learning experiences at a discount or even for free. But I think the most innovative aspect of MOOCs is not their low cost or the fact that they are online, but rather the fact that many leading MOOC providers, including Harvard and MIT, have begun to offer credentials (such as certificates) for students who complete these courses.18

MOOCs point the way to what a fully developed individualized credentialing system might look like: no more undergraduate programs where you are compelled to pay exorbitant tuition to a single university for four years in order to earn the necessary seat-hours for a standardized degree. Instead, you pursue as many credentials as you need, at the cost you want, on your own terms, in order to pursue the career of your choice.

REPLACE GRADES WITH COMPETENCY

The second element of our averagarian system of higher education that must be changed is its basic method of evaluating performance: grades. Grades serve as a one-dimensional ranking of ability—they supposedly represent how well we’ve mastered a subject and thus measure our ability within that field. They also serve as a marker of a student’s progress along the standardized, fixed-pace pathway to a diploma.

There are two related problems with relying on grades for measuring performance. The first, and most important, is that they are one-dimensional. The jaggedness principle, of course, tells us that any one-dimensional ranking cannot give an accurate picture of an individual’s true ability, skill, or talent—or, as psychologist Thomas R. Guskey wrote in Five Obstacles to Grading Reform, “If someone proposed combining measures of height, weight, diet, and exercise into a single number or mark to represent a person’s physical condition, we would consider it laughable. … Yet every day, teachers combine aspects of students’ achievement, attitude, responsibility, effort, and behavior into a single grade that’s recorded on a report card and no one questions it.”19

The other problem posed by grades is that they require employers to perform a complex interpretation of what a particular graduate’s diploma actually means. A transcript gives employers very little direct knowledge of a student’s skills, abilities, or mastery of a topic. All they have to go on is the rank of a university and the graduate’s GPA.

Fortunately, there is a straightforward solution to this problem: replace grades with a measure of competency. Instead of awarding grades for accumulating seat time in a course, completing all your homework on time, and acing your midterm, credentials would be given if, and only if, you demonstrate competency in the relevant skills, abilities, and knowledge needed for that particular credential. Although the nature of competency will differ from field to field, competency-based evaluation will have three essential features.

The first is rather obvious: it should be pass/incomplete—either you have demonstrated the competency or you have not. Second, competency evaluations must be institution-agnostic. This means you should be able to acquire the necessary competency for a credential in whatever way you like. You can still take a course—in most cases, this is probably the best option—but you will not get special credit just for completing the course, like you do right now under the current system. If you can acquire the competency online, on your own, or on the job, that’s great—you do not need to pay for a course.

The third feature of competency-based evaluations of performance is that they should be professionally aligned. Obviously, that means professional organizations, as well as employers who will be hiring individuals with the credentials, should have some input into determining what constitutes competency for a particular profession-related credential. Of course, I am not saying employers should be the only ones to decide—that would be incredibly shortsighted—but I am saying they should have a genuine seat at the table. This will help ensure a tight, flexible, and real-time match between what students learn and what they will need to succeed in their jobs.

Does the idea of an industry-aligned, competency-based approach to education seem far-fetched? It is already here. Consider, for example, Western Governors University.20 WGU is a nonprofit university that offers programs in business, information technology, health care, and teaching. Nineteen governors founded it in 1997 as an innovative strategy to better prepare students to work in particular high-need careers. The curriculum at WGU is entirely online, enabling students to move through material at their own pace. And though WGU grants degrees rather than credentials, students earn credit toward a degree by demonstrating competency, not by earning seat time in class. WGU also allows students to get credit for material they already know through competency exams without having to sit through an unnecessary course. Tuition at the school supports the notion of self-pacing: $6,000 covers as many courses as you can finish in two semesters.21

To ensure the industry-specific relevance of its programs, WGU has a two-step process for defining competency in a particular subject. The first is the “Program Councils”—panels of industry and academic experts who, together, define what a graduate in that area should know and be able to do to succeed at a job. The second is the “Assessment Councils,” which consist of national experts who work to create competency exams that assess whether students have mastered the necessary material. Most important, WGU relies on industry-accepted assessments whenever possible rather than inventing its own.22 Since WGU graduates have demonstrated competency in their field, they are attractive to employers.

WGU is not alone. More than two hundred schools are currently implementing or exploring competency-based forms of evaluating performance. There is even a consortium of universities working together to develop standards for scalable competency-based programs. Replacing grades with competency-based measures of performance will ensure that students can learn at their own pace and be judged according to their abilities.23

LET STUDENTS DETERMINE THEIR EDUCATIONAL PATHWAYS

Granting credentials instead of degrees and replacing grades with competency-based evaluations are necessary for higher education to support individuality, but they are not sufficient. Today, universities control almost every aspect of your educational pathway. First and foremost, the university decides whether or not to admit you into one of its diploma-granting programs. If you do get admitted, the university dictates the requirements you must fulfill to obtain a diploma—and, of course, how much you pay for the privilege. Just about the only aspects of your education you do have control over are which university to apply to and what to major in. We must cede more control to individual students by ensuring that our educational architecture supports self-determined pathways.

We can accomplish this by building on the competency-based credentialing foundation and focusing on two additional features of the higher education system. First, students should have more educational options to choose from than the ones offered by any single university. Second, the credentialing process should be independent of any particular institution, so that students have the ability to stack their credentials, no matter how or where they earned them.

In this system, students should be able to take a course anywhere: online or in a classroom, at an employer’s training center or a local university. You could take a huge online course with thousands of students from all over the world, or get a local tutor to instruct you one-on-one, face-to-face. You could take an evening course once a week for six months, or an immersive two-week crash course. You could seek out high-intensity instructors who drive their students hard, or teachers who prefer to gently guide their students without pressing them. You could get all your credentials through courses from one institution, or stack together credentials from a variety of institutions. Or, in many cases, you could simply learn the material on your own, at your own pace, for free. The choice is up to you. Select the credentials pathway that helps you master the relevant knowledge, skills, and abilities, according to your own jagged profile, if-then signatures, and budget.

Self-determined pathways benefit students in many ways. Say you start out pursuing one stacked credential—maybe you are going after a neuroscience credential to one day become a researcher. You obtain a neuroanatomy credential and a neural systems credential, but discover that you like helping and interacting with people too much to spend your career focused on the physiological minutiae that are part of the daily grind of a bench scientist. So you decide to switch career goals and pursue a clinical psychology credential instead. The relevant neuroscience credentials that you have already obtained can be restacked and applied toward the clinical psychology credential. Or, if you decide that discussing people’s problems does not quite suit you either, you could build on your existing credentials and restack them toward a career in marketing medical devices.

Right now, if you decided to switch majors in the middle of a traditional four-year neuroscience program, you would need to shell out additional tuition as you made up the classes you missed, attempt an extra-large course load to finish on time, or simply complete the neuroscience degree and then apply to clinical psychology graduate programs or business schools—investing four years in a subject you are not fond of on your way to more years and more tuition spent learning the subject you are really interested in.

With self-determined competency-based credentialing, there are fewer penalties for experimenting in order to discover what you are truly passionate about, and even fewer costs for switching horses midstream. In fact, if it is designed to support self-determination, the entire educational system should encourage you to constantly reassess what you like to do and what you might be good at, and give you a natural way to adjust your career plans as you go along, according to what you learn about yourself and according to the changing job marketplace.

One of the most common reactions I hear from people when they first learn about self-determined educational pathways is, “So you are telling me that we expect college students to make their own decisions? Have you met today’s college kids?” Though I won’t disagree that a nineteen-year-old is more apt to make a foolish mistake than a forty-year-old, I am also skeptical of any system that tells us we cannot trust people to make decisions for themselves. Indeed, the notion that we should take away individuals’ abilities to make decisions and allow the system to decide is quintessential Taylorism—the kind of thinking that got us into trouble in the first place.

That is the choice we are presented with: Do we want a system of higher education that compels each student to be like everyone else, only better? Or do we want a system that empowers each student to make her own choices?

EDUCATION IN THE AGE OF INDIVIDUALS

These three concepts—granting credentials, not diplomas; replacing grades with competency; and letting students determine their educational pathways—can help transform higher education from a system modeled after Taylorist factories that values top-down hierarchy and standardization, to a dynamic ecosystem where each student can pursue the education that suits her or him best.

A self-determined, competency-based credentialing system is also more closely aligned with the principles of individuality. It fulfills the jaggedness principle, since it allows students to figure out what they like, what they are good at, and what is the best way to pursue these interests. It fulfills the context principle by evaluating students’ competency in a context as close as possible to the professional environment where they will actually perform. And it fulfills the pathways principle by allowing each student to learn at their own pace, and follow a sequence that is right for them.

Perhaps more important, adopting these concepts would help solve the conformity problem: instead of trying to be like everyone else, only better, students will strive to be the very best version of themselves. Instead of playing the game of averages to get into a high-ranked university, you strive for professional excellence. Instead of competing with other students to be the best possible University Applicant, you compete with other students to be the best possible hire for an architectural firm, or an anthropology research laboratory, or a children’s fashion designer. In this system, you get to be exactly who you are, not who the system tells you to be.

In addition, a self-determined, competency-based credentialing system would also put us on a path to solve the problem of endlessly rising educational costs. In an individualized system, you pay for exactly the credentials you want and need—and nothing more. Instead of one institution locking you into paying four years of tuition, different institutions will compete to offer you the best possible credential at the lowest possible price. Some institutions that adopt these elements might choose to emulate Western Governors University’s “all you can learn” approach, where you pay a fixed fee and get all the training you wish from the institution. Other institutions might follow the lead of Arizona State University, which partnered with edX to create a pioneering approach to online classes where first-year students pay only if they successfully complete them.24

An individualized educational system based on competency and credentialing also creates a much better match between students and employers, because the value and availability of credentials adjusts in real time according to the realities of the ever-changing job market. For example, if a new programming language starts to sweep through Silicon Valley, companies will quickly announce they are looking for individuals who are credentialed in the new language. Similarly, if the automobile industry switches away from an old engine style, there will be immediate pressure to scale back the engineering credentials built around the outmoded technology. This provides students with tremendous flexibility to adjust their own pathways to take advantage of the changing market. Any student, at any moment, can see which credentials are valued by the companies they like, in geographic regions where they want to work, in industries where they want a career. They can compare the costs, pathways, and difficulty of pursuing credentials and balance these against the potential salary and personal fit of various jobs.

At the same time, businesses and organizations can be assured of job applicants who have the skills and knowledge necessary for the job, because they can specify any combination of credentials that are needed for a particular job, no matter how demanding or complex, and because they have input into the competencies required for any given set of credentials. Employers can also directly influence the pool of available employees, since they can offer to pay candidates to obtain a rare or unfamiliar credential, or even a new set of credentials.

It might seem like I’m saying that universities are the problem, or that universities are done for. No, I love universities. They provided me with the opportunity to attain a better life, and today one even pays part of my salary. Universities are essential for a vibrant, healthy democracy and a thriving economy. But the present architecture of our higher education system is based on a false premise: that we need a standardized system to efficiently separate the talented from the untalented. No matter how great the triumphs this present system might produce, its architecture is still guaranteed to produce some intolerable failures—so we must strive to change it.

Universities need to start asking tough questions about their education model. But if we really want to revolutionize the higher education system and move toward this new approach to education, then we need the help of the business world. Universities are unlikely to change unless employers demand something different. As long as employers continue to demand diplomas and degrees, there’s little incentive for universities to change their system. This revolution in individualized education will only come once employers recognize how they will benefit from it and start to hire employees based on credentials, rather than diplomas, and based on employees’ demonstrated competency rather than on grades.

An individualized approach to higher education is not easy, but it is possible. It’s already happening in colleges and universities around the world. And it will benefit everyone—students, employers, even universities themselves. It all starts with one decision—to value the individual.