One Man's America: The Pleasures and Provocations of Our Singular Nation - George F. Will (2008)

Chapter 5. LEARNING


National Amnesia and Planting Cut Flowers

The 9/11 summons to seriousness ended the nation’s 1990s holiday from history, and even the National Endowment for the Humanities has enlisted in the war. Emphasizing that historical illiteracy threatens homeland security—people cannot defend what they cannot define—the NEH’s chairman, Bruce Cole, is repairing the ravages of the 1990s, when his two immediate predecessors made the NEH frivolous.

The first was Sheldon Hackney, a former college president whose big idea was a “national conversation” about diversity, his peculiar theory being that there is insufficient talk about that subject. His NEH distributed instructional kits telling Americans how to converse.

The second NEH chairman of the Clinton administration was William Ferris, a Mississippi folklorist guided by today’s hedonistic calculus—the greatest self-esteem for the greatest number. His NEH worked to “celebrate”—preferred verb of the warriors against intellectual elitism—the quotidian. He said: “Today the lives of ordinary American people have assumed a place beside volumes of European classics in the humanities.” So Middlemarch and the “life” of your mailman are equally humanities classics.

Back then, the chairman of the National Endowment for the Arts was another folklorist, Bill Ivey, whose definition of art was latitudinarian. It included “the expressive behavior of ordinary people,” such as “piecrust designs” and “dinner-table arrangements.” He talked like this: “Do we want to possess a confidence that the rich cultural matrix of our nation is appropriately auditioned for the world?”

That was then. This is now:

Cole, author of fourteen books, many on the Renaissance, was for twenty-nine years a professor of fine arts, art history, and comparative literature at Indiana University. One of his missions is to reverse America’s deepening amnesia, and especially the historical illiteracy of college students.

Fresh evidence of the latter came last week from the National Association of Scholars, whose members defend academic standards against the depredations of those levelers who, rigorous only in applying the hedonistic calculus (see above), are draining rigor from curricula. A survey sponsored by the NAS, using questions on general cultural knowledge originally asked by the Gallup Organization in 1955, establishes that today’s college seniors score little—if any—higher than 1955 high school graduates.

In his office in the Old Post Office on Pennsylvania Avenue, Cole says of the war on terrorism, “What we fight for is part of what we do around here.” What the NEH aims to help do is “make good citizens.” And “scholarship should be the basis of all we do.” Not all scholars are professors. David McCullough, the historian and biographer, is not an academic. But, then, neither were Thucydides and Gibbon.

“A wise historian,” says McCullough, “has said that to try to plan for the future without a sense of the past is like trying to plant cut flowers.” Hence the three components of the NEH’s “We the People” initiative.

One is the funding of scholarship on significant events and themes that enhance understanding of America’s animating principles. Another is an essay contest for high school juniors, concerning America’s defining tenets. The contest winner—Cole dryly notes that there will be a winner, whatever the cost to anyone’s self-esteem—will be recognized at the third component of the “We the People” initiative, the annual “Heroes of History” lecture.

The very subject of this new lecture series goes against the grain of today’s academic culture, which rejects the idea of heroes—those rare event-making individuals who are better and more important than most people. To banish elites from the human story, many academic historians tell that story as one of vast impersonal forces, in which individuals are in the iron grip of economic, racial, or gender roles. Small wonder students turn away from history taught without the drama of autonomous individuals moved by reason, conviction, and rhetoric that appeals to the better angels of their natures.

Cole soon will have a worthy colleague at the National Endowment for the Arts. The fact that its new chairman will be Dana Gioia, the distinguished poet, critic, and translator, is additional evidence that cultural revival is a priority of today’s president, who has so many despisers among the lettered.

George W. Bush is married to a librarian and his vice president is married to a former NEH chairman. Bush may have passed through Yale largely unscathed by what his professors were professing (which is just as well, considering campus conditions in the 1960s). However, he is restoring both endowments to their proper functions, defending culture as the poet Allen Tate defined it—“the study of perfection, and the constant effort to achieve it.”

[DECEMBER 26, 2002]

A Sensory Blitzkrieg of Surfaces

The first modern celebrity—the first person who, although not conspicuous in church or state, still made his work and life fascinating to a broad public—may have been Charles Dickens. Novelist Jane Smiley so argues in her slender life of Dickens, and her point is particularly interesting in light of Reading at Risk, the National Endowment for the Arts’ report on the decline of reading.

A survey of 17,135 persons reveals an accelerating decline in the reading of literature, especially among the young. Literary reading declined 5 percent between 1982 and 1992, then 14 percent in the next decade. Only 56.9 percent of Americans say they read a book of any sort in the past year, down from 60.9 percent in 1992. Only 46.7 percent of adults read any literature for pleasure.

The good news is that “literature,” as the survey defined it, excludes serious history, for which there is a sizable audience. The bad news is that any fiction counts as literature, and most fiction, like most of most things, is mediocre. But even allowing for the survey’s methodological problems, the declining importance of reading in the menu of modern recreations is unsurprising and unsettling.

Dickens, a volcano of words, provided mass entertainment before modern technologies—electricity, film, broadcasting—made mass communication easy. His serialized novels seized the attention of the British public. And America’s: Ships arriving from England with the latest installment of Dickens’s 1840 novel The Old Curiosity Shop reportedly were greeted by American dockworkers shouting, “Did Little Nell die?”

When journalists in 1910 asked an aide to Teddy Roosevelt whether TR might run for president in 1912, the aide replied, “Barkis is willin’,” and he expected most journalists, and their readers, to recognize the reference to the wagon driver in David Copperfield who was more than merely willin’ to marry Clara Peggotty, David’s childhood nurse.

Exposure to David Copperfield used to be a common facet of reaching adulthood in America. But today, young adults eighteen to thirty-four, once the most avid readers, are among the least. This surely has something to do with the depredations of higher education: Professors, lusting after tenure and prestige, teach that the great works of the Western canon, properly deconstructed, are not explorations of the human spirit but mere reflections of power relations and social pathologies.

By 1995—before the flood of video games and computer entertainments for adults—television swallowed 40 percent of Americans’ free time, up one-third since 1965. Today, electronic entertainments other than television fill 5.5 hours of the average child’s day.

There have been times when reading was regarded with suspicion. Some among the ancient Greeks regarded the rise of reading as cultural decline: They considered oral dialogue, which involves clarifying questions, more hospitable to truth. But the transition from an oral to a print culture has generally been a transition from a tribal society to a society of self-consciously separated individuals. In Europe, that transition alarmed ruling elites, who thought the “crisis of literacy” was that there was too much literacy: readers had, inconveniently, minds of their own. Reading is inherently private, hence the reader is beyond state supervision or crowd psychology.

Which suggests why there are perils in the transition from a print to an electronic culture. Time was, books were the primary means of knowing things. Now most people learn most things visually, from the graphic presentation of immediately, effortlessly accessible pictures.

People grow accustomed to the narcotic effect of their own passive reception of today’s sensory blitzkrieg of surfaces. They recoil from the more demanding nature of active engagement with the nuances encoded in the limitless permutations of twenty-six signs on pages. Besides, reading requires two things that are increasingly scarce and to which increasing numbers of Americans seem allergic—solitude and silence.

In 1940, a British officer on Dunkirk beach sent London a three-word message: “But if not.” It was instantly recognized as from the book of Daniel. When Shadrach, Meshach, and Abednego are commanded to worship a golden image or perish, they defiantly reply: “Our God whom we serve is able to deliver us from the burning fiery furnace, and He will deliver us out of thine hand, O king. But if not, be it known unto thee, O king, that we will not serve thy gods….”

Britain then still had the cohesion of a common culture of shared reading. That cohesion enabled Britain to stay the hand of Hitler, a fact pertinent to today’s new age of barbarism.

[JULY 23, 2004]

“Philosophy Teaching by Examples”

When Yale awarded President Kennedy an honorary degree, he said he had the ideal combination—a Yale degree and a Harvard education. Today, he might rethink that, given the Harvard faculty’s tantrum that caused President Lawrence Summers’s cringing crawl away from his suggestion of possible gender differences of cognition. At least the phrase “Yale education” does not yet seem, as “Harvard education” does, oxymoronic.

And will not while Donald Kagan adorns Yale’s campus, where he is professor of history and classics. Last week in Washington, he delivered the thirty-fourth Jefferson Lecture in the Humanities, “In Defense of History,” which was agreeably subversive of several sides in today’s culture wars.

“The world we live in,” he said, “is a difficult place to try to make a case for the value of history.” The essence of the historian’s craft—the search for truth by painstaking research unassisted by revelation or other recourse to supernatural explanations—is embattled from two directions.

Some intellectuals today are know-nothings—literally and proudly. They argue that objectivity is a chimera, that we cannot confidently know anything of truth or virtue. These postmodernists argue, Kagan says, that all studies, including history, are “literature” because all knowledge is guesswork colored by politics—self-interest, ideology, power relationships, etc.

Those who most confidently dispute this idea derive their confidence from religious faith, arguing that only through religion can individuals, or societies, know and steadily adhere to valid standards of right and wrong. This translates into a religious litmus test in politics—only devout individuals should be chosen to lead societies.

Historians, however, say to the postmodernists that the defining characteristics of postmodernism—skepticism and cynicism—have long histories. And the historians’ riposte to those who say that religion is the only foundation for knowledge or virtue is, Kagan says, to insist that in the study of history, knowledge, far from impossible, is cumulative.

Herodotus, whom Kagan calls the first true historian, said he wrote to preserve the memory of “great and marvelous deeds” and the reasons they were done. All writers of history have “the responsibility of preserving the great, important and instructive actions of human beings,” says Kagan, which is why history is “the queen of the humanities,” ranking above literature and even philosophy.

Philosophy, he says, is valuable for untangling sloppy thinking. But philosophy, like religion, leads to investigations of first principles and ultimate reality, and such investigations have produced profound disagreement. Hence the primacy of history as the study from which, especially today, we can take our moral bearings:

“Religion and the traditions based on it were once the chief sources for moral confidence and strength. Their influence has faded in the modern world, but the need for a sound base for moral judgments has not. If we cannot look simply to moral guidance firmly founded on religious precepts, it is natural and reasonable to turn to history, the record of human experience, as a necessary supplement if not a substitute.”

Kagan’s idea is not novel. Nearly three centuries ago, Lord Bolingbroke said that “history is philosophy teaching by examples.” However, at this American moment of mutual incomprehension and even contempt between theists and their postmodernist despisers, it is “transgressive”—to purloin a bit of the postmodernists’ jargon—for Kagan to insist that there is a firm middle, or perhaps higher, ground for moral confidence.

This ground, he says, is occupied by neither those who say that only theological reasoning leads to certainty nor those who say that no reasoning does. It is held by those who study and write history.

The late Daniel Boorstin, historian and librarian of Congress, said that “trying to plan for the future without knowing the past is like trying to plant cut flowers.” His point was that knowledge of history is conducive to practicality. Kagan’s point is that history, properly studied, is conducive to virtues, of which practicality is one.

Moderation is the virtue of which hubris is the opposite—and often ruinous—political vice. Historian David McCullough says the study of history is “an antidote to the hubris of the present—the idea that everything we have and everything we do and everything we think is the ultimate, the best.” Compare, for example, the heroic construction of the Panama Canal and the debacle of Boston’s “Big Dig” one hundred years later.

Near the Big Dig sits today’s Harvard, another refutation of the theory of mankind’s inevitable, steady ascent. From Yale, however, comes Kagan’s temperate affirmation of the cumulative knowledge that comes from the study of history.

[MAY 19, 2005]

Fascinating Contingencies

When George Washington, in a spiffy uniform of buff and blue and sitting his horse with a grace uncommon even among Virginians vain about their horsemanship, arrived outside Boston in July 1775 to assume command of the American rebellion, he was aghast. When he got a gander at his troops, mostly New Englanders, his reaction was akin to the Duke of Wellington’s assessment of his troops, many of them the sweepings of Britain’s slums, during the Peninsular War: “I don’t know what effect these men will have upon the enemy, but, by God, they terrify me.”

You think today’s red state–blue state antagonism is unprecedented? Washington thought New Englanders “exceeding dirty and nasty.” He would not have disputed the British general John Burgoyne’s description of the Americans besieging Boston as “a rabble in arms.” A rabble that consumed, by one sober estimate, a bottle of rum per man each day.

If, in the autumn of 1775, a council of Washington’s officers had not restrained him from a highly risky amphibious attack on Boston across the shallow Back Bay, there might never have been a Declaration of Independence. If a young officer, Henry Knox, had not had the ingenuity to conceive, and the tenacity to execute, a plan for dragging captured mortars, some weighing a ton, and cannon, some weighing two and a half tons, the three hundred miles from Fort Ticonderoga on Lake Champlain to the Dorchester Heights overlooking Boston, the British might have fought, and perhaps won, rather than evacuating the city. If after the disastrous Battle of Brooklyn, the first great battle of the war, a fog had not allowed nine thousand of Washington’s soldiers to escape across the East River, the war might have effectively ended less than two months after the Declaration.

So says David McCullough in his new book, 1776, a birthday card to his country on this Independence Day. “Ingratitude,” he has said elsewhere, “is a shabby failing,” and he writes to inspire gratitude for what a few good men, and one great one, did in the nation’s Year One.

What British historian George Otto Trevelyan said of the December 1776 Battle of Trenton, which may have saved the Revolution, could be said of all the events—defeats redeemed by skillful retreats, and a few victories—of that year: “It may be doubted whether so small a number of men ever employed so short a space of time with greater and more lasting effects upon the history of the world.”

What is history? The study of it—and the making of it, meaning politics—changed for the worse when, in the nineteenth century, history became History. When, that is, history stopped being the record of fascinating contingencies—political, intellectual, social, economic—that produced the present, and became instead a realm of necessity. The idea that History is a proper noun, denoting an autonomous process unfolding a predetermined future in accordance with laws mankind cannot amend, is called historicism. That doctrine discounts human agency, reducing even large historical figures to playthings of vast impersonal forces. McCullough knows better.

Solid, unpretentious narrative history like 1776 satisfies the healthy human thirst for a ripping good story. McCullough says E. M. Forster, the novelist, efficiently defined a story: If you are told that the king died and then the queen died, that is a sequence of events. If you are told that the king died and then the queen died of grief, that is a story that elicits empathy.

Using narrative history to refute historicism, McCullough’s two themes in 1776 are that things could have turned out very differently, and that individuals of character can change the destinies of nations. There is a thirst for both themes in this country, which is in a less-than-festive frame of mind on this birthday. It is, therefore, serendipitous that 1776, with 1.35 million copies already in print, sits atop the New York Times bestseller list on Independence Day.

But, then, serendipity has often attended the Fourth of July. That day is the birthday of Nathaniel Hawthorne (1804), arguably the father of American literature. And of Stephen Foster (1826), arguably the father of American music. And—saving the most luminous for last—of the sainted Calvin Coolidge (1872), who oversaw a 45 percent increase in America’s production of ice cream.

So, this Fourth read McCullough. Perhaps by the light of a sparkler.

[JULY 3, 2005]

Ed Schools vs. Education

The surest, quickest way to add quality to primary and secondary education would be addition by subtraction: Close all the schools of education. Consider the Chronicle of Higher Education’s recent report concerning the schools that certify America’s teachers.

Many education schools discourage, even disqualify, prospective teachers who lack the correct “disposition,” meaning those who do not embrace today’s “progressive” political catechism. Karen Siegfried had a 3.75 grade-point average at the University of Alaska Fairbanks, but after voicing conservative views, she was told by her education professors that she lacked the “professional disposition” teachers need. She is now studying to be an aviation technician.

In 2002, the National Council for Accreditation of Teacher Education declared that a “professional disposition” is “guided by beliefs and attitudes related to values such as caring, fairness, honesty, responsibility, and social justice.” Regarding that last, the Chronicle reports that the University of Alabama’s College of Education proclaims itself “committed to preparing individuals to”—what? “Read, write, and reason”? No, “to promote social justice, to be change agents, and to recognize individual and institutionalized racism, sexism, homophobia, and classism,” and to “break silences” about those things and “develop anti-racist, anti-homophobic, anti-sexist community [sic] and alliances.”

Brooklyn College, where a professor of education required her class on Language Literacy in Secondary Education to watch Fahrenheit 9/11 before the 2004 election, says it educates teacher candidates about, among many other evils, “heterosexism.” The University of Alaska Fairbanks, fluent in today’s progressive patois, says that, given America’s “caste-like system,” teachers must be taught “how racial and cultural ‘others’ negotiate American school systems, and how they perform their identities.” Got it?

The permeation of education schools by politics is a consequence of the vacuity of their curricula. Concerning that, read “Why Johnny’s Teacher Can’t Teach,” by Heather Mac Donald of the Manhattan Institute (available at city-journal.org). Today’s teacher-education focus on “professional disposition” is just the latest permutation of what Mac Donald calls the education schools’ “immutable dogma,” which she calls “Anything But Knowledge.”

The dogma has been that primary and secondary education is about “self-actualization” or “finding one’s joy” or “social adjustment” or “multicultural sensitivity” or “minority empowerment.” But it is never about anything as banal as mere knowledge. It is about “constructing one’s own knowledge” and “contextualizing knowledge,” but never about knowledge of things like biology or history.

Mac Donald says “the central educational fallacy of our time,” which dates from the Progressive Era of the early twentieth century, is “that one can think without having anything to think about.” At City College of New York, a professor said that in her course Curriculum and Teaching in Elementary Education she would be “building a community, rich of talk” and “getting the students to develop the subtext of what they’re doing.” Although ed schools fancy themselves as surfers on the wave of the future, Mac Donald believes that teacher education “has been more unchanging than Miss Havisham. Like aging vestal virgins, today’s schools lovingly guard the ancient flame of progressivism”—an egalitarianism with two related tenets.

One, says Mac Donald, is that “to accord teachers any superior role in the classroom would be to acknowledge an elite hierarchy of knowledge, possessed by some but not all.” Hence, second, emphasis should be on group projects rather than individual accomplishments that are measured by tests that reveal persistent achievement gaps separating whites and Asians from other minorities.

Numerous inner-city charter and private schools are proving that the gaps can be narrowed, even closed, when rigorous pedagogy is practiced by teachers in teacher-centered classrooms where knowledge is regarded as everything. But most ed schools, celebrating “child-centered classrooms” that do not “suffocate discourses,” are enemies of rigor.

The steady drizzle of depressing data continues. A new assessment of adult literacy shows a sharp decline over the last decade, with only 31 percent of college graduates able to read and extrapolate from complex material. They were supposed to learn how to read before college, but perhaps their teachers were too busy proving their “professional dispositions” by “breaking silences” as “change agents.”

Fewer than half of U.S. eighth graders have math teachers who majored in math as undergraduates or graduate students or studied math for teacher certification. U.S. twelfth graders recently performed below the international average for twenty-one countries on tests of general knowledge of math and science. But perhaps U.S. pupils excel when asked to “perform their identities.”

[JANUARY 16, 2006]

This Just In from the Professors: Conservatism Is a Mental Illness

This just in: Conservatism often is symptomatic of a psychological syndrome. It can involve fear, aggression, uncertainty avoidance, intolerance of ambiguity, dogmatic dislike of equality, irrational nostalgia, and need for “cognitive closure,” all aspects of the authoritarian personality.

Actually, this theory has been floating around academic psychology for half a century. It is reprised in “Political Conservatism as Motivated Social Cognition,” written by four professors for Psychological Bulletin.

“Motivated social cognition” refers to the “motivational underpinnings” of ideas, the “situational as well as dispositional variables” that foster particular beliefs. Notice: situations and dispositions—not reasons. Professors have reasons for their beliefs. Other people, particularly conservatives, have social and psychological explanations for their beliefs. “Motivated cognition” involves ways of seeing and reasoning about the world that are unreasonable because they arise from emotional, psychological needs.

The professors note, “The practice of singling out political conservatives for special study began…[with a 1950] study of authoritarianism and the fascist potential in personality.” The industry of studying the sad psychology of conservatism is booming. It began with a European mixture of Marxism and Freudianism. It often involves a hash of unhistorical judgments, including the supposedly scientific, value-free judgment that conservatives are authoritarians, and that fascists—e.g., the socialist Mussolini, and Hitler, the National Socialist who wanted to conserve nothing—were conservatives.

The four professors now contribute “theories of epistemic and existential needs, and socio-political theories of ideology as individual and collective rationalizations” and “defensive motivations”—defenses against fear of uncertainty and resentment of equality. The professors have ideas; the rest of us have emanations of our psychological needs and neuroses.

“In the post-Freudian world, the ancient dichotomy between reason and passion is blurred,” say the professors, who do not say that their judgments arise from social situations or emotional needs rather than reason. The professors usefully survey the vast literature churned out by the legions of academics who have searched for the unsavory or pathological origins of conservatism (fear of death? harsh parenting? the “authoritarian personality”?).

But it is difficult to take the professors’ seriousness seriously when they say, in an essay responding to a critique of their paper, that Ronald Reagan’s “chief accomplishment, in effect, was to roll back both the New Deal and the 1960s.” His “accomplishment”? So that is why Social Security and Medicare disappeared.

The professors write, “One is justified in referring to Hitler, Mussolini, Reagan, and Limbaugh as right-wing conservatives…because they all preached a return to an idealized past and favored or condoned inequality in some form.” Until the professors give examples of political people who do not favor or condone inequality in any form, it is fair to conclude that, for all their pretensions to scientific rigor, they are remarkably imprecise. And they are very political people, who would be unlikely ever to begin a sentence: “One is justified in referring to Stalin, Mao, Franklin Roosevelt and the editors of the New York Times as left-wing liberals because…”

The professors acknowledge that “the same motives may underlie different beliefs.” And “different motives may underlie the same beliefs.” And “motivational and informational influences on belief formation are not incompatible.” And no reasoning occurs in a “motivational vacuum.” And “virtually all belief systems” are embraced because they “satisfy some psychological needs.” And all this “does not mean that conservatism is pathological or that conservative beliefs are necessarily false.”

Not necessarily. What a relief. But there is no comparable academic industry devoted to studying the psychological underpinnings of liberalism. Liberals, you see, embrace liberalism for an obvious and uncomplicated reason—liberalism is self-evidently true. But conservatives embrace conservatism for reasons that must be excavated from their inner turmoils, many of them pitiable or disreputable.

The professors’ paper is adorned with this epigraph:

“Conservatism is a demanding mistress and is giving me a migraine.”

—GEORGE F. WILL

A “mistress” who is “demanding”? Freud, call your office. The epigraph is from Bunts, a book of baseball essays, from an essay concerning what conservatives should think about the designated hitter. Will probably thought he was being lighthearted. Silly him. Actually, he was struggling with fear of ambiguity and the need for cognitive closure.

Conservatives, in the crippling grip of motivated social cognition, think they oppose the DH because it makes the game less interesting by reducing managers’ strategic choices. But they really oppose that innovation because mental rigidity makes them phobic about change and intolerant of the ambiguous status of the DH. And because Mussolini would have opposed the DH.

[AUGUST 10, 2003]

The Law of Group Polarization in Academia

Republicans Outnumbered in Academia, Studies Find

New York Times, November 18

Oh, well, if studies say so. The great secret is out: Liberals dominate campuses. Coming soon: “Moon Implicated in Tides, Studies Find.”

One study of one thousand professors finds that Democrats outnumber Republicans at least seven to one in the humanities and social sciences. That imbalance, more than double what it was three decades ago, is intensifying because younger professors are more uniformly liberal than the older cohort that is retiring.

Another study, of voter registration records, including those of professors in engineering and the hard sciences, found 9 Democrats for every Republican at Berkeley and Stanford. Among younger professors, there were 183 Democrats, 6 Republicans.

But we essentially knew this even before the American Enterprise magazine reported in 2002 on examinations of voting records in various college communities. Some findings about professors registered with the two major parties or with liberal or conservative minor parties:

Cornell: 166 liberals, 6 conservatives

Stanford: 151 liberals, 17 conservatives

Colorado: 116 liberals, 5 conservatives

UCLA: 141 liberals, 9 conservatives

The nonpartisan Center for Responsive Politics reports that in 2004, of the top five institutions in terms of employee per capita contributions to presidential candidates, the third, fourth, and fifth were Time Warner, Goldman Sachs, and Microsoft. The top two were the University of California system and Harvard, both of which gave about nineteen times more money to John Kerry than to George Bush.

But George Lakoff, a linguistics professor at Berkeley, denies that academic institutions are biased against conservatives. The disparity in hiring, he explains, occurs because conservatives are not as interested as liberals in academic careers. Why does he think liberals are like that? “Unlike conservatives, they believe in working for the public good and social justice.”

That clears that up.

A filtering process, from graduate school admissions through tenure decisions, tends to exclude conservatives from what Mark Bauerlein calls academia’s “sheltered habitat.” In a dazzling essay in the Chronicle of Higher Education, Bauerlein, professor of English at Emory University and director of research at the National Endowment for the Arts, notes that the “first protocol” of academic society is the “common assumption”—that, at professional gatherings, all the strangers in the room are liberals.

It is a reasonable assumption, given that in order to enter the profession, your work must be deemed, by the criteria of the prevailing culture, “relevant.” Bauerlein says various academic fields now have regnant premises that embed political orientations in their very definitions of scholarship:

“Schools of education, for instance, take constructivist theories of learning as definitive, excluding realists (in matters of knowledge) on principle, while the quasi-Marxist outlook of cultural studies rules out those who espouse capitalism. If you disapprove of affirmative action, forget pursuing a degree in African-American studies. If you think that the nuclear family proves the best unit of social well-being, stay away from women’s studies.”

This gives rise to what Bauerlein calls the “false consensus effect,” which occurs when, due to institutional provincialism, “people think that the collective opinion of their own group matches that of the larger population.” There also is what Cass Sunstein, professor of political science and jurisprudence at the University of Chicago, calls “the law of group polarization.” Bauerlein explains: “When like-minded people deliberate as an organized group, the general opinion shifts toward extreme versions of their common beliefs.” They become tone-deaf to the way they sound to others outside their closed circle of belief.

When John Kennedy brought to Washington such academics as Arthur Schlesinger Jr., John Kenneth Galbraith, McGeorge and William Bundy, and Walt Rostow, it was said that the Charles River was flowing into the Potomac. Actually, Richard Nixon’s administration had an even more distinguished academic cast—Henry Kissinger, Pat Moynihan, Arthur Burns, James Schlesinger, and others.

Academics, such as the next secretary of state, still decorate Washington, but academia is less listened to than it was. It has marginalized itself, partly by political shrillness and silliness that have something to do with the parochialism produced by what George Orwell called “smelly little orthodoxies.”

Many campuses are intellectual versions of one-party nations—except such nations usually have the merit, such as it is, of candor about their ideological monopolies. In contrast, American campuses have more insistently proclaimed their commitment to diversity as they have become more intellectually monochrome.

They do indeed cultivate diversity—in race, skin color, ethnicity, sexual preference. In everything but thought.

[NOVEMBER 28, 2004]

Antioch College’s Epitaph

During the campus convulsions of the late 1960s, when rebellion against any authority was considered obedience to every virtue, the film To Die in Madrid, a documentary about the Spanish civil war, was shown at a small liberal arts college famous for, and vain about, its dedication to all things progressive. When the film’s narrator intoned, “The rebels advanced on Madrid,” the students, who adored rebels and were innocent of information, cheered. Antioch College in Yellow Springs, Ohio, had been so busy turning undergraduates into vessels of liberalism and apostles of social improvement that it had not found time for the tiresome task of teaching them tedious facts, such as that the rebels in Spain were Franco’s fascists.

That illustrates why it is heartening that Antioch will close after the 2007–2008 academic year. Its board of trustees says the decision is to “suspend operations” and it talks dottily about reviving the institution in 2012. There is, however, a minuscule market for what Antioch sells for a tuition, room, and board of $35,221—repressive liberalism unleavened by learning.

Founded in 1852—its first president was Horace Mann—Antioch was, for a while, admirable. One of the first colleges to enroll women and blacks, it was a destination for escaped slaves. Its alumni include Stephen Jay Gould, Coretta Scott King, and Rod Serling, whose Twilight Zone never imagined anything weirder than what Antioch became when its liberalism curdled.

In 1972–1973, Antioch had 2,470 students. In 1973, a protracted and embittering student and employee strike left the campus physically decrepit and intellectually toxic. By 1985, enrollment was down 80 percent. This fall there may be 300 students served by a faculty of forty.

In 1993, Antioch became an international punch line when it wrote rules to ensure that all sexual conduct would be consensual, step by minute step: “If the level of sexual intimacy increases during an interaction…the people involved need to express their clear verbal consent before moving to that new level.” Does consent to a touch cover a caress? Is there consent regarding all the buttons?

Although laughable, Antioch was not funny. Former public radio correspondent Michael Goldfarb matriculated at what he calls the “sociological petri dish” in 1968. In his first week, he twice had guns drawn on him, once “in fun” and once by a couple of drunken ex-cons “whom one of my classmates, in the interest of breaking down class barriers, had invited to live with her.” A true Antiochian still, Goldfarb says: “I do think I was made stronger for having to deal with these experiences.”

Steven Lawry—Antioch’s fifth president in thirteen years—came to the college eighteen months ago. He told Scott Carlson of the Chronicle of Higher Education about a student who left after being assaulted because he wore Nike shoes, symbols of globalization. Another left because, she told Lawry, the political climate was suffocating: “They all think they are so different, but they are just a bunch of conformists.”

Carlson reports that Lawry stopped the student newspaper’s practice of printing “announcements containing anonymous, menacing threats against other students for their political views.” Antioch likes to dabble in menace: It invited Mumia Abu-Jamal to deliver its 2000 commencement speech, which he recorded on death row in a Pennsylvania prison, where he lives because twenty-six years ago he shot a Philadelphia police officer first in the back, then three times in the face. Antioch’s invitation was its way of saying…what?

In an essay in the Chronicle, Cary Nelson, Antioch class of 1967 and now a professor of English at the University of Illinois, waxes nostalgic about the fun he had spending, as Antioch students did, much time away from campus, receiving academic credits. What Nelson calls “my employee resistance to injustice” got him “released from almost every job I had until I became a faculty member.” But “my little expenditure was never noticed” when “I used some of Lyndon Johnson’s anti-poverty money” to bus anti-Vietnam war protesters from Harlem to Washington.

Given that such was Antioch’s idea of “work experience” in the “real world,” it is unsurprising that the college never produced an alumni cohort capable of enlarging the college’s risible $36 million endowment. Besides, the college seems always to have considered raising money beneath its dignity, given its nobility.

“Ben & Jerry could have named a new flavor for us,” says John Feinberg, class of 1970 and president of the alumni board, with a melancholy sense of unfulfilled destiny. His lament for a forfeited glory is a suitable epitaph for Antioch.

[JULY 15, 2007]

A Scholar’s Malfeasance Gunned Down

In a large event, much commented on, the Justice Department last week told the Supreme Court that the Second Amendment (“A well regulated Militia, being necessary to the security of a free State, the right of the people to keep and bear Arms, shall not be infringed”) “broadly protects the rights of individuals,” not just the right of states to organize militias.

This event was pertinent to a small event two weeks earlier, noticed by almost no one. The National Endowment for the Humanities demanded a review of “the serious charges that have been made against Michael Bellesiles’ scholarship,” which the NEH helped to finance.

He is the Emory University historian whose 2000 book Arming America: The Origins of a National Gun Culture earned—well, received—critical acclaim, including the Bancroft Prize, the most distinguished prize in American history. But now, slowly but relentlessly, some responsible intellectuals are defending standards of scholarship.

Bellesiles’s thesis is startling. It is that guns were not widely owned, or reliable enough to be important, at the time the Second Amendment was written. The implication is that the amendment should be read to protect only the collective rights of states, not the rights of individuals. The book pleased partisans of a cause popular in the liberal political culture of academia—gun control. Reviews were rapturous: “exhaustive research,” “intellectual rigor,” “inescapable policy implications,” “the NRA’s worst nightmare.”

Not exactly.

What has become Bellesiles’s nightmare began when a historian, suspecting nothing and hoping to build upon Bellesiles’s data, asked for more details about the eighteenth- and early-nineteenth-century probate records that Bellesiles says show that guns were infrequently listed among the estates of deceased people. Bellesiles also purported to find that many of the guns that were listed were in disrepair.

When Bellesiles’s evasive response led to more tugging on the threads of his argument, it unraveled. The unraveling revealed a pattern of gross misstatements of facts and unfounded conclusions. His errors are so consistently convenient for his thesis, it is difficult to believe that the explanation is mere sloppiness or incompetence. It looks like fraud.

Responding to critics, he said some of his crucial research notes had been destroyed in a flood in his office. He said he had relied on microfilm records in the federal archive in East Point, Georgia. But it has no such records. He said he had examined probate records in thirty places around the country, such as the San Francisco Superior Court. But those records were destroyed in the 1906 earthquake.

Well, then, he said he had seen them in the Contra Costa County Historical Society. But the society has no such records, and no record of Bellesiles’s visiting the society. Then he said he did the research somewhere else, but is not sure where. Researchers have found that he consistently misrepresents extant records in Providence, Rhode Island, and Vermont. When he tried to buttress his case by posting evidence on his website, critics found grave errors there, too, purporting to support his thesis. He blamed the errors on a hacker. A hacker who attacked his website on his behalf?

This spring, the William and Mary Quarterly, the preeminent journal of early American history, undertook to referee this rumble, publishing essays from historians on aspects of Bellesiles’s argument, and a response from him. The criticism is lacerating (“nonsense,” “boggles the mind,” “he regularly uses evidence in a partial or imprecise way,” “gives the impression that he has shaped his figures to suit his argument,” “every tally of homicides Bellesiles reports is either misleading or wrong”). Bellesiles’s limping response (which begins with a jest about a French mutineer saying to his firing squad, “I am honored by all this attention”) tries to change the subject, to the meaning of “culture.”

This academic scandal is still several chapters from a satisfactory resolution. Emory, having taken the unusual step of directing Bellesiles to respond to his critics, now is conducting its own investigation, which must weigh the patent inadequacies of his responses so far, and the proper penalty for what has already been proved. Furthermore, do those responsible for awarding him the Bancroft Prize believe the award should be revoked?

Bellesiles’s malfeasance, although startling in its sweep, brazenness, and apparently political purpose, actually reveals something heartening—a considerable strength in America’s scholarly community. Its critical apparatus is working. Scholars and their journals are doing their duty, which is to hold works of scholarship up to the bright light of high standards.

As a result, when next the Supreme Court is required to rule on the controversy concerning which Bellesiles’s book was supposed to be so decisively informative, the court’s judgment will not be clouded by Bellesiles’s evident attempt to misrepresent the context in which the Framers wrote the Second Amendment.

[MAY 20, 2002]

Juggling Scarves in the Therapeutic Nation

It hurt her feelings, says Jane Fonda, sharing her feelings, that one of her husbands liked them to have sexual threesomes. “It reinforced my feeling I wasn’t good enough.”

In the Scottsdale, Arizona, Unified School District office, the receptionist used to be called a receptionist. Now she is “director of first impressions.” The happy director says, “Everyone wants to be important.” Scottsdale school bus drivers now are “transporters of learners.” A school official says such terminological readjustment is “a positive affirmation.” Which beats a negative affirmation.

Manufacturers of pens and markers report a surge in teachers’ demands for purple ink pens. When marked in red, corrections of students’ tests seem so awfully judgmental. At a Connecticut school, parents consider red markings “stressful.” A Pittsburgh principal favors more “pleasant-feeling tones.” An Alaska teacher says substituting purple for red is compassionate pedagogy, a shift from “Here’s what you need to improve on” to “Here’s what you have done right.”

Fonda’s confession, Scottsdale’s tweaking of terminology, and the recoil from red markings are manifestations of today’s therapeutic culture. The nature and menace of “therapism” is the subject of a new book, One Nation Under Therapy: How the Helping Culture Is Eroding Self-Reliance, by Christina Hoff Sommers and Sally Satel, MD, resident scholars at the American Enterprise Institute.

From childhood on, Americans are told by “experts”—therapists, self-esteem educators, grief counselors, traumatologists—that it is healthy for them continuously to take their emotional temperature, inventory their feelings, and vent them. Never mind research indicating that reticence and suppression of feelings can be healthy.

Because children are considered terribly vulnerable and fragile, playground games like dodgeball are being replaced by anxiety-reducing and self-esteem-enhancing games of tag where nobody is ever “out.” But abundant research indicates no connection between high self-esteem and high achievement or virtue. Is not unearned self-esteem a more pressing problem? Sensitivity screeners remove from texts and tests distressing references to things like rats, snakes, typhoons, blizzards, and…birthday parties (which might distress children who do not have them). The sensitivity police favor teaching what Sommers and Satel call “no-fault history.” Hence California’s Department of Education stipulating that when “ethnic or cultural groups are portrayed, portrayals must not depict differences in customs or lifestyles as undesirable”—slavery? segregation? anti-Semitism? cannibalism?—“and must not reflect adversely on such differences.”

Experts warn about what children are allowed to juggle: Tennis balls cause frustration, whereas “scarves are soft, nonthreatening, and float down slowly.” In 2001, the Girl Scouts, illustrating what Sommers and Satel say is the assumption that children are “combustible bundles of frayed nerves,” introduced, for girls eight to eleven, a “Stress Less Badge” adorned with an embroidered hammock. It can be earned by practicing “focused breathing,” keeping a “feelings diary,” burning scented candles, and exchanging foot massages.

Vast numbers of credentialed—that is not a synonym for “competent”—members of the “caring professions” have a professional stake in the myth that most people are too fragile to cope with life’s vicissitudes and traumas without professional help. Consider what Sommers and Satel call “the commodification of grief” by the “grief industry”—professional grief “counselors” with “degrieving” techniques. Such “grief gurus” are “ventilationists”: They assume that everyone should grieve the same way—by venting feelings sometimes elicited by persons who have paid $1,795 for a five-day course in grief counseling.

The “caregiving” professions, which postulate the minimal competence of most people to cope with life unassisted, are, of course, liberal, and politics can color their diagnoses. Remember the theory that because Vietnam was supposedly an unjust war, it would produce an epidemic of “post-traumatic stress disorders.” So a study released in 1990 claimed that half of Vietnam veterans suffered from some PTSD—even though only 15 percent of Vietnam veterans had served in combat units. To ventilationists—after a flood damaged books at the Boston Public Library, counselors arrived to help librarians cope with their grief—a failure to manifest grief is construed as alarming evidence of grief repressed, and perhaps a precursor of “delayed onset” PTSD.

Predictably, 9/11 became another excuse for regarding healthy human reactions as pathological. Did terrorist attacks make you angry and nervous? Must be PTSD. And 9/11 gave rise to “diagnostic mission creep” as the idea of a “trauma” was expanded to include watching a disaster on television. Sommers and Satel’s book is a summons to the sensible worry that national enfeeblement must result when therapism replaces the virtues on which the republic was founded—stoicism, self-reliance, and courage.

[APRIL 21, 2005]

Nature, Nurture, and Larry Summers’s Sin

HYSTERIA—A functional disturbance of the nervous system, characterized by such disorders as anaesthesia, hyperaesthesia, convulsions, etc., and usually attended with emotional disturbances and enfeeblement or perversion of the moral and intellectual faculties.

Oxford English Dictionary

Forgive Larry Summers. He did not know where he was.

Addressing a conference on the supposedly insufficient numbers of women in tenured positions in university science departments, he suggested that perhaps part of the explanation might be innate—genetically based—gender differences in cognition. He thought he was speaking in a place that encourages uncircumscribed intellectual explorations. He was not. He was on a university campus.

He was at Harvard, where he is president. Since then he has become a serial apologizer and accomplished groveler. Soon he may be in a Khmer Rouge–style reeducation camp somewhere in New England, relearning this: In today’s academy, no social solecism is as unforgivable as the expression of a hypothesis that offends someone’s “progressive” sensibilities.

Someone like MIT biology professor Nancy Hopkins, the hysteric (see above) who, hearing Summers, “felt I was going to be sick. My heart was pounding and my breath was shallow.” And, “I just couldn’t breathe because this kind of bias makes me physically ill.” She said that if she had not bolted from the room, “I would’ve either blacked out or thrown up.”

Is this the fruit of feminism? A woman at the peak of the academic pyramid becomes theatrically flurried by an unwelcome idea and, like a Victorian maiden exposed to male coarseness, suffers the vapors and collapses on the drawing room carpet in a heap of crinolines until revived by smelling salts and the offending brute’s contrition.

Hopkins’s sufferings, although severe, were not incapacitating: She somehow found strength quickly to share them with the Boston Globe and the Today show, on which she confided that she just did not know whether she could bear to have lunch with Summers. But even while reeling from the onslaught of Summers’s thought, she retained a flair for meretriciousness: She charged that Summers had said “that 50 percent” of “the brightest minds in America” do not have “the right aptitude” for science.

Men and women have genetically based physical differences; the brain is a physical thing—part of the body. Is it unthinkable—is it even counterintuitive—that this might help explain, for example, the familiar fact that more men than women achieve the very highest scores in mathematics aptitude tests? There is a vast and growing scientific literature on possible gender differences in cognition. Only hysterics denounce interest in those possible differences—or, in Hopkins’s case, the mere mention of them—as “bias.”

Hopkins’s hysteria was a sample of America’s campus-based indignation industry, which churns out operatic reactions to imagined slights. But her hysteria also is symptomatic of a political tendency that manifested itself in some criticism of President Bush’s inaugural address, which was a manifesto about human nature.

This criticism went beyond doubts about his grandiose aspirations, to rejection of the philosophy that he might think entails such aspirations but actually does not. The philosophy of natural right—the Founders’ philosophy—rests on a single proposition: There is a universal human nature.

From that fact come, through philosophic reasoning, some normative judgments: Certain social arrangements—particularly government by consent attained by persuasion in a society accepting pluralism—are right for creatures of this nature. Hence the doctrine of “natural right,” and the idea of a nation “dedicated,” as Lincoln said, to the “proposition” that all men are created equal.

The vehemence of the political left’s recoil from this idea is explained by the investment political radicalism has had for several centuries in the notion that human beings are essentially blank slates. What predominates in determining individuals’ trajectories—nature or nurture? The left says nature is negligible, nurturing is sovereign. So a properly governed society can write what it wishes on the blank slate of humanity. This maximizes the stakes of politics and the grandeur of government’s role. And the importance of governing elites, who are the “progressive” vanguards of a perfected humanity.

The vehemence of Hopkins’s recoil from the idea that there could be gender differences pertinent to some cognition might seem merely to reflect a crude understanding of civic equality as grounded shakily on a certain identical physicality. But her hysteria actually expresses the left’s ultimate horror—the thought that nature sets limits to the malleability of human material. Summers should explain this to her, over lunch, when he returns from camp.

[JANUARY 27, 2005]

AP Harry Applies to College

“Ivies,” “safeties,” “AP prep courses,” “legacy,” “résumé-enhancing activity,” “nonbinding early acceptance,” “rolling admissions,” “single-choice early action.” If this argot is familiar to you, poor you: You have a child in high school, and these are the days that try your soul, the spring days when many college admissions are announced, often by e-mail, which is how AP Harry learned he was deferred by Harvard.

Harry is a character in Susan Coll’s new novel Acceptance, set in Verona County, Maryland, which is the real Montgomery County, Maryland, thinly disguised—rich, liberal, full of strivers and contiguous to strivers’ paradise, Washington. Harry earned the nickname AP because beginning with his freshman year he took almost every Advanced Placement course offered at Verona High School, which is so serious about placing graduates in prestigious colleges that the principal stalks the halls quizzing students on vocabulary words. For Harry, only Harvard will do.

But Harry is a white male without a legacy at Harvard, and although he got a perfect 800 on his math SAT, even with the help of private SAT prep tutoring he could boost his critical reading score only to 720. And when he got a B in an AP English course, he worried that it was the beginning of a long slide that would terminate on some skid row or, worse, at a “safety” school not among the Ivies.

Harry, who wears starched shirts and a blazer and carries a briefcase, is a real rara avis in Verona County—a conservative whose heroes include Trent Lott. And he is a wee bit obsessive. He has his mother quiz him to confirm that he remembers the U.S. News & World Report’s list—in order—of the top fifty liberal-arts colleges. He subscribes to a service that each day sends an SAT-type question to his cell phone. Harry taps his phone keyboard and reads:

“‘Their ideal was to combine individual liberty with material equality, a goal that has not yet been realized and that may be as [blank] as transmutation of lead into gold.’”

“Before Harry could continue, a small girl wearing orthodontic headgear blurted out the answer: ‘A, chimerical.’”

Also, Harry’s sentences frequently trail off into lists of synonyms useful for the SAT vocabulary test:

“‘You look kind of pale, Mom…pale, sallow, pallid, wan…’

“Grace forced a smile…‘Ashen?’ she asked.

“‘Very good, Mom,’ Harry said, smiling adorably.”

Grace’s neighbor, who walks her dog on a Burberry leash, began fretting about college admissions during the summer before their children entered eighth grade. That neighbor hired a private college counseling service at a two-year cost of $30,000, and she signed up her daughter—who she insists preferred NPR to television at age four—for SAT prep courses three years ahead of the normal schedule.

Coll writes: “How had a test originally intended to give a smart kid stuck farming pigs in the Midwest a chance to compete with the children of the Northeastern elite morphed back into a tool to help the rich stay on top?” How? By what Coll calls the “snakepit of parental competition” among the kind of parents who send holiday letters like this:

“We are ringing in the New Year in Ireland at the behest of Bree, who was so taken with her reading of ‘Ulysses’ in her rapid learner reading class that we are taking a self-guided tour of Joyce’s Dublin…Sixth grade has proven a bit dull for Bree…An aspiring novelist (as you may have guessed!), she plans to spend the summer honing her writing skills at a workshop at Johns Hopkins…Conveniently, her little brother will also be attending Johns Hopkins this summer. Gordon has been accepted into the ‘HeadsUp’ program for preschoolers who show an innate predisposition for design and engineering…”

Such parents produce children who, Coll writes, worry unhealthily as they were taught to worry in health class: “If exchanging flirty text messages was the first step toward contracting a sexually transmitted disease, a bad decision about where to apply to college would probably lead to a life of future unemployment, then homelessness, and finally exclusion from family gatherings at holidays.”

Acceptance also examines the travails of the admissions official at fictional Yates College (“the Princeton of Upstate New York”), which has just had the deranging experience of cracking the U.S. News list at number fifty. Imagine plowing through applicants’ essays “about how Mahatma Gandhi was the single greatest inspiration in these kids’ lives, or how the historical figure with whom they most closely identified was Harry Potter.”

The mother with the Burberry leash suggests that her daughter’s college application essay begin, “Family lore has it that my first words were ‘Standard Oil.’” What happens to that daughter, to Harry and other young victims of “the Verona madness”? Buy Coll’s book and find out. It is hilarious and dismaying…alarming, disturbing, disquieting, agitating, perturbing

[APRIL 9, 2007]

Teaching Minnows the Pleasure of Precision

LOS ANGELES—After eight years at Robert F. Kennedy Elementary School, Ethel Bojorquez knows a thing or two about teaching. She radiates calm, no-nonsense authority, and today she is watching a kindred spirit, Carole Valleskey, put Bojorquez’s thirty-five fourth and fifth graders briskly through their paces.

Actually, the paces are Valleskey’s. A former ballerina with the Joffrey, she now choreographs dance classes at eight fortunate Los Angeles elementary schools. For a few hours a week, Valleskey’s students restrain their anarchic individualism in order to perform as a dance troupe. Think of training young minnows in synchronized swimming.

The children have high-energy encounters with high-quality popular culture—Ellington, Gershwin, Copland—that is a far cry from hip-hop. Bojorquez, whose experience has immunized her against educational fads, admiringly watches her pupils perform under Valleskey’s exacting tutelage and exclaims, “They are learning about reading right now.”

They are, she marvels, learning about—experiencing, actually—“sequencing, patterns, inferences.” She explains: “You don’t only listen to language, you do it.”

Bojorquez and Valleskey, like all teachers, function under the tyranny of the 9/91 formula: between birth and age nineteen, a child spends 9 percent of his time in school, 91 percent elsewhere. In contemporary America, “elsewhere” means immersed in the undertow of popular culture’s increasingly coarse distractions. In Los Angeles, where most public school pupils are Latino (Kennedy school is almost entirely Latino), “elsewhere” often means homes where English is barely spoken.

Bojorquez’s raven-haired students, their dark eyes riveted on Valleskey, mimic her motions. These beautiful children have a beautiful hunger for the satisfaction of structured, collaborative achievement.

That begins when Valleskey, a one-woman swarm, bounces into the room and immediately, without a word of command, reduces the turbulent students to silent, rapt attention. They concentrate in order to emulate Valleskey’s complex syncopation of claps, finger snaps, and thigh slaps by which she sets the tone of the coming hour: This will be fun because things will be done precisely right.

Part Marine Corps drill instructor, part pixie, Valleskey knows that children are realists. They do not want false praise. She knows that self-esteem is a result of, not a precondition for, achievement. Her credo is: Every child can do it. The antecedent of “it” is: learn how to learn.

Her students experience a kind of freedom that is, for most children, as exhilarating as it is novel. It is not merely the absence of restraints. Rather, it is the richer freedom of a cooperative group performing to high standards within a structure of rules.

Valleskey’s California Dance Institute is, essentially, her and a few teaching assistants and musicians, sustained by a few exceptionally discerning philanthropists. CDI is associated with, but not financially supported by, the National Dance Institute, founded by Jacques d’Amboise, for many years a leading dancer with the New York City Ballet. He was the subject of the 1983 Academy Award–winning film He Makes Me Feel Like Dancin’. It explored his insight that dance—the pleasures of precision, of a task done just right—serves all the pedagogic goals of schools.

Virtues, says Valleskey, are habits, and dance, as taught by CDI, is habituation in many of the skills of learning, as well as the components of good character. Dance, properly taught, is like sport, properly understood.

The ancient Greeks considered sport serious play, a civic—meaning moral—undertaking. That is because man’s noblest activity is active engagement, as talented performer or informed spectator, with worthy things such as beauty, including the beauty of strenuous exertion in conformity to exacting rules and high standards. By using our bodies beautifully, we come to appreciate beauty and the discipline—the restraint—that is its prerequisite and civilization’s premise.

Gifted teachers like Bojorquez and Valleskey master the patience required for the unending business of transmitting civilization down the generations, transforming biological facts—children—into social artifacts called citizens. It is wearying work, and it is a wonder teachers can summon the stamina for it. Ralph Waldo Emerson wondered:

“It must be admitted that civilization is onerous and expensive; hideous expense to keep it up;—let it go, and be Indians again; but why Indians?—that is costly, too; the mud turtle and trout life is easier and cheaper, and oyster, cheaper still.”

CDI is inexpensive. Operating on a financial shoestring—a frayed shoestring—CDI is a gift to a few of this city’s public schools. It makes one marvel at what educational improvements could be achieved with small sums in the service of something much scarcer than money—imagination.

[MARCH 25, 2004]