One Man's America: The Pleasures and Provocations of Our Singular Nation - George F. Will (2008)

Chapter 1. PEOPLE


The Fun of William F. Buckley

In his fortieth anniversary toast to his Yale class of 1950, William F. Buckley said, “Some of us who wondered if we would ever be this old now wonder whether we were ever young.” Those who were not young forty years ago, in 1965, can have no inkling of what fun it was to be among Buckley’s disciples as he ran for mayor of New York vowing that, were he to win, his first act would be to demand a recount.

Murray Kempton, the wonderful liberal columnist who later joined Buckley’s eclectic legion of friends, wrote after Buckley’s first news conference that the candidate “had the kidney to decline the customary humiliation of soliciting the love of the voters, and read his statement of principles in a tone for all the world like that of an Edwardian resident commissioner reading aloud the 39 articles of the Anglican establishment to a conscript assemblage of Zulus.” For conservatives, happy days were here again.

Back then, espousing conservatism was regarded by polite society, then soggy with that era’s barely challenged liberalism, as a species of naughtiness, not nice but also not serious. Buckley, representing New York’s Conservative Party, which was just three years old, won 13 percent of the vote. When the winner, John Lindsay, limped discredited from office eight years later, Bill’s brother Jim had been elected, on the Conservative line, U.S. senator from New York.

Buckley, for whom the nation should give thanks, turns eighty on Thanksgiving Day, and National Review, the conservative journal he founded in the belly of the beast—liberal Manhattan—turned fifty this month. It is difficult to remember, and hence especially important to remember, the slough of despond conservatism was in 1955.

Ohio senator Robert Taft, for more than a decade the leading conservative in elective office, had died in 1953. Joseph McCarthy had tainted conservatism in the process of disgracing himself with bile and bourbon. President Eisenhower had so placidly come to terms with the flaccid consensus of the 1950s that the editor of U.S. News & World Report, the most conservative newsweekly, suggested that both parties nominate Eisenhower in 1956.

National Review demurred. When it nailed its colors—pastels were not encouraged—to its mast and set sail upon the choppy seas of American controversy, one novel on the bestseller list was Sloan Wilson’s The Man in the Gray Flannel Suit, voicing the 1950s’ worry about “conformity.” National Review’s premise was that conformity was especially egregious among the intellectuals, that herd of independent minds. The magazine is one reason why the phrase “conservative intelligentsia” is no longer an oxymoron.

In 1964, National Review (circulation then: 100,000) did what the mighty Hearst press had never done—determined a major party’s presidential nomination. Barry Goldwater’s candidacy was essentially an emanation of National Review’s cluttered office on East 35th Street. Which is why an audience of young Goldwaterites took it so hard when, two months before the election, Buckley warned them that bliss would be a bit delayed:

“The point of the present occasion is to win recruits whose attention we might never have attracted but for Barry Goldwater; to win them not only for November the third, but for future Novembers; to infuse the conservative spirit in enough people to entitle us to look about us, on November fourth, not at the ashes of defeat, but at the well-planted seeds of hope, which will flower on a great November day in the future, if there is a future.”

There was. It arrived sixteen years later.

Author of more than four thousand columns, and still adding two a week; author of forty-seven books, eighteen of them novels; host of the Firing Line television program for thirty-four years; a public speaker, often giving as many as seventy lectures and debates a year, for almost fifty years; ocean mariner; concert harpsichordist—his energy reproaches the rest of us. He married a woman who matches his mettle; his proposal to her, made when he called her away from a card game, went like this:

He: “Patricia, would you consider marriage with me?”

She: “Bill, I’ve been asked this question many times. To others I’ve said no. To you I say yes. Now may I please get back and finish my hand?”

Buckley, so young at eighty, was severely precocious at seven when he wrote a starchy letter to the king of England demanding payment of Britain’s war debts. Seventy-three years on, Buckley’s country is significantly different, and better, because of him. Of how many journalists, ever, can that be said? One.

[NOVEMBER 24, 2005]

Buckley: A Life Athwart History

Those who think Jack Nicholson’s neon smile is the last word in smiles never saw William F. Buckley’s. It could light up an auditorium; it did light up half a century of elegant advocacy that made him an engaging public intellectual and the twentieth century’s most consequential journalist.

Before there could be Ronald Reagan’s presidency, there had to be Barry Goldwater’s candidacy. It made conservatism confident and placed the Republican Party in the hands of its adherents.

Before there could be Goldwater’s insurgency, there had to be National Review magazine. From the creative clutter of its Manhattan offices flowed the ideological electricity that powered the transformation of American conservatism from a mere sensibility into a fighting faith and a blueprint for governance.

Before there was National Review, there was Buckley, spoiling for a philosophic fight, to be followed, of course, by a flute of champagne with his adversaries. He was twenty-nine when, in 1955, he launched National Review with the vow that it “stands athwart history, yelling Stop.” Actually, it helped Bill take history by the lapels, shake it to get its attention, and then propel it in a new direction. Bill died Wednesday in his home, in his study, at his desk, diligent at his lifelong task of putting words together well and to good use.

Before his intervention—often laconic in manner, always passionate in purpose—in the plodding political arguments within the flaccid liberal consensus of the post-World War II intelligentsia, conservatism’s face was that of another Yale man, Robert Taft, somewhat dour, often sour, wearing three-piece suits and wire-rim glasses. The word fun did not spring to mind.

The fun began when Bill picked up his clipboard, and conservatives’ spirits, by bringing his distinctive brio and élan to political skirmishing. When young Goldwater decided to give politics a fling, he wrote to his brother: “It ain’t for life and it might be fun.” He was half right: Politics became his life, and it was fun, all the way. Politics was not Bill’s life—he had many competing and compensating enthusiasms—but it mattered to him, and he mattered to the course of political events.

One clue to Bill’s talent for friendship surely was his fondness for this thought of Harold Nicolson’s: “Only one person in a thousand is a bore, and he is interesting because he is one person in a thousand.” Consider this from Bill’s introduction to a collection of his writings titled The Jeweler’s Eye: A Book of Irresistible Political Reflections:

“The title is, of course, a calculated effrontery, the relic of an impromptu answer I gave once to a tenacious young interviewer who, toward the end of a very long session, asked me what opinion did I have of myself. I replied that I thought of myself as a perfectly average middle-aged American, with, however, a jeweler’s eye for political truths. I suppressed a smile—and watched him carefully record my words in his notebook. Having done so, he looked up and asked, ‘Who gave you your jeweler’s eye?’ ‘God,’ I said, tilting my head skyward just a little. He wrote that down—the journalism schools warn you not to risk committing anything to memory. ‘Well,’—he rose to go, smiling at last—‘that settles that!’ We have become friends.”

Pat, Bill’s beloved wife of fifty-six years, died last April. During the memorial service for her at New York’s Metropolitan Museum of Art, a friend read lines from “Vitae Summa Brevis” by a poet she admired, Ernest Dowson:

They are not long, the days of wine and roses:

Out of a misty dream

Our path emerges for a while, then closes

Within a dream.

Bill’s final dream was to see her again, a consummation of which his faith assured him. He had an aptitude for love—of his son, his church, his harpsichord, language, wine, skiing, sailing.

He began his sixty-year voyage on the turbulent waters of American controversy by tacking into the wind with a polemical book, God and Man at Yale (1951), that was a lovers’ quarrel with his alma mater. And so at Pat’s service the achingly beautiful voices of Yale’s Whiffenpoofs were raised in their signature song about the tables down at Mory’s, “the place where Louis dwells”:

We will serenade our Louis

While life and voice shall last

Then we’ll pass and be forgotten with the rest

Bill’s distinctive voice permeated, and improved, his era. It will be forgotten by no one who had the delight of hearing it.

[FEBRUARY 29, 2008]

David Brinkley: Proud Anachronism

To have worked alongside David Brinkley on television is to have experienced what might be called the Tommy Henrich Temptation. Henrich, who played right field for the Yankees when Joe DiMaggio was playing center field, must have been constantly tempted to ignore the game and just stand there watching DiMaggio, who defined for his generation the elegance of understatement and the gracefulness that is undervalued because it makes the difficult seem effortless.

Brinkley, who died Wednesday, a month shy of his eighty-third birthday, was a Washington monument as stately, and as spare in expression, as is the original. Long before high-decibel, low-brow cable shout-a-thons made the phrase “gentleman broadcaster” seem oxymoronic, Brinkley made it his business to demonstrate the compatibility of toughness and civility in journalism.

He was the most famous son of Wilmington, North Carolina, until Michael Jordan dribbled into the national consciousness. Brinkley arrived in Washington in 1943, an era when a gas mask occasionally hung from the president’s wheelchair and the city—then hardly more than a town, really—fit John Kennedy’s droll description of it as a community of Southern efficiency and Northern charm.

It was a town in which the second-most-powerful person was the Speaker of the House, Sam Rayburn, a Texan whose office wall was adorned with five portraits of Robert E. Lee, all facing south, and who said he did not socialize because “these Washington society women never serve chili.” Washington had fifteen thousand outdoor privies and a cleaning establishment that handled white flannel suits by taking them apart at the seams, hand-washing each piece, drying the pieces in the sun, then reassembling each suit. The process took a week—longer during cloudy weather—and cost $10.

By the time Brinkley retired from ABC in 1996, he had covered (in the subtitle of his 1995 autobiography) “11 Presidents, 4 Wars, 22 Political Conventions, 1 Moon Landing, 3 Assassinations, 2,000 Weeks of News and Other Stuff on Television.” Like Walter Cronkite, the only other journalist of comparable stature from television’s founding generation, Brinkley began his career in print journalism. Indeed, Brinkley began at a time when the phrase “print journalist” still seemed almost a redundancy.

During the Second World War, Edward R. Murrow and his CBS radio colleagues, such as Eric Sevareid, Charles Collingwood, Robert Trout, and William Shirer, elevated broadcast journalism. But television took a while to get the hang of it.

In 1949, John Cameron Swayze’s Camel News Caravan, for which young Brinkley, who had joined NBC in 1943, was a reporter, was carried for fifteen minutes five nights a week. NBC’s network consisted of four stations, in Boston, New York, Philadelphia, and Washington. The sponsor required Swayze, who always wore a carnation in his lapel, to have a lit cigarette constantly in view. Not until 1963 did Cronkite’s CBS Evening News become the first thirty-minute newscast.

In 1981, after thirty-eight years with NBC, Brinkley became host of ABC’s This Week. He understood a fundamental truth about television talk shows: what one does on them one does in strangers’ living rooms. So mind your manners; do not make a scene. Those thoughts guided Brinkley as he provided adult supervision to others on This Week, the first hour-long Sunday morning interview program.

How anachronistic the maxim “mind your manners” seems in the harsh light cast by much of today’s television. How serene, even proud, Brinkley was about becoming something of an anachronism.

Evelyn Waugh’s novel Scott-King’s Modern Europe (1947) concludes on what can be called a Brinkleyesque note. The protagonist, Mr. Scott-King, a teacher at an English boys’ school, is warned by the school’s headmaster that the boys’ parents are only interested in preparing their boys for the modern world.

“You can hardly blame them, can you?” said the headmaster. “Oh, yes,” Scott-King replied, “I can and do,” adding, “I think it would be very wicked indeed to do anything to fit a boy for the modern world.”

Brinkley’s backward-looking gentility made him regret, among much else, the passing of the days when it was unthinkable for a gentleman to wear other than a coat and tie when traveling by air. It is, then, an irony of the sort Brinkley savored that he was not merely present at the creation of television as a shaper of the modern world, he was among the creators of that phenomenon. Like the Founders of this fortunate Republic, Brinkley set standards of performance in his profession that still are both aspirations and reproaches to subsequent practitioners.

[JUNE 13, 2003]

Barry Goldwater: “Cheerful Malcontent”

In 2007, I was asked to write the foreword for a new edition of Goldwater’s book The Conscience of a Conservative, the first in a series of important political books republished by Princeton University Press.

When Barry Goldwater ran for president in 1964, he was Arizona’s junior senator. But, then, measured by length of Senate service, ninety-eight other senators also were junior to Arizona’s senior senator, Carl Hayden, who was a former sheriff in Arizona territory. Hayden had entered the House of Representatives at age thirty-five when Arizona acquired statehood in 1912, and entered the Senate at age forty-nine, where he served until 1969. The Western frontier, so vivid in the national imagination and so associated with American libertarianism, lived in Goldwater’s Senate colleague.

When I visited Goldwater at his home in Phoenix a few years before his death in 1998, he said he had built his house on a bluff to which, when he was young, he would ride his horse and sleep under the stars. When he was a boy, about one hundred thousand people lived in the Valley of the Sun. When Goldwater died, the population of a suburb of Phoenix—Mesa—was larger than that of St. Louis, and the population of the Phoenix metropolitan area, the nation’s fourteenth largest, was approaching three million.

You must remember this: Goldwater was a conservative from, and formed by, a place with precious little past to conserve. Westerners have no inclination to go through life with cricks in their necks from looking backward. When Goldwater became the embodiment of American conservatism—partly by his own efforts, and partly because he was conscripted by others for the role—that guaranteed that the mainstream of American conservatism would be utterly American. The growing conservative intelligentsia would savor many flavors of conservatism, from Edmund Burke’s to T. S. Eliot’s, conservatisms grounded on religious reverence, nostalgia, and resistance to the permanent revolution of conditions in a capitalist, market society. Such conservatisms would have been unintelligible, even repellent, to Goldwater, if he had taken time to notice them.

In the beginning, which is to say in the early 1950s, America’s modern conservative movement was remarkably bookish. It began to find its voice with Whittaker Chambers’ memoir Witness (1952), Russell Kirk’s The Conservative Mind (1953), and the twenty-five-year-old William F. Buckley Jr.’s God and Man at Yale (1951). The books most congruent with what came to be Goldwaterism included one published in London in 1944 by an Austrian and future Nobel laureate in economics—Friedrich Hayek’s The Road to Serfdom. Another book by another winner of the Nobel Prize for economics was Milton Friedman’s Capitalism and Freedom (1962). Like Hayek and Friedman, Goldwater was centrally preoccupied with freedom, and with the natural tendency of freedom’s sphere to contract as government’s sphere expands.

Goldwater was a man of many parts—politician and jet pilot, ham radio operator and accomplished photographer—but no one ever called him bookish. And if anyone ever had, Goldwater, a man of action and of the West, might have said—echoing the protagonist of the novel that invented the Western, Owen Wister’s The Virginian (1902)—“When you call me that, smile!”

Then Goldwater would have smiled, because although he could be gruff, he could not stay out of sorts. He was, as journalist Richard Rovere said, “the cheerful malcontent.” In that role, he also was an early symptom—a leading indicator—of the 1960s ferment.

The 1960s are rightly remembered as years of cultural dissent and political upheaval, but they are wrongly remembered as years stirred only from the left. Actually, they were not even stirred first, or primarily, or most consequentially from the left. By the time the decade ended, with Richard Nixon in the White House, conservatism was in the saddle, embarked on winning seven of the ten presidential elections from 1968 through 2004.

But because of the political complexion of the journalists who wrote the “first rough draft of history,” and because of the similar complexion of the academic historians who have written subsequent drafts, and because much of the decade’s most lurid political turbulence, such as the turmoil on campuses and at the riotous 1968 Democratic Convention in Chicago, consisted of episodes of dissent by the left—because of all this, the decade is remembered as one dominated by dissent from the left. Nevertheless, it can reasonably be said that dissent in the 1960s began on the right, and it is certain that the most nation-shaping dissent was from the right.

Some say we should think of the sixties as beginning on November 22, 1963, and ending in October 1973—that is, as beginning with a presidential assassination that supposedly shattered the nation’s sunny postwar disposition, and ending with the Yom Kippur War and the oil embargo that produced a sense of scarcity and national vulnerability. Arguably. But although it may seem eccentric—or banal—to say so, the sixties, understood as a decade of intellectual dissent and political insurgency, began in 1960.

On July 27, to be precise, when an Arizona senator strode to the podium of the Republican Convention in Chicago and barked: “Let’s grow up, conservatives. If we want to take the party back—and I think we can—let’s get to work.”

Back from whom? In two words, “moderate” Republicans. In one word, Northeasterners. What that word denoted, to those who used it as an epithet, was the old Republican establishment that had nominated Wendell Willkie (the “barefoot boy from Wall Street” was from Indiana, but not really), New York’s Governor Tom Dewey twice, and Dwight Eisenhower twice. (Eisenhower was from Texas and Kansas, long ago, but had sojourned in Paris and in Manhattan’s Morningside Heights—as Supreme Allied Commander and president of Columbia University—before winning the 1952 Republican nomination by defeating “Mr. Republican” and the conservatives’ favorite, Senator Robert Taft of Ohio.)

The Republican establishment, speaking through the New York Herald-Tribune, represented what Goldwater and kindred spirits considered a flaccid postwar Republican consensus. Goldwater’s complaint was that timid Republicans challenged neither the New Deal notion of the federal government’s competence and responsibilities nor the policy of mere containment regarding the Soviet Union.

The GOP establishment against which Goldwater rose in rebellion is, like the Herald-Tribune, which ceased publication in 1966, a mere memory. As is the subject of Goldwater’s last chapter, “the Soviet menace.” But what makes this book of lasting interest, and what makes it pertinent to the Republicans’ deepening intramural conflicts in the first decade of the twenty-first century, is this: Goldwater’s primary purpose was to refute the perception that conservatism was an intellectually sterile and morally crass persuasion.

In the first sentence of his first chapter, Goldwater wrote: “I have been much concerned that so many people today with Conservative instincts feel compelled to apologize for them.” Nearly half a century later, people calling themselves “progressives” are in flight from the label “liberal.” It is difficult to remember, but well to remember, how rapidly and thoroughly political fashions can change: There was a time in living memory when…well, in 1950, a man was arrested for creating a public disturbance and a witness said: “He was using abusive language, calling people conservative and all that.”

In 1960, the common caricature was that liberals had ideas and ideals, whereas conservatives had only material interests. Goldwater set out to refute the idea that conservatism is merely “a narrow, mechanistic economic theory that may work very well as a bookkeeper’s guide, but cannot be relied upon as a comprehensive political philosophy.” Goldwater insisted that it was liberalism that had become thin intellectual gruel. He said it produced government that saw the nation as a mere aggregation of clamorous constituencies with material itches that it was Washington’s duty to scratch with federal programs. The audacity of The Conscience of a Conservative was its charge that the post-New Deal political tradition, far from being idealistic, was unworthy of a free society because it treated citizens as mere aggregations of appetites.

In recent years, the intellectual energy in American politics has been concentrated on the right side of the spectrum, and today two kinds of conservatives are at daggers drawn with each other. The last twenty-five years or so produced the rise of “social conservatives,” a group generally congruent with the “religious right.” These conservatives, alarmed by what they consider the coarsening of the culture, believe in “strong government conservatism.” They argue that government can, and urgently must, have an active agenda to defend morals and promote virtue, lest freedom be lost. Other conservatives, the political descendants of Goldwater, agree that good government is, by definition, good for the public’s virtue. They also believe, however, that limited government by its limitations nurtures in men and women the responsibilities that make them competent for, and worthy of, freedom.

Had Goldwater lived to see the republication of his book in this supposedly conservative era, he might have made some characteristically blunt remarks about the impotence of books. This edition of The Conscience of a Conservative comes after a Republican president and a Republican-controlled Congress enacted in 2001 the largest federal intervention in primary and secondary education (the No Child Left Behind law) in American history. And, in 2002, enacted the largest farm subsidies. And, in 2003, enacted the largest expansion of the welfare state (the prescription drug entitlement added to Medicare) since Lyndon Johnson, the president who defeated Goldwater in 1964, created Medicare in 1965. In The Conscience of a Conservative, Goldwater insisted that most Americans embraced conservative principles, and he blamed conservatives for failing to persuade the country of the “practical relevance” of conservatism.

Was he mistaken about what most Americans believe? Are they now ideologically, meaning rhetorically, conservative, but operationally liberal?

If so, Goldwater might say this vindicates his argument: One consequence of unlimited government is unlimited dependency—learned dependency, a degrading addiction of citizens to public provisions.

But that gloomy conclusion could not long withstand Goldwater’s Western cheerfulness. Besides, it does not begin to do justice to the changes conservatism has wrought, or helped to bring about, since Goldwater summoned conservatives to take back the Republican Party. In 1960, the top income tax rate was 90 percent, there was a lifetime entitlement to welfare, the economy was much more regulated than it is now, and the Iron Curtain looked like confirmation of George Orwell’s image of totalitarianism: “Imagine a boot stamping on a human face—forever.” That “forever” expired twenty-five years after Goldwater’s presidential campaign.

Most historians probably think that Goldwater’s 1964 run for the White House was the apogee of his public life, which began with his election to the Phoenix city council in 1949 and lasted until his retirement from the Senate in 1987. Goldwater, I suspect, thought otherwise.

Before and after 1964, he was a man of the Senate; he probably thought of his presidential run as a brief detour in a career otherwise as level as the surface of a Western mesa. The Senate suited him as a venue for taking stands and enunciating views. He was a “conviction politician”—a term later minted to describe a soul mate, Margaret Thatcher—who thought the point of public life was to advance a creed. Therefore, this book, more than his presidential candidacy, was, in a sense, the essence of the public man.

When he delivered his acceptance speech to the 1964 Republican Convention in San Francisco’s Cow Palace, and thundered that “extremism in defense of liberty is no vice” and “moderation in pursuit of justice is no virtue,” a journalist rocked back in his chair and exclaimed: “My God, he’s going to run as Goldwater!” Indeed. Goldwater ran as the author of The Conscience of a Conservative, with the book as his platform.

He had been an eager author of that book, although an author with the assistance of a professional polemicist—L. Brent (“Hell Bent”) Bozell, William Buckley’s brother-in-law. Bozell helped Goldwater weave various speeches and other pronouncements into a coherent argument.

Goldwater was, however, a reluctant presidential candidate, especially after the Kennedy assassination. Even before that, he did not have the monomania requisite for a successful candidate. It has been well said that anyone who is willing to do the arduous things necessary to become president probably is too unbalanced to be trusted with the office. Goldwater preferred flying himself around Arizona to photograph Native Americans to being flown around the country in pursuit of convention delegates.

Before the Kennedy assassination, however, Goldwater rather fancied the idea of challenging Kennedy’s reelection effort. Goldwater liked Kennedy—they had been freshmen senators in 1953—and he suggested to Kennedy that they might share a plane and hopscotch around the country debating each other. After the assassination, Goldwater knew that the outcome of the 1964 election was not in doubt because, as he put it with the pungency that sometimes got him in trouble, the country was not going to assassinate two presidents in less than twelve months. But a merry band of Republican insurgents, many of them associated in one way or another with Buckley’s National Review—which was not yet nine years old when Goldwater was nominated—disregarded his reluctance and launched him on a campaign that would lose forty-four states.

But it was a spectacularly creative loss. In the process, conservatives captured the Republican Party’s apparatus. And in October 1964, when Goldwater was shown a speech he was supposed to deliver to a national television audience, he said: “This is good, but it doesn’t quite sound like me. Get Ronald Reagan to give it.” After Reagan won the presidency, conservatives liked to say that Goldwater won in 1964, but it took sixteen years to count all the votes.

Another way of understanding Goldwater’s constructive defeat involves a dialectic that a Marxist might relish. In 1938, there was a backlash in congressional elections against President Franklin Roosevelt’s plan to “pack” the Supreme Court. From 1938 through 1964, there never was a reliably liberal legislating majority in Congress. A coalition of Republicans and conservative, mostly Southern, Democrats held the balance of power. But Goldwater’s landslide defeat swept liberal majorities into the House of Representatives and Senate. For two years, liberalism was rampant, until the 1966 elections began to correct the partisan imbalance. During those two years, when the prestige of government was perhaps higher than ever before or since in American history, the Great Society initiatives became an exercise in political overreaching, made possible by Goldwater’s defeat. Disappointment with the results laid the predicate for Reagan’s victory.

Which was followed by President George Herbert Walker Bush’s “kinder and gentler” conservatism, then William Clinton’s centrism, then George W. Bush’s “compassionate conservatism.” And so continues an American political argument about how much government we want, and how much we are willing to pay for it in the coin of constricted freedom. That argument gathered steam when Goldwater threw down a gauntlet—this book.

Forty-seven years after the publication of The Conscience of a Conservative, Goldwater, a seasoned politician and a child of the West, probably would look equably upon America as, like Phoenix—today approaching four million people—a work forever furiously in progress. He knew that popular government rests on public opinion, which is shiftable sand. With this book, and with his public career that vivified the principles expressed herein, he shifted a lot of sand.


John F. Kennedy’s Thoughts on Death

Landing in New York on a speaking trip, the president impulsively decided not to have a motorcade into Manhattan, so his limousine stopped at ten traffic lights. At one, a woman ran to the car and snapped a photograph inches from his face. A policeman exclaimed, “Oh, my God. She could have been an assassin.” It was November 15, 1963.

On the Sunday night of October 28, 1962, at the conclusion of the Cuban Missile Crisis, John Kennedy quipped to his brother Robert, “This is the night I should go to the theater,” a reference to Lincoln’s visit to Ford’s Theatre after the Civil War was won. Thoughts of death were not new to the man whose father had medicines stored for him in banks around the world. They were to treat chronic illnesses so serious that he had been given the last rites of the Catholic Church at least three times before he became president at age forty-three.

Even if he had not gone to Dallas, he probably would have died long before now. He would have been killed partly by the horrifying cocktails of pills and injections—sometimes six Novocain shots in his back in a day; one drug drove his cholesterol count above 400—mixed by doctors sometimes unaware of what the others were administering just to keep him ambulatory and alert.

The soaring arc of Kennedy’s truncated life combined success achieved by discipline, and sexual recklessness—seventy calls through the White House switchboard to a mistress he shared with a Mafia don; said another woman, Marilyn Monroe, “I think I made his back feel better”—that risked everything.

In President Kennedy: Profile of Power, much the best book on Kennedy, Richard Reeves says that Kennedy—“very impatient, addicted to excitement, living his life as if it were a race against boredom”—was well matched to his moment. He was a man in a hurry at a time when the pulse of communication was accelerating.

In seeking the presidency, Reeves wrote, “he did not wait his turn.” One of the elders he elbowed aside, Adlai Stevenson, said, “That young man! He never says ‘please’…” When a friend urged Kennedy to wait beyond 1960, he said, “No, they will forget me. Others will come along.”

Always there was his fatalistic sense of how perishable everything was, and his ironic awareness of how nothing is what it seems—least of all himself. Campaigning in 1960 as a vessel of “vigor,” his health often forced him to spend about half of the day in bed.

The Kennedy years had, as Reeves writes, “an astonishing density of events,” from the building of the Berlin Wall to the Birmingham church bombing, and the integration of the University of Mississippi a month before the Cuban Missile Crisis. Kennedy was a quick study, with much to learn.

Astonishingly callow when inaugurated, he was unable to stem or even discern the intragovernmental delusions and deceits that propelled the Bay of Pigs invasion just eighty-seven days into his presidency. Much flowed from that debacle. Kennedy said that in order to reverse Nikita Khrushchev’s assessment of him as weak, he had to find somewhere to show U.S. resolve: “The only place we can do that is in Vietnam. We have to send more people there.” Soon he was at the Vienna summit, where Khrushchev, impervious to his charm, concluded that he was “a pygmy.”

Only foreign affairs held Kennedy’s attention. His response to the “freedom riders” who lit a fuse of the civil rights revolution was to ask his civil rights adviser, who was white, “Can’t you get your goddamned friends off those buses?” But foreign affairs were plentiful enough.

Plentiful, and a sure cure for boredom. When on May 30, 1961, Rafael Trujillo, dictator of the Dominican Republic, was assassinated, Kennedy asked Secretary of State Dean Rusk: “Were we involved?” Rusk replied: “I don’t think so. There’s some confusion.”

In 1963, too, the days were eventful. Twenty-two days after a Saigon coup encouraged by the United States—it produced regime change through the assassination of South Vietnam’s two principal leaders—and on the day a ballpoint pen containing poison intended to kill Fidel Castro was scheduled to be delivered by CIA agent Desmond Fitzgerald to a potential assassin, Kennedy awoke in Fort Worth. He was to speak there, then fly to Dallas.

Looking down from his hotel room at the platform from which he would speak, he said to an aide, “With all these buildings around it, the Secret Service couldn’t stop someone who really wanted to get you.” It was Friday, November 22, 1963.

[NOVEMBER 20, 2003]

Eugene McCarthy: The Tamarack Tree of American Politics

I love you so…. Gone? Who will swear you wouldn’t have done good to the country, that fulfillment wouldn’t have done good to you.

—ROBERT LOWELL, “For Eugene McCarthy” (July 1968)

By August 1968, Senator Eugene McCarthy was gone and his supporters were left to wonder how—whether—his fulfillment was connected to doing good to the country. When the Democratic convention nominated another Minnesotan, Hubert Humphrey—who in 1964 won the vice presidential nomination McCarthy had craved—McCarthy went to the south of France, then covered the World Series for Life magazine. Had he campaigned for Humphrey, who narrowly lost, there probably would have been no Nixon presidency.

McCarthy died last Saturday in his ninetieth year, in this city which he sometimes seemed to include in his capacious disdain but which, for a while, he leavened with a distinctive sensibility. In 1980, he endorsed Ronald Reagan, reasoning that Reagan could not be worse than Jimmy Carter. But even in 1968 he had a sometimes ill-disguised disdain for many who flocked to his diffidently unfurled banner.

Disgusted by Vietnam policy, he laconically announced himself “willing” to be an “adequate” president, and went to New Hampshire to unseat his party’s president. McCarthy got 41.9 percent of the vote. Johnson got 49.6 percent—all write-ins; his name was not on the ballot—and three weeks later withdrew from the race.

McCarthy’s 1968 achievement elevated New Hampshire’s primary to the status it has subsequently enjoyed. His death occurred the day the Democratic Party gingerly suggested modifying its primary schedule in a way that might diminish New Hampshire’s potency.

The sacramental status of Iowa’s caucuses and New Hampshire’s primary as the first two nominating events testifies to the power of the mere passage of time to sanctify the accidental, even the unreasonable. Now the Democratic Party suggests allowing one or two states to hold caucuses—not primaries—between Iowa and New Hampshire.

The case against caucuses is that they take hours, often at night, and thus disproportionately attract the ideologically fervid—not what the Democratic Party needs. The case against New Hampshire’s primary is that its power is disproportionate for a state so unrepresentative of America’s demographic complexities. The case for New Hampshire can be put in a name: Gene McCarthy. The small state gives an unknown underdog challenger, practicing retail politics, a fighting chance.

McCarthy’s insurgency, the most luminous memory of many aging liberals, would today be impossible—criminal, actually—thanks to the recent “reform” most cherished by liberals, the McCain-Feingold campaign regulations. McCarthy’s audacious challenge to an incumbent president was utterly dependent on large early contributions from five rich liberals. Stewart Mott’s $210,000 would be more than $1.2 million in today’s dollars. McCain-Feingold codifies two absurdities: Large contributions are inherently evil, and political money can be limited without limiting political speech. McCain-Feingold criminalizes the sort of seed money that enabled McCarthy to be heard. Under McCain-Feingold’s current limit of $2,100 per contributor, McCarthy’s top five contributors combined could have given just $10,500, which in 1968 dollars would have been just $1,834.30. But, then, McCain-Feingold was written by incumbents to protect what they cherish: themselves.

McCarthy first seized national attention with a theatrical act, a gesture of elegant futility. At the 1960 convention, when John Kennedy’s nomination was already certain, McCarthy delivered an eloquent philippic urging a third nomination for the man who had been trounced in 1952 and 1956, Adlai Stevenson.

Witty, elegant, and problematic, Stevenson was the intelligentsia’s darling and a harbinger of liberalism curdled by condescension toward ordinary Americans. When an aide assured Stevenson he had the votes of thinking people, Stevenson quipped: But I need a majority. A majority of the disdained?

McCarthy’s acerbic wit sometimes slid into unpleasantness, as when, after Governor George Romney, the Michigan Republican, said that briefers in Vietnam had “brainwashed” him, McCarthy said that surely a light rinse would have sufficed. McCarthy’s wit revealed an aptitude for condescension, an aptitude that charmed intellectuals but not Americans condescended to.

A talented poet, McCarthy, in his mordant “The Tamarack,” surely summarized his experience of being beaten by Robert Kennedy after New Hampshire:

The tamarack tree is the saddest tree of all;

it is the first tree to invade the swamp,

and when it makes the soil dry enough,

the other trees come and kill it.

Never mind his subsequent lackadaisical presidential campaigns. After 1968, he adhered to the fourth of the commandments in his “10 Commandments”:

Do not relight a candle

whose flame has drowned

in its own excess of wax.

[DECEMBER 13, 2005]

What George McGovern Made

The former bomber pilot’s spry walk belies his eighty-five years; he dresses like a boulevardier—gray slacks, blue blazer, shirt with bright-red stripes and a white collar—and tucks into a robust breakfast. Long ago, he began shaping the Democrats’ presidential nomination process into the one that has his party’s two contenders locked in a long march to Pennsylvania’s April primary. He has seen important aspects of American politics move in his direction in the thirty-six years since he lost forty-nine states to Richard Nixon.

The belittling of George McGovern, especially by Democrats, only waned as memory of him faded after he lost his bid for a fourth Senate term in the 1980 Reagan landslide. But his story is fascinating, and pertinent to current events.

This minister’s son was raised on South Dakota’s parched prairies during the Depression. He remembers hiking home to the town of Mitchell by following the railroad tracks in a blinding dust storm. He was only the second major-party nominee with a Ph.D. (Woodrow Wilson was the first), which he earned at Northwestern University under Arthur Link, Wilson’s foremost biographer.

Like Wilson, also a minister’s son, McGovern was a political moralist. And he was a tenacious politician, who, inspired by the untenacious Adlai Stevenson’s presidential campaign the year before, went to work for the South Dakota Democratic Party in 1953, when it held only 2 of 110 seats in the state legislature. Just four years later, McGovern was in Congress, where his first roll-call vote was in opposition to granting President Eisenhower broad authority for military intervention in the Middle East.

In tumultuous 1968, with the Tet Offensive and two assassinations (of Martin Luther King and Robert Kennedy) in five months, two insurgent candidates, Eugene McCarthy and Robert Kennedy, sought the Democratic nomination. It was won by Vice President Hubert Humphrey, who competed in no primaries. More than one-third of the delegates to the riotous convention in Chicago had been selected in 1967, months before President Lyndon Johnson decided to retire.

McGovern was named chairman of a commission to reform the nomination process, which put the party on a path to the proliferation of caucuses and primaries allocating delegates proportionally rather than winner-take-all—the long, winding path Obama and Clinton are on. In 1972, McGovern became the first winner under the democratized process. Then he was buried by the demos, Nixon vs. McGovern.

Nixon was, McGovern notes, running nationally for the fifth time (only FDR had done that) and was at his pre-Watergate apogee, fresh from the opening to China and a strategic-arms agreement with Moscow. McGovern was bitterly opposed all the way to the Miami convention by the Democratic constituencies he was displacing. He says Barry Goldwater had warned him, “Don’t get fatigued,” but he reached Miami exhausted, lost control of the convention (he delivered his acceptance speech at 2:30 a.m.), and disastrously selected a running mate, Missouri senator Tom Eagleton, who did not disclose previous psychiatric problems and was forced off the ticket.

Still, McGovern thinks he could have won with a running mate then called “the most trusted man in America”—Walter Cronkite. Before choosing Eagleton, McGovern considered asking Cronkite, who recently indicated he would have accepted.

Bruce Miroff, a political scientist and an admirer of McGovern, argues in his new book, The Liberals’ Moment: The McGovern Insurgency and the Identity Crisis of the Democratic Party, that although McGovern’s domestic proposals featured redistributions of wealth, this was Ivy League, not prairie, populism. Branded the candidate of “acid, amnesty, and abortion” (the Democrats’ platform, adopted six months before the Supreme Court in Roe v. Wade legislated a liberal abortion policy, did not mention abortion), McGovern became the first candidate since the New Deal to lose the Catholic and labor union vote. So 1972, more than 1968, was the hinge of the party’s history. In 1972, Miroff writes, “college-educated issue activists” supplanted the “labor/urban machine coalition.”

George Meany, head of the AFL-CIO, had dropped out of high school at age fourteen. Speaking about McGovern’s 1972 convention, where 39 percent of the delegates had advanced degrees, he said: “We heard from people who looked like Jacks, acted like Jills, and had the odor of Johns about them.” The Reagan Democrats of 1980 were incubated eight years earlier.

McGovern won only 14 percent of Southern white Protestants. This, Miroff notes, made Democrats susceptible four years later to the appeal of a pious Southerner. Thus did a disaster compound itself.

In September 1963, McGovern became the only senator who opposed U.S. involvement in Vietnam during the Kennedy administration. He came by his horror of war honorably in thirty-five B-24 missions over Germany, where half the B-24 crews did not survive—they suffered a higher rate of fatalities than did Marines storming Pacific islands. McGovern was awarded a Distinguished Flying Cross with three oak-leaf clusters. In his seventies, he lost a forty-five-year-old daughter to alcoholism. Losing a presidential election, he says softly, “was not the saddest thing in my life.” Time confers a comforting perspective, giving consolations to old age, which needs them.

[FEBRUARY 25, 2008]

Daniel Patrick Moynihan: The Senate’s Sisyphus

Many of America’s largest public careers have been those of presidents. Many, but by no means all. Chief Justice John Marshall was more consequential than all but two presidents—Washington and Lincoln. Among twentieth-century public servants, General George Marshall—whose many achievements included discerning the talents of a Colonel Eisenhower—may have been second in importance only to Franklin Roosevelt. And no twentieth-century public career was as many-faceted, and involved so much prescience about as many matters, as that of Daniel Patrick Moynihan, who died Wednesday at seventy-six.

He was born in Tulsa but spent his formative years on Manhattan’s Lower East Side, from which he rose to Harvard’s faculty and the administrations of Presidents Kennedy, Johnson, Nixon, and Ford, serving as, among other things, ambassador to India and the U.S. representative at the United Nations. Then four Senate terms. Along the way he wrote more books than some of his colleagues read, and became something that, like Atlantis, is rumored to have once existed but has not recently been seen—the Democratic Party’s mind.

His was the most penetrating political intellect to come from New York since Alexander Hamilton, who, like Moynihan, saw over the horizon of his time, anticipating the evolving possibilities and problems of a consolidated, urbanized, industrial nation. A liberal who did not flinch from the label, he reminded conservatives that the Constitution’s framers “had more thoughts about power than merely its limitation.”

But he was a liberal dismayed by what he called “the leakage of reality from American life.” When in 1994 the Senate debated an education bill, Moynihan compared the legislation’s two quantifiable goals—a high school graduation rate of “at least 90 percent” by 2000, and American students “first in the world in mathematics and science”—to Soviet grain production quotas.

The Senate’s Sisyphus, Moynihan was forever pushing uphill a boulder of inconvenient data. A social scientist trained to distinguish correlation from causation, and a wit, Moynihan puckishly said that a crucial determinant of the quality of American schools is proximity to the Canadian border. The barb in his jest was this: High cognitive outputs correlate not with high per-pupil expenditures but with a high percentage of two-parent families. For that, there was the rough geographical correlation that caused Moynihan to suggest that states trying to improve their students’ test scores should move closer to Canada.

For calling attention, four decades ago, to the crisis of the African-American family—26 percent of children were being born out of wedlock—he was denounced as a racist by lesser liberals. Today the percentage among all Americans is 33, among African-Americans 69, and family disintegration, meaning absent fathers, is recognized as the most powerful predictor of most social pathologies.

At the U.N., he witnessed that institution’s inanity (as in its debate about the threat to peace posed by U.S. forces in the Virgin Islands, at that time fourteen Coast Guardsmen, one shotgun, one pistol) and its viciousness (the resolution condemning Zionism as racism). Striving to move America “from apology to opposition,” he faulted U.S. foreign policy elites as “decent people, utterly unprepared for their work.”

Their “common denominator, apart from an incapacity to deal with ideas, was a fear of making a scene, a form of good manners that is a kind of substitute for ideas.” Except they did have one idea, that “the behavior of other nations, especially the developing nations, was fundamentally a reaction to the far worse behavior of the United States.”

Moynihan carried Woodrow Wilson’s faith in international law, but he had what Wilson lacked—an understanding that ethnicity makes the world go ’round. And bleed. The persistence of this premodern sensibility defeats what Moynihan called “the liberal expectancy.” He meant the expectation that the world would become tranquil as ethnicity and religion became fading residues of mankind’s infancy.

Moynihan’s Senate campaigns were managed by as tough-minded and savvy a pol as New York’s rough-and-tumble democracy has ever produced, a person who also is a distinguished archaeologist—his wife, Elizabeth. In his first campaign, in 1976, Moynihan’s opponent was the incumbent, James Buckley, who playfully referred to “Professor Moynihan” from Harvard. Moynihan exclaimed with mock indignation, “The mudslinging has begun!”

His last home was an apartment on Washington’s Pennsylvania Avenue. That “Avenue of Presidents” was transformed from tattiness to majesty and vibrancy by three decades of his deep reflection about, and persistent insistence on, proper architectural expressions of the Republic’s spiritedness and reasonableness, virtues made wonderfully vivid in the life of Daniel Patrick Moynihan.

[MARCH 27, 2003]

John Kenneth Galbraith’s Liberalism as Condescension

John Kenneth Galbraith, the Harvard economist who died last week in his ninety-eighth year, has been justly celebrated for his wit, fluency, public-spiritedness, and public service, which extended from New Deal Washington to India, where he served as U.S. ambassador. Like two Harvard colleagues—historian Arthur Schlesinger Jr. and Senator Pat Moynihan, another ambassador to India—Galbraith was among liberalism’s leading public intellectuals, yet he was a friend and skiing partner of William F. Buckley. After one slalom down a Swiss mountain, inelegantly executed by the six-foot-eight Galbraith, Buckley asked how long Galbraith had been skiing. Thirty years, he said. Buckley mischievously replied: About as long as you have been an economist.

Galbraith was an adviser to presidents (John Kennedy, a former student, and Lyndon Johnson) and presidential aspirants (Adlai Stevenson and Eugene McCarthy). His book The Affluent Society, published in 1958, was a milestone in liberalism’s transformation into a doctrine of condescension. And into a minority persuasion.

In the 1950s, liberals were disconsolate. Voters twice rejected the intelligentsia’s pinup, Stevenson, in favor of Dwight Eisenhower, who elicited a new strain in liberalism—disdain for average Americans. Liberals dismissed the Eisenhower administration as “the bland leading the bland.” They said New Dealers had been supplanted by car dealers. How to explain the electorate’s dereliction of taste? Easy. The masses, in their bovine simplicity, had been manipulated, mostly by advertising, particularly on television, which by 1958 had become the masses’ entertainment.

Intellectuals, that herd of independent minds, were, as usual, in lockstep as they deplored “conformity.” Fear of that had begun when the decade did, with David Riesman’s The Lonely Crowd (1950), which was followed by C. Wright Mills’s White Collar (1951), Sloan Wilson’s novel The Man in the Gray Flannel Suit (1955), William Whyte’s The Organization Man (1956), and Vance Packard’s The Hidden Persuaders (1957).

Galbraith brought to the anticonformity chorus a special verve in depicting Americans as pathetic, passive lumps, as manipulable as clay. Americans were what modern liberalism relishes—victims, to be treated as wards of a government run by liberals. It never seemed to occur to Galbraith and like-minded liberals that ordinary Americans might resent that depiction and might express their resentment with their votes.

Advertising, Galbraith argued, was a leading cause of America’s “private affluence and public squalor.” By that he meant Americans’ consumerism, which produced their deplorable reluctance to surrender more of their income to taxation, trusting government to spend it wisely.

If advertising were as potent as Galbraith thought, the advent of television—a large dose of advertising, delivered to every living room—should have caused a sharp increase in consumption relative to savings. No such increase coincided with the arrival of television, but Galbraith, reluctant to allow empiricism to slow the flow of theory, was never a martyr to Moynihan’s axiom that everyone is entitled to his own opinion but not to his own facts.

Although Galbraith coined the phrase “conventional wisdom,” and thought of himself as the scourge of groupthink, The Affluent Society was the distilled essence of the conventional wisdom on campuses. In the 1960s, that liberalism became a stance of disdain, describing Americans not only as Galbraith had, as vulgar, but also as sick, racist, sexist, imperialist, etc. Again, and not amazingly, voters were not amused when told that their desires—for big cars, neighborhood schools, and other things—did not deserve respect.

But for liberals that was precisely the beauty of Galbraith’s theory. If advertising could manufacture demands for whatever corporations wanted to supply, there was no need to respect markets, which bring supply and demand into equilibrium.

The Affluent Society was the canonical text of modern liberalism’s disparagement of the competence of the average American. This liberalism—the belief that people are manipulable dolts who need to be protected by their liberal betters from exposure to “too much” advertising—is one rationale for McCain-Feingold. That law regulating campaigns embodies the political class’s belief that it knows just the right amount of permissible political speech.

Of course if advertising really could manufacture consumer wants willy-nilly, few new products would fail. But many do. The Affluent Society, postulating the awesome power of manufacturers to manufacture whatever demand they find it convenient to satisfy, was published nine months after Ford Motor Company put all of its marketing muscle behind a new product, the Edsel.

Small wonder that a conservative wit has surmised that the wisdom of economists varies inversely with their heights. Milton Friedman, ninety-three, is five feet tall.

[MAY 4, 2006]

Milton Friedman: Ebullient Master of the Dismal Science

At Oxford University in 1962, a small coterie of students, mostly Americans, merrily rowed against the leftist political currents predominant among intellectuals everywhere. Some of these rowers had been at the University of Chicago, others had come within the ambit of people from there, and all of us were infused with the doctrines of laissez-faire political economy prevalent in that university’s economics department.

A conservative member of Parliament, meeting with these free-market firebrands, began by saying, “Well, presumably we agree that at least the roads should be owned by the government.” The group greeted with stony silence this heresy against limited—very limited—government.

In one of the group’s favorite periodicals—the New Individualist Review, published at the University of Chicago—a theorist argued that the government must own lighthouses because no market mechanism could price a lighthouse’s service. That provoked this spirited rebuttal: When the light sweeps the ocean’s surface, it improves the surface, which becomes the property of the lighthouse owner, who can charge whatever the market will bear for ships to cross the illuminated surface.

Ah, but does the property right lapse when fog obscures the beam of light? Hairs were split as ideological purity hung in the balance.

These contumacious students were, as students frequently are, inebriated by ideas to the point of silliness. But they were early acolytes of the extraordinary man who, as he celebrates his ninetieth birthday this month, merits celebration as America’s most consequential public intellectual of the twentieth century.

By 1962, when he published his great manifesto Capitalism and Freedom, Milton Friedman, then a University of Chicago economist, had done much of the scholarly work for which he would receive the 1976 Nobel Prize in economics. He has been the foremost champion of “monetarism,” the theory that money supply and interest rates can do more than government fiscal policy (“demand management” and other Keynesian fine-tuning measures) to control business cycles. The theory that stable growth of the money supply can control inflation and moderate recessions is an important ingredient in the recipe for modest government.

Capitalism and Freedom inserted into political discourse such (then) novel ideas as flexible exchange rates, a private dimension of Social Security, tuition vouchers to empower parents with school choice, and a flat income tax. Gary Becker (Nobel Prize, 1992), Friedman’s colleague at the University of Chicago and the Hoover Institution, notes that when Friedman began arguing the case, most nations had top tax rates of at least 90 percent (91 percent in America). Today most top rates are 50 percent or less, so the world has moved far toward Friedman’s position.

Friedman was a charter member of the most influential society you have never heard of—the Mont Pelerin Society, named after the Swiss community where this association of laissez-faire thinkers first gathered in 1947. Its animating spirit was Friedrich Hayek, soon to be at the University of Chicago. In 1955, Hayek prompted the founding of the like-minded Institute of Economic Affairs in London, which around 1962 caught the attention of a junior member of Parliament who seventeen years later brought Friedman’s monetarism and respect for markets into No. 10 Downing Street—Margaret Thatcher.

Many intellectuals disdain the marketplace because markets function nicely without the supervision of intellectuals. Their disdain is ingratitude: The vulgar (as intellectuals see them) people who make markets productive make the intellectual class sustainable. As another of Friedman’s Chicago colleagues, George Stigler (Nobel Prize, 1982), says, “Since intellectuals are not inexpensive, until the rise of the modern enterprise system, no society could afford many intellectuals.” So “we professors are much more beholden to Henry Ford than to the foundation which bears his name and spreads his assets.”

Economics is not the “dismal science” when infused with Friedman’s ebullient spirit and expressed in his sprightly prose. So as President George W. Bush said in May when honoring Friedman, it was fortunate for the nation and the world that Friedman “flunked some of his qualifying exams to become an actuary and became an economist instead.”

John Maynard Keynes, whose preeminence among economists Friedman eclipsed, said the world is mostly ruled by the ideas of economists and political philosophers: “Practical men, who believe themselves to be quite exempt from any intellectual influences, are usually the slaves of some defunct economist.” But Friedman is far from defunct as he strides jauntily into his tenth decade, still an intellectual dynamo.

Adam Smith, whose banner Milton Friedman has borne high, said, “There is a great deal of ruin in a nation.” There is much less of it in ours than there would have been were it not for Milton Friedman.

[JULY 14, 2002]

Alan Greenspan: High-Achieving Minimalist

Frequently the fate—gratifying, yet melancholy—of consequential public persons is this: They so transform an ominous social landscape that, by the time they leave the public stage, the public no longer remembers the banished dangers, and hence cannot properly value the banisher. So as Alan Greenspan heads to the end, in January, of more than eighteen years as head of the Federal Reserve, recall that thirty years ago the intelligentsia worried that democracies, including this one, had become “ungovernable.”

The worrying was caused by inflation, then thought to be the systemic disease of democracies. The theory was that democratic electorates would reward governments that delivered the pleasure of public spending and would punish those that inflicted the pain of taxation sufficient to pay for that spending. Hence democracies would run chronic deficits. These, it was assumed, both caused inflation and gave government a powerful incentive to tolerate inflation as a means of steadily reducing the real value of its debts—inflation as slow-motion repudiation.

Furthermore, deficit spending—giving the public a dollar’s worth of government goods and services and charging the public only, say, eighty cents for them—produced big government and, by making big government inexpensive, reduced public resistance to making it even bigger. And because of affluent voters’ low and steadily lowering pain thresholds, democracies would not tolerate the discomforts associated with wringing inflation out of the economy.

That supposition was slain by a fact: President Reagan and Paul Volcker, Greenspan’s predecessor, put the country through the rigors of wringing inflation out of the economy, and in 1984 Reagan carried forty-nine states. Between 1945 and 1982 the economy was in recession 22.4 percent of the time. In the 276 months since the recession ended in 1982, it has been in recession fourteen months—just 5.1 percent of the time.

Because of Americans’ low pain threshold, the Reagan-Volcker recession was considered hideous. It was the worst since the Depression, but the economy contracted less than 3 percent. In the years between 1890 and 1945, America’s period of hard learning about managing an industrial economy, three times there were contractions of 5 percent, twice contractions of 10 percent, and twice contractions of 15 percent.

Since 1945, and especially since 1982, we have learned the real secret of managing the economy: Do not try to manage it. If you refrain from trying to “fine-tune” business cycles, the cycles will be less frequent and less severe.

Greenspan’s tenure has illustrated an axiom to which his successor, Ben Bernanke, should subscribe: Minimalist missions by government produce maximum results. He has not defined the Fed’s primary purpose as achieving this or that level of employment or economic growth. Rather, its mission is to preserve the currency as a stable store of value—to control inflation. However, Greenspan’s impeccable credentials as an inflation fighter have enabled him to keep inflation low even during periods of very low unemployment without kindling inflationary expectations, which can be self-fulfilling.

America’s economy is so dynamic that in any five-year period, approximately 45 percent of Americans move from one income quintile to another. Twenty percent move up from the bottom quintile in any twelve-month period, and 40 percent to 50 percent move up over ten to twenty years. Because of the constant transformation of dynamic economies, the study of economics has become a science of single instances. Its practitioners are constantly in uncharted waters, reasoning inferentially. Just as astronomers inferred the existence of Pluto from the behavior of known planets, Greenspan inferred a rate of productivity growth higher than most estimates because inflation and unemployment were falling simultaneously.

The Federal Reserve system—to give the devil his due, it is one of Woodrow Wilson’s unregrettable undertakings—annoys some populists who think every U.S. senator and representative should write on his or her bathroom mirror, and read every morning, this thought: “The Fed is a creature of Congress.” Indeed, Congress made it and could dictate to it—could dictate interest rates and the money supply. A terrifying thought.

Greenspan’s famously, at times hilariously, circumspect rhetoric has been prudent because some word, or inflection, or even arched eyebrow could have caused vast sums to slosh in this or that direction in capital markets. His rhetorical style—or perhaps antistyle—is a high-stakes illustration of Voltaire’s idea that men use speech to conceal their thoughts.

Greenspan’s wife has said he had to propose marriage three times before she understood what he was saying. And he was being droll when he said—if he said it; apocrypha collect around legends—that “if I have made myself clear I have misspoken.” His achievements speak clearly for him.

[OCTOBER 25, 2005]

The Not-at-All Dull George Washington

Tonight, after you have given up trying to get the mustard stains off the dog, and after you have treated the fingers singed by sparklers, pour a beer—a Sam Adams would be apposite—and settle down to watch on PBS the documentary biography of the man most responsible for there being an Independence Day. Ninety minutes later, Richard Brookhiser’s Rediscovering George Washington will have convinced you that its subject, whom many historians have managed to mummify into dullness, may have been the most interesting and indispensable American.

“Indispensable”? Advanced thinkers instruct us that we are not supposed to believe anyone is more important than anyone else. Not democratic. We are supposed to prefer “history from below,” meaning “history with the politics left out,” explaining the past not with reference to event-making individuals, but in terms of the holy trinity of today’s obsessions—race, gender, class.

But Brookhiser begins his film standing at Yale in front of a portrait of Washington and says: “An empire might break on that forehead.” Then he explains why an empire did: Washington’s character.

Character, particularly that of someone dead two centuries, is difficult for a camera to capture. Besides, Washington’s army, unlike Caesar’s or Napoleon’s, lost more battles than it won. It is said that America won the war because of its superior retreats, such as the one after the British landed an army on Long Island—an army larger than the population of New York City. Still, Brookhiser’s script summons Washington to life.

Brookhiser’s camera takes us to the Brooklyn scene—now an auto body shop—where brave Marylanders enabled Washington to escape to fight another day. Washington’s next defeat occurred at what is now Thirty-fourth Street and Lexington Avenue. Brookhiser refutes the myth that Washington’s troops usually fought “Indian style,” in forest skirmishes and ambushes. As in most eighteenth-century battles, close combat—the bayonet—caused most casualties.

Washington understood that the creation of a nation depended on creating a regular army that could slug it out with Britain’s. The tide began to turn in New Jersey, at Monmouth.

The most powerful person in American history, Washington had less formal education than any subsequent president, other than Lincoln. This six-foot-three leader of soldiers who averaged five-foot-eight was charismatic before the term was coined. Abigail Adams, no swooning teenager, described Washington with lines from Dryden: “Mark his majestic fabric. He’s a temple sacred from his birth and built by hands divine.”

He was not the only one of that era who learned showmanship from studying theater and popular entertainments such as Punch and Judy puppet shows. Joseph Addison’s play Cato was the source of Nathan Hale’s dying words (“I only regret that I have but one life to lose for my country”) and Patrick Henry’s “Give me liberty, or give me death!” In the play, Cato held mutinous officers in line by force of personality, as Washington was to do at Newburgh, New York.

Brookhiser, a senior editor of National Review magazine and frequent contributor to other magazines and to C-SPAN, is also known for his slender, sprightly biographies of Washington, Alexander Hamilton, and the Adams dynasty. The longest, on Hamilton, runs just 240 pages. His next subject will be Gouverneur Morris. Brookhiser’s capsule summary of Morris is characteristically pithy: “Peg-legged ladies’ man who polished the Constitution’s language.”

Brookhiser does not write “pathographies”—biographies that present historic figures as the sum of their pathologies. His Washington adopted a noble character, then grew into it. Intensely interested in manners, Washington pioneered a civic etiquette suitable for a democracy in which preeminence was to be based on behavior, not birth.

And of the nine presidents who owned slaves, only Washington freed his at his death. Brookhiser visits a family reunion of descendants of those slaves—one of whom is a guide at Mount Vernon. Congress wanted Washington’s body to rest in a room beneath the Capitol rotunda, like Napoleon’s in the Invalides. But Washington’s remains are at Mount Vernon. So, Brookhiser says:

“The Capitol is not his tomb but the people’s house. This reflects his wishes and the goal of his life. Washington wanted to establish a government that would prove that mankind was not made for a master—not even the mastership of a hero’s memory.”

“They wanted me to be another Washington,” whined Napoleon in his exile, as stunned as the rest of the world by Washington’s voluntary yielding of power. The final component of Washington’s indispensability was the imperishable example he gave by proclaiming himself dispensable.

[JULY 4, 2002]

George Washington’s Long Journey Home

His goal, 220 years ago, was to sleep Christmas Eve in his six-foot six-inch bed at his Virginia home on the bank of the Potomac. It would be nicer than some other recent Christmas Eves.

Such as in 1776, when he led soldiers across Delaware River ice floes to one of his greatest—and, truth be told, relatively few—victories, at Trent Town, as Trenton was then known. Only four Americans died that night, two—probably shoeless—from frostbite.

George Washington spent Christmas Eve 1777 with an army leaving bloody footprints in the Valley Forge snow. Six years later, he was heading to a home he had left in 1775 to lead farmers and shopkeepers against the British Empire.

Since Yorktown, Washington, like his embryonic nation, had lived in a peculiar limbo as negotiators, two months’ travel away in Paris, codified peace with Britain. In late November, from headquarters along the Hudson River north of Manhattan island, he began his trek from strenuous public service into a placid future of private enjoyments, or so he thought.

His journey was through a nation deep in the throes—it would be in them for many years—of regime change. To the extent that there was a national regime, he was it, and he was retiring.

In June 1783, Congress had fled Philadelphia, going to ground in Princeton, New Jersey, to escape a mutiny of unpaid soldiers. Congress was a place of empty palaver by representatives of states that retained virtually untrammeled sovereignty. By July 1783, with Congress sitting in Princeton, only South Carolina—seven decades later, it would be the least cooperative state—had paid its full assessment to the national treasury. Of the other twelve, only Washington’s Virginia had contributed half its quota. The two weightiest states, Pennsylvania and New York, had contributed one-fifth and one-twentieth, respectively.

What united the barely united states was six feet three inches of American in a blue coat and buff trousers, carrying a sword and buckle engraved “1757” that testified to his frontier service for the British against the French, whose fleet, twenty-four years later, sealed the victory at Yorktown. If on his trip home this fifty-one-year-old man had caught a chill and died, as he would do sixteen Decembers later, national unity might have been unattainable.

The story of his triumphal trip home, itself an act of nation building, is well told by historian Stanley Weintraub in his new book General Washington’s Christmas Farewell: A Mount Vernon Homecoming, 1783. It evokes the frail seedling from which the mighty American nation grew. In a seven-year (1775-1781) war in which fewer than forty-five hundred American soldiers died in combat, Washington lost more battles than he won. But he won the battle that mattered most—the last one—and adulation unlike any ever bestowed on an American.

His homeward journey paused at Harlem, a Manhattan village nine miles north of New York City, then a community of twenty-one thousand on the island’s southern tip that Washington had never recaptured. As Washington’s party entered the city, Loyalist emigrants were being ferried to departing British ships in the harbor. A British officer marveled:

“Here, in this city, we have had an army for more than seven years, and yet could not keep the peace of it. Scarcely a day or night passed without tumults. Now we are gone, everything is in quietness and safety. These Americans are a curious, original people; they know how to govern themselves, but nobody else can govern them.”

Then it was four days to Philadelphia, passing along what is now U.S. Route 1 through difficult New Jersey. In 1776, Washington had urged Jerseymen in the village of Newark to join his cause. Thirty did—but three hundred joined the British. In Annapolis, he surrendered his commission after a ball at which, Weintraub reports, fashionable ladies wore their hair in the dress à l’indépendance—thirteen curls at the neck.

Washington’s journey to Mount Vernon, which he reached after dark, December 24, was a movable feast of florid rhetoric and baked oysters. It also was a foretaste of what was to be, for more than a century, his central place in America’s civic liturgy. Abraham Lincoln wore a ring containing a sliver from the casket in which Washington lay until his body was moved to its current tomb in 1831. At his inauguration in 1897, William McKinley wore a ring containing strands of Washington’s hair. Presidents no longer inspire such reverence, perhaps because America is different, perhaps because presidents are.

[DECEMBER 25, 2003]

John Marshall: The Most Important American Never to Have Been President

A nation’s identity consists of braided memories, which are nourished by diligence at civic commemorations. It is, therefore, disappointing that at this moment of keen interest in the Supreme Court and the office of chief justice, scant attention has been paid to the 250th anniversary of the birth of the nation’s greatest jurist, Chief Justice John Marshall.

The oldest of the family’s fifteen children, he was born September 24, 1755, into Virginia rusticity where women pinned their blouses with thorns. Yet he developed the most urbane and subtle mind of that era of remarkable statecraft. He was a member of Virginia’s ratifying convention, and in nearly thirty-five years as chief justice he founded American constitutional law. That kind of legal reasoning by Supreme Court justices is a continuous exegesis of the Constitution and is sometimes not easily distinguished from a continuing writing of the document.

Marshall is the most important American never to have been president. Because of his shaping effect on the soft wax of the young republic, his historic importance is greater than that of all but two presidents—Washington and Lincoln. Without Marshall’s landmark opinions defining the national government’s powers, the government Washington founded might not have acquired competencies—and society might not have developed the economic sinews—sufficient to enable Lincoln to preserve the Union.

Article I, Section 8, enumerates Congress’s powers, and then empowers Congress “to make all laws which shall be necessary and proper for carrying into execution the foregoing powers.” Marshall’s capacious construction of the “necessary and proper” clause shaped the law, and the nation’s consciousness of itself.

Did Congress have the power—unenumerated but implied—to charter a national bank? In 1819, forty-two years before Lincoln grappled with unprecedented exigencies, Marshall ruled:

“Throughout this vast republic, from the St. Croix to the Gulph of Mexico, from the Atlantic to the Pacific, revenue is to be collected and expended, armies are to be marched and supported. The exigencies of the nation may require that the treasure raised in the north should be transported to the south…. Is that construction of the constitution to be preferred which would render these operations difficult, hazardous, and expensive?”

Two years later he held that “we are one people” in war, in making peace, and—third, but not of tertiary importance—in “all commercial regulations.” The Framers’ fundamental task was to create a federal government with powers impervious to encroachments by the states. The Framers had been frightened by the states’ excesses in using political power on behalf of debtors against creditors and to limit competition by mercantilistic practices such as granting monopolies. Marshall made constitutional law a bulwark of the sanctity of contracts, the bedrock of America’s enterprise culture. And by protecting the private rights essential to aspirational individualism, Marshall’s court legitimized an inequality—not of opportunity but of outcomes—compatible with a republic’s values.

When in 1801 Marshall was nominated to be chief justice—one of the last things, and much the best thing, President John Adams did—the nation still largely had an Articles of Confederation mentality. Formally, it was a nation; emotionally—hence, actually—it was still in many ways many countries, most states being older than and more warmly embraced than the nation. Marshall’s jurisprudence built the bridge to 1862, the year it became clear that many men would have to die in a protracted conflict to preserve the Union, and that many would be willing to do so.

Marshall had been willing to die to help midwife the nation’s birth, seeing much hard action during the Revolutionary War. Amiably sociable and broadly tolerant, he had friends of vastly different political persuasions, and the only adversary he seems to have steadily disliked was a second cousin named Thomas Jefferson, in part because of Jefferson’s partisan criticisms of Washington, whom Marshall celebrated, in an immense biography, as the symbol of a national identity transcending state loyalties.

Among the many recent fine biographies of America’s Founders, none is finer than Jean Edward Smith’s John Marshall: Definer of a Nation (1996). Smith locates Marshall’s greatness in this fact: Unlike Britain’s constitutional documents, which are political documents that it is Parliament’s prerogative to construe, the U.S. Constitution is a legal document construed by courts, not Congress. When judicial supervision of our democracy seems tiresome, consider the alternative.

Marshall’s life of strong, consequential prose had, Smith writes, a poetic coda. Marshall died in Philadelphia, birthplace of the Constitution into which he breathed so much strength and meaning. The Liberty Bell, while tolling his death, cracked. It never rang again.

[SEPTEMBER 25, 2005]

James Madison: Well, Yes, of Course

In 2007, a Princeton Alumni Weekly panel voted—unanimously—that James Madison was the university’s most influential graduate. I was asked to write something about that for the magazine.

I am writing this wee tribute to the greatest Princetonian on a morning which began, as most of my mornings do, with a predawn walk accompanied by my dog. His name is Madison. I am wearing my favorite necktie. It is blue, with silver profiles of James Madison. Later this morning, I shall work on a book I am writing. It is to be titled “The Madisonian Persuasion.” I am not one who needs to be persuaded that Madison merits being ranked as Princeton’s greatest gift to the nation.

Before I turned to journalism—or before I sank to journalism, as my father, a professor of philosophy, put it—I was, briefly, a professor of political philosophy. I cheerfully accepted that I would never be nearly as original and consequential as the philosophic Madison had been. Then I became a newspaper columnist, in which role I have always known that I could never be nearly as original and consequential as was Madison, America’s foremost columnist.

The Federalist Papers, of which Madison wrote the two most important, were, of course, columns written to advance the ratification of the Constitution, in the drafting of which Madison was the most subtle participant. If a student of American thought fully unpacks the premises and implications of Federalist 10 and 51, that student comprehends not only this nation’s political regime but also the Madisonian revolution in democratic theory.

Before Madison, almost all political philosophers who thought about democracy thought that if—a huge if, for most of them—democracy were to be feasible, it would be so only in a small, face-to-face society, such as Pericles’ Athens or Rousseau’s Geneva. This was supposedly true because the bane of democracies was thought to be self-interested factions, and only a small society could be sufficiently homogenous to avoid ruinous factions.

But America in the second half of the eighteenth century, although small compared with what it would become, was in size already a far cry from a Greek polis. Besides, Americans had spacious aspirations. A small nation? They were having none of that. At a time when 80 percent of them lived on a thin sliver of the eastern fringe of the continent, within twenty miles of Atlantic tidewater, what did they call their political assembly? The Continental Congress. They knew, more or less, where they were going: California.

Madison understood the need for philosophic underpinnings for an “extensive republic,” a phrase that seemed oxymoronic to others. He can be said to have had a political catechism, which went approximately like this:

What is the worst outcome of politics? Tyranny.

To what form of tyranny are democracies susceptible? The tyranny of a single, durable majority.

How can this threat be minimized? By a saving multiplicity of factions, so that majorities will be unstable and transitory.

Hence, in Federalist 10, he wrote that “the first object of government” is “the protection of different and unequal faculties of acquiring property.” From these differences arise different factions in their freedom-preserving multiplicity.

Having said in Federalist 10 that “neither moral nor religious motives can be relied on as an adequate control” of factions, Madison turned, in Federalist 51, to the institutional controls established in the Constitution—“this policy of supplying, by opposite and rival interests, the defect of better motives”:

“Ambition must be made to counteract ambition…. It may be a reflection on human nature, that such devices should be necessary to control the abuses of government. But what is government itself but the greatest of all reflections on human nature? If men were angels, no government would be necessary. If angels were to govern men, neither external nor internal controls on government would be necessary…. You must first enable the government to control the governed; and in the next place, oblige it to control itself.” In a masterpiece of understatement, Madison said, “Enlightened statesmen will not always be at the helm.” No kidding. And Madison did not mince words regarding those about whom no one nowadays dares to say a discouraging word—“the people.”

“There is,” he said, “a degree of depravity in mankind which requires a certain degree of circumspection and distrust.” Let the record show that once we had a president who had spoken of the voters’ depravity. Those were the days. Madison did qualify his astringent judgment about the people by acknowledging that there “are other qualities in human nature which justify a certain portion of esteem and confidence.” But notice the carefully measured concession to the public’s sensibilities: “a certain portion,” indeed.

In 1976, the nation’s bicentennial, the presidency was won by someone who, pandering in the modern manner, promised to deliver government “as good as the people themselves.” One can imagine Madison muttering, “Good grief!”

In Washington, the seat of the government that he did so much to design, there is no monument to Madison comparable to the glistening marble temples honoring Jefferson and Lincoln. There is, however, a splendid Madison Building, which is part of the Library of Congress. It was said of the five-foot-four Madison that he contained an astonishing ratio of mind to mass. So it is altogether right that Madison, physically the smallest of the Founders, is honored in his nation’s capital by a repository of learning.

My next dog, if female, will be named Dolley.

[JANUARY 23, 2008]

Longfellow: A Forgotten Founder

One hundred years ago, February 27 was enlivened by events around the nation commemorating what had happened one hundred years before that, in 1807. But last week’s bicentennial of the birth of Henry Wadsworth Longfellow passed largely unnoted, which is noteworthy. It was, naturally, a poet (Shelley) who declared that “poets are the unacknowledged legislators of the world.” Wishful thinking, that, but Plato took poets so seriously as disturbers of the peace that he wanted them expelled from his republic. And Longfellow was, in a sense, an American Founder, a maker of this Republic’s consciousness.

Time was, children learned—in schools; imagine that—the origins of what still are familiar phrases: “Ships that pass in the night,” “Life is real! Life is earnest!” “footprints on the sands of time,” “the patter of little feet,” “the forest primeval,” “Let the dead Past bury its dead!” “In this world a man must either be anvil or hammer,” “Into each life some rain must fall.” Even the first stanza of Longfellow’s serene “The Village Blacksmith”—

Under the spreading chestnut tree,

The village smithy stands

—has a haunting, sinister echo in George Orwell’s 1984. Winston Smith, distraught, thinks he hears a voice singing

Under the spreading chestnut-tree

I sold you and you sold me.

Longfellow was a gifted versifier, and today is dismissed as only a versifier. Well, as Cézanne supposedly said of Monet, “He is only an eye—but what an eye!”

Longfellow was very Victorian—sentimental and moralistic. He in no way foreshadowed twentieth-century poetry’s themes of meaninglessness (“I have measured out my life with coffee spoons”—T. S. Eliot, 1917) and social disintegration (“the blood-dimmed tide is loosed”—William Butler Yeats, 1921). Longfellow wrote for a young nation that was thinking “Let us, then, be up and doing, with a heart for any fate,” before he wrote that exhortation.

He aimed to shape the nation’s identity by making Americans aware of the first European settlers (“Why don’t you speak for yourself, John?”—“The Courtship of Miles Standish”), the Native Americans they displaced (“By the shore of Gitche Gumee”—“The Song of Hiawatha”) and the nation’s birth (“Listen, my children, and you shall hear / Of the midnight ride of Paul Revere”).

“Paul Revere’s Ride” was written in 1860, as events were mocking Longfellow’s great national poem (“The Building of the Ship,” 1849):

Cedar of Maine and Georgia pine

Here together shall combine.

A goodly frame, and a goodly fame,

And the UNION be her name!

Sail on, O UNION, strong and great!

Humanity with all its fears,

With all the hopes of future years,

Is hanging breathless on thy fate!

Longfellow’s civic purposes made him a public figure, the nation’s first literary celebrity. His image decorated cigar boxes and beer-bottle labels. He kept a supply of autographed cards for the many strangers who made pilgrimages to his Cambridge house, where George Washington had lived during the siege of Boston.

Not long ago there still were celebrity poets. Jeffrey Hart, professor emeritus of English at Dartmouth, in his When the Going Was Good: American Life in the Fifties, remembers Robert Frost’s receiving a standing ovation from an overflow house at Carnegie Hall, and Eliot’s reading his poems to an overflow audience at Columbia University, with people outside listening to him over loudspeakers.

The audiences were intense because the issues were large, if abstruse. Frost and Eliot represented dueling sensibilities, the empirical and the transcendental. In contrast, Longfellow intended his narrative and lyric poems—genres disdained by modernists—as inspiriting guides to the nation’s honorable past and challenging future. Yeats ascribed Longfellow’s popularity to his accessibility—“he tells his story or idea so that one needs nothing but his verses to understand it.”

This angers today’s academic clerisy. Of what use is that clerisy to readers who need no intermediary between themselves and the author? And what use is Longfellow to academics who “interrogate” authors’ “texts” to illuminate the authors’ psyches, ideologies, and social situations—the “power relations” of patriarchy, racism, imperialism, etc.? This reduction of the study of literature to sociology, and of sociology to ideological assertion, demotes literature to mere raw material for literary theory, making today’s professoriate, rather than yesterday’s writers, the center of attention.

Dana Gioia, chairman of the National Endowment for the Arts, has written that “Longfellow’s vast influence on American culture paradoxically makes him both central and invisible.” The melancholy fact that the two hundredth birthday of the poet who toiled to create the nation’s memory passed largely unremarked is redundant evidence of how susceptible this forward-leaning democracy is to historical amnesia.

[MARCH 12, 2007]

Ronald Reagan: The Steel Behind the Smile

One measure of a leader’s greatness is this: By the time he dies, the dangers that summoned him to greatness have been so thoroughly defeated, in no small measure by what he did, that it is difficult to recall the magnitude of those dangers, or of his achievements. So if you seek Ronald Reagan’s monument, look around, and consider what you do not see.

The Iron Curtain that scarred a continent is gone, as is the Evil Empire responsible for it. The feeling of foreboding—the sense of shrunken possibilities—that afflicted Americans twenty years ago has been banished by a new birth of the American belief in perpetually expanding horizons.

In the uninterrupted flatness of the Midwest, where Reagan matured, the horizon beckons to those who would be travelers. He traveled far, had a grand time all the way, and his cheerfulness was contagious. It was said of Dwight Eisenhower—another much-loved son of the prairie—that his smile was his philosophy. That was true of Reagan, in this sense: He understood that when Americans have a happy stance toward life, confidence flows and good things happen. They raise families, crops, living standards, and cultural values; they settle the land, make deserts bloom, destroy tyrannies.

Reagan was the last president for whom the Depression—the years when America stopped working—was a formative experience. Remarkably, the 1930s formed in him a talent for happiness. It was urgently needed in the 1980s, when the pessimism of the intelligentsia was infecting people with the idea that America had passed its apogee and was ungovernable.

It also was said then that the presidency destroyed its occupants. But Reagan got to the office, looked around, said, “This is fun. Let’s saddle up and go for a ride.” Which he did, sometimes in the middle of the afternoon. Scolds, who thought presidents were only serious when miserable, were scandalized.

In an amazingly fecund twenty-seven-month period, Margaret Thatcher, Pope John Paul II, and Reagan came to office. The pope and the president had been actors. Reagan said he wondered how presidents who have not been actors could function. Certainly the last century’s greatest democratic leaders—Churchill, FDR—mastered the theatrical dimension of politics.

Good actors, including political actors, do not deal in unrealities. Rather, they create realities that matter—perceptions, aspirations, allegiances. Reagan in his presidential role made vivid the values, particularly hopefulness and friendliness, that give cohesion and dynamism to this continental nation.

A democratic leader’s voice should linger in his nation’s memory, an echo of his exhortations. Reagan’s mellifluous rhetoric lingers like a melody that evokes fond memories. Because of demagogues, rhetoric has a tainted reputation in our time. However, Reagan understood that rhetoric is central to democratic governance. It can fuse passion and persuasion, moving free people to freely choose what is noble.

He understood the axiom that people, especially Americans, with their Founders’ creed and vast reservoirs of decency, more often need to be reminded than informed. And he understood the economy of leadership—the need to husband the perishable claim a leader has on the attention of this big, boisterous country.

To some, Reagan seemed the least complicated of men—an open book that the country had completely read. However, he had the cunning to know the advantage of being underestimated. He was more inward than he seemed. And much tougher. The stricken fields of American and world politics are littered with those who did not anticipate the steel behind his smile.

The oldest person ever elected president had a sure sense of modernity, as when he told students at Moscow University that mankind is emerging from the economy of muscle and entering the economy of mind. “The key,” he said, “is freedom,” but freedom grounded in institutions such as courts and political parties. Otherwise “freedom will always be looking over its shoulder. A bird on a tether, no matter how long the rope, can always be pulled back.”

Reagan was a friendly man with one close friend. He married her. He had one other great love, for the American people, a love intense, public, and reciprocated.

Presidents usually enter the White House as shiny and freshly minted dimes and leave tarnished. Reagan left on the crest of a wave of affection that intensified in response to the gallantry with which he met illness in his final years.

Today Americans gratefully recall that at a turbulent moment in their national epic, Reagan became the great reassurer, the steadying captain of our clipper ship. He calmed the passengers—and the sea.

[JUNE 6, 2004]

Reagan and the Vicissitudes of Historical Judgments

Ronald Reagan, unlike all but ten or so presidents, was a world figure whose career will interest historians for centuries, and centuries hence his greatness will be, and should be, measured primarily by what happened in Europe, as a glorious echo of his presidency, in the three years after he left the White House. What happened was the largest peaceful revolution in history, resulting in history’s largest emancipation of people from tyranny—a tyranny that had deadened life for hundreds of millions of people from the middle of Germany to the easternmost of Russia’s eleven time zones.

However, Reagan will also be remembered for his restoration of American confidence that resulted in a quickening tempo of domestic life. During his first term, the most remarkable run of wealth creation in the history of this or any other nation began. Arguably, it began with a seemingly unrelated event in the first year of his first term.

In 1981, when the nation’s air-traffic controllers threatened to do what the law forbade them to do—strike—Reagan warned that if they did they would be fired. When they struck in August, Reagan announced that the strikers would be terminated in two days. By firing the controllers, Reagan, the only union man—he had been head of the Screen Actors Guild—ever to be president, destroyed a union, the Professional Air Traffic Controllers Organization (PATCO). This has often, and not incorrectly, been called a defining episode of the Reagan presidency because it notified foreign leaders, not least those of the Soviet Union, that he said what he meant and meant what he said.

But now, more than two astonishing decades on, it also is reasonable to conclude that Reagan’s fracas with the controllers had huge economic consequences, domestic and foreign. It altered basic attitudes about relations between business and labor in ways that quickly redounded to the benefit of the nation, and not least the benefit of American workers. It produced a cultural shift, a new sense of what can be appropriate in business management: layoffs can be justifiable even when a company is profitable, if the layoffs will improve productivity and profitability. Within a few years, both AT&T and Procter & Gamble, although quite profitable at the time, implemented large layoffs, without arousing significant protests.

Reagan’s action against the air-traffic controllers came on the eve of the explosive growth of information technologies, and some astute people, including Alan Greenspan, believe that Reagan’s action facilitated that growth.

Since 1981, labor in America has prospered because it is less protected. In theory, it might seem that since the showdown with PATCO, business’s informal protocol about layoffs must have resulted in rising unemployment. The reverse has happened. In the post-PATCO climate of business operations, employers have been more inclined to hire because they know that if the hiring proves to be improvident, those hired can be discharged. The propensity to hire has risen much more than the propensity to fire. In all of America’s post-Civil War era of industrialization, unemployment has never been as low for as long as it has generally been in the years since the extraordinary expansion that began during Reagan’s first term.

Although Reagan entered politics late, running for governor of California at the age of fifty-five, and entered the White House a few weeks shy of his seventieth birthday, older than any other president when elected, he lived so long after leaving office that his tenure already has been subject to striking vicissitudes of historical judgment. Nothing is so irretrievably lost to a society as the sense of fear it felt about a grave danger that was subsequently coped with. So in measuring Reagan’s greatness, insufficient attention is now given to his long-headedness and toughness in putting the Soviet Union on the path to extinction. This is particularly so because the intelligentsia likes nothing less than giving Reagan credit, and so has embraced the theory that the Soviet Union’s extinction was an inevitability that just happened while Reagan was standing around.

So as memories of the Cold War fade, Reagan is remembered more for the tax cutting and deregulating that helped, with the information technologies, to shift the economy into a hitherto unknown overdrive. But the truth is that Reagan always thought that winning the Cold War and revving up the American model of wealth creation were parts of the same project. That project was to convince the watching world that the American social and political model—pluralism, the rule of law, allocation of wealth and opportunity mostly by markets, and maximum diffusion of decision making—is unrivaled. To the extent that anything in history can ever be said to be completed, that project has been.

Reagan always believed that the world was watching America. Indeed, he thought the point of America was to be watched—to be exemplary. Hence the complete sincerity of his reiterated references to the City on the Hill. And when the democratic revolution against communism came, Tiananmen Square in Beijing and Wenceslas Square in Prague and points in between rang with the rhetoric of America’s third and sixteenth presidents. The fortieth president was not surprised.

[JUNE 14, 2004]

John Paul II: “A Flame Rescued from Dry Wood”

In Eastern Europe, where both world wars began, the end of the Cold War began on October 16, 1978, with a puff of white smoke, in Western Europe. It wafted over one of Europe’s grandest public spaces, over Michelangelo’s dome of St. Peter’s, over statues of the saints atop Bernini’s curving colonnade that embraces visitors to Vatican City. Ten years later, when the fuse that Polish workers had lit in a Gdansk shipyard had ignited the explosion that leveled the Berlin Wall, it was clear that one of the most consequential people of the twentieth century’s second half was a Pole who lived in Rome, governing a city-state of 109 acres.

Science teaches that reality is strange—solid objects are mostly space; the experience of time is a function of speed; gravity bends light. History, too, teaches strange truths: John Paul II occupied the world’s oldest office, which traces its authority to history’s most potent figure, a Palestinian who never traveled a hundred miles from his birthplace, who never wrote a book, and who died at thirty-three. And religion, once a legitimizer of political regimes, became in John Paul II’s deft hands a delegitimizer of communism’s ersatz religion.

Between October 16, 1978, and January 20, 1981, the cause of freedom was strengthened by the coming to high offices of Margaret Thatcher, Ronald Reagan, and John Paul II, who, like the president, had been an actor and was gifted at the presentational dimension of his office. This peripatetic pope was seen by more people than anyone in history, and his most important trip came early. It was a visit to Poland that began on June 2, 1979.

In nine days, a quarter of that nation’s population saw him. Marx called religion the opiate of the masses, but it did not have a sedative effect on the Poles. The pope’s visit was the nation’s epiphany, a thunderous realization that the nation was of one mind, mocking the futility of communism’s thirty-five-year attempt to conquer Poland’s consciousness. Between 1795 and 1918, Poland had been erased from the map of Europe, partitioned between Austria, Prussia, and Russia. This gave Poles an acute sense of the distinction between the state and the real nation.

Igor Stravinsky, speaking with a Russian’s stoicism about Poland’s sufferings, said that if you pitch your tent in the middle of New York’s Fifth Avenue, you are going to be hit by a bus. The Poland where John Paul II grew to sturdy, athletic manhood was hit first by Nazism, then communism. Then, benignly, by John Paul II.

It was said that the fin de siècle Vienna of Freud and Wittgenstein was the little world in which the larger world had its rehearsals. In the late 1970s, the Poland of John Paul II and Lech Walesa was like that. The twentieth century’s worst political invention was totalitarianism, a tenet of which is that the masses must not be allowed to mass: Totalitarianism is a mortar and pestle for grinding society into a dust of individuals. Small wonder, then, that Poland’s ruler, General Wojciech Jaruzelski, visibly trembled in the presence of the priest who brought Poland to its feet in the face of tyranny by first bringing Poland to its knees in his presence.

John Paul II almost did not live to see this glorious consummation. In 1981, three of the world’s largest figures—Ronald Reagan, Anwar Sadat, and John Paul II—were shot. History would have taken an altered course if Sadat had not been the only one killed.

Our age celebrates the watery toleration preached by people for whom “judgmental” is an epithet denoting an intolerable moral confidence. John Paul II bristled with judgments, including this: The inevitability of progress is a myth, hence the certainty that mankind is wiser today than yesterday is chimeric.

Secular Europe is, however, wiser because of a man who worked at an altar. Europeans have been plied and belabored by various historicisms purporting to show that individuals are nullities governed by vast impersonal forces. Beginning in 1978, Europeans saw one man seize history by the lapels and shake it.

One of G. K. Chesterton’s Father Brown detective stories includes this passage: “‘I’m afraid I’m a practical man,’ said the doctor with gruff humor, ‘and I don’t bother much about religion and philosophy.’ ‘You’ll never be a practical man till you do,’ said Father Brown.”

A poet made the same point: “A flame rescued from dry wood has no weight in its luminous flight yet lifts the heavy lid of night.” The poet became John Paul II.

[APRIL 3, 2005]

Ayaan Hirsi Ali: An Enlightenment Fundamentalist

While her security contingent waits outside the Georgetown restaurant, Ayaan Hirsi Ali orders what the menu calls “raw steak tartare.” Amused by the redundancy, she speculates that it is intended to immunize the restaurant against lawyers, should a customer be discommoded by that entree. She has been in America only two weeks. She is a quick study.

And an exile and an immigrant. Born thirty-six years ago in Somalia, Hirsi Ali has lived in Ethiopia, Kenya, Saudi Arabia, and the Netherlands, where she settled in 1992 after she deplaned in Frankfurt, supposedly en route to Canada for a marriage, arranged by her father, to a cousin. She makes her own arrangements.

She quickly became a Dutch citizen, a member of parliament, and an astringent critic, from personal experience, of the condition of women under Islam. She wrote the script for, and filmmaker Theo van Gogh directed, Submission, an eleven-minute movie featuring pertinent passages from the Koran (such as when it is a husband’s duty to beat his wife) projected on the bodies of naked women.

It was shown twice before November 2, 2004, when van Gogh, bicycling through central Amsterdam in the morning, was shot by an Islamic extremist who then slit his throat with a machete. Next, the murderer (in whose room was found a disk containing videos of “enemies of Allah” being murdered, including a man having his head slowly sawed off) used another knife to pin a long letter to van Gogh’s chest. The letter was to Hirsi Ali, calling her a “soldier of evil” who would “smash herself to pieces on Islam.”

The remainder of her life in Holland was lived under guard. Neighbors in her apartment building complained that they felt endangered with her there and got a court to order her evicted. She decided to come to America.

Holland evidently tolerates everything except skepticism about the sacramental nature of multiculturalism. One million of the country’s 16 million residents are Islamic, and the political left has appropriated the European right’s traditional celebration of identity grounded in racial and ethnic traditions and culture. But the recoil of many Dutch people from Hirsi Ali suggests that the tolerance about which Holland preens is a compound of intellectual sloth and moral timidity. She was more trouble than the Dutch evidently think free speech is worth.

Her story is told in a riveting new book, Murder in Amsterdam, by Ian Buruma, who is not alone in finding her—this “Enlightenment fundamentalist”—somewhat unnerving and off-putting. Having experienced life circumscribed by tribal and religious communities (as a girl she suffered the genital mutilation called female circumcision), she is a fierce partisan of individualism against collectivism.

She reminds Buruma of Margaret Thatcher’s sometimes abrasive intelligence and fascination with America. He is dismissive of the idea that she is a Voltaire against Islam: Voltaire, he says, offended the powerful Catholic Church, whereas she offends “only a minority that was already feeling vulnerable in the heart of Europe.”

She, however, replies that this is hardly a normal minority. It is connected to Islam’s worldwide adherents. Living sullenly in European “dish cities”—enclaves connected by satellite television and the Internet to the tribal societies they have not really left behind—many members of this minority are uninterested in assimilation into open societies.

She calls herself “a dissident of Islam” because, given what Allah supposedly enjoins and what she knows is right, “the cognitive dissonance is, for me, too much.” She says she is not “a militant atheist,” but the emphasis is on the adjective.

Slender, elegant, stylish, and articulate (in English, Dutch, and Swahili), she has found an intellectual home here at the American Enterprise Institute, where she is writing a book that imagines Muhammad meeting, in the New York Public Library, three thinkers—John Stuart Mill, Friedrich Hayek, and Karl Popper, each a hero of the unending struggle between (to take the title of Popper’s 1945 masterpiece) “The Open Society and Its Enemies.” Islamic extremists—the sort who were unhinged by some Danish cartoons—will be enraged. She is unperturbed.

Neither is she pessimistic about the West. It has, she says, “the drive to innovate.” But Europe, she thinks, is invertebrate. After two generations without war, Europeans “have no idea what an enemy is.” And they think, she says, that leadership is an antiquated notion because they believe that caring governments can socialize everyone to behave well, thereby erasing personal accountability and responsibility. “I can’t even tell it without laughing,” she says, laughing softly. Clearly she is where she belongs, at last.

[SEPTEMBER 21, 2006]

Hugh Hefner: Tuning Fork of American Fantasies

LOS ANGELES—Asked how it feels to have won, Hugh Hefner pauses, looks down and almost whispers, “Wonderful.” Then he says: “I guess if you live long enough…”

Fifty years ago, he was pecking at a typewriter on a card table in his Chicago apartment, preparing the first issue of a magazine he planned to call Stag Party but, because there already was a magazine called Stag, he called it Playboy. The first issue appeared in December 1953. It bore no date because Hefner was not sure there would be a second, such were the troubles the first issue caused with the post office and other defenders of decency.

Four years later, in the nick of time, Searle pharmaceutical company introduced Enovid—“the pill.” Back then Hefner, the tuning fork of American fantasies, said he wanted to provide “a little diversion from the anxieties of the Atomic Age.” But three emblematic books of the supposedly repressed 1950s—Peyton Place, Lolita, and The Kinsey Report (Professor Alfred Kinsey of Indiana University was another Midwestern sexual subversive)—showed that more than geopolitical anxiety was on the mind of Eisenhower’s America.

By 1959, the post office was delivering millions of copies of Hefner’s magazine. Playboy’s rabbit-head logo is now one of the world’s most recognized brands, even in inscrutable China, where Playboy merchandise sells well but the magazine is banned.

Hefner’s daughter Christie, who was born thirteen months before the magazine, says Playboy was “a great idea executed well at exactly the right time.” A no-nonsense executive, she now runs the Chicago-based business she joined twenty-seven years ago, fresh from earning a summa cum laude degree from Brandeis. When she arrived, Playboy was primarily an American magazine publisher. She has made it into an international electronic entertainment company.

The magazine, the twelfth-highest-selling U.S. consumer publication, sells 3.2 million copies monthly. That is slightly less than half its 1970s peak, but its eighteen international editions sell another 1.8 million copies a month, and it remains the world’s bestselling monthly men’s magazine.

Still, it provides only about one-third of Playboy Enterprises’ annual revenues of $277.6 million. Playboy owns six cable networks that deliver to 38 million North American households movies of a sexual explicitness that would have been instantly prosecuted in all forty-eight states in 1953.

The magazine, the mere mention of which used to produce pursings of lips and sharp intakes of breaths, is still Hefner’s preoccupation, but has been overtaken by the libertarian revolution he helped to foment. In 1953, Playboy magazine was pushing the parameters of the permissible, but it is hard to remain iconoclastic when standing waist-deep in the shards of smashed icons.

Born to “puritanical” (Hefner’s word) parents in Chicago, city of broad shoulders, Hefner founded an empire based on breasts. What is it about that protean city? Chicagoan Ray Kroc, entrepreneur of McDonald’s, did his Army training with Chicagoan Walt Disney—two prodigies of mass marketing, the creator of the Big Mac and the creator of Mickey Mouse, in the same Army unit.

Then Chicago produced the Henry Luce of the skin game—Hef, as everyone, including his daughter, calls him. The Chicago boy recalls that the Sears Roebuck mail order catalog—another Chicago innovation—was called “a dream book” because it brought “the dream of urbanity to rural communities. Playboy, for young, single men, is a variation of this.”

Recently, dressed in his black pajamas and merlot-colored smoking jacket—it was 1 p.m.—he looked a bit tuckered, but he had been living what Teddy Roosevelt called “the strenuous life,” although not as TR envisioned it. Hefner’s recent seventy-seventh birthday party had rambled on for more than a week, during which he took to dinner—simultaneously—the seven ladies he is currently dating. As F. Scott Fitzgerald, writing of Jay Gatsby, suggested, “personality is an unbroken series of successful gestures.”

An eleventh-generation descendant of William Bradford, who arrived on the Mayflower to begin a religious errand in the wilderness, Hefner says, “In a real sense we live in a Playboy world.” He lives here in a thirty-room mock-Tudor mansion that sits on six acres of posh Holmby Hills decorated with wandering peacocks, among other fauna.

He says, “I grew up in the Depression and World War II and I looked back to the Roaring Twenties and I thought I’d missed the party.” The party turned out to be a movable feast.

[MAY 29, 2003]

Lawrence Ferlinghetti: The Emeritus Beat as Tourist Attraction

SAN FRANCISCO—America’s gauzy popular culture has the power to envelop even its perfervid critics in a tolerant, domesticating embrace. If they live long enough, these critics run the risk of winding up full not only of years, but of honors. They can, like Lawrence Ferlinghetti, eighty-three, become tourist attractions.

These tourists, he notes, are intellectually upscale. They come in a small but steady trickle, from across the country and around the world, to his City Lights Booksellers & Publishers, next door to a street named after the most famous of the many writers who have hung out there—Jack Kerouac. The store, which is a short walk from the street—actually, an alleyway, which seems right—named Via Ferlinghetti, has been designated by this city a protected landmark. This is not because the wedge-shaped structure built in 1907 is a gem (it is not) but because of its cultural significance, which is primarily its association with Kerouac, Allen Ginsberg, and other designated voices of the Beat Generation.

It was in City Lights that San Francisco police arrested Ferlinghetti on obscenity charges for publishing Ginsberg’s “Howl.” Ferlinghetti’s and Ginsberg’s acquittals helped make possible the American publication of D. H. Lawrence’s Lady Chatterley’s Lover and Henry Miller’s Tropic of Cancer, which involved legal dustups that now seem quaint.

Ferlinghetti publishes as well as sells books. He published “Howl” after first rejecting it. It was after he heard Ginsberg recite it that he sent Ginsberg a telegram repeating words from the letter Emerson sent to Walt Whitman after reading Leaves of Grass—“I greet you at the beginning of a great career.”

Ferlinghetti was born in Bronxville, New York, spent much of his youth in France, and went to the University of North Carolina because his roommate at a prep school in Massachusetts hooked him on the novels of Thomas Wolfe. He began a four-year hitch in the Navy before Pearl Harbor (“I was a good American boy”) and was back near France, on a U.S. Navy submarine chaser, on June 6, 1944. After earning—thank you, GI Bill of Rights—a master’s degree in Victorian literature at Columbia, and a doctorate at the Sorbonne, and after finding that the mailroom at Time magazine was not a promising rung on the ladder of journalism, he headed for here, to start a bookstore, a vocation suggested by life in Paris.

City Lights is in the North Beach district, which once was a scene of San Francisco’s bohemian ferment. Now the district is mostly seedy. Visible from Ferlinghetti’s cluttered office on his bookstore’s second floor is an establishment with a resonant name—Hungry I. It was at a nightclub called the hungry i a few hundred yards from the location of the topless bar now bearing that name—bohemia isn’t what it used to be, but then, what is?—that Mort Sahl and Lenny Bruce performed their political riffs that were the outer edges of dissent in the 1950s and early 1960s.

Bruce lived across the street from City Lights. He once fell out of a window. “No doubt he was on something” is Ferlinghetti’s safe surmise. Sahl used to browse the City Lights magazine rack for ideas for his performances.

Ferlinghetti looks the part of an emeritus Beat—small silver earring, tatty sport coat, blue jeans—and looks askance at tourists (no kidding), Republicans (of course: one of his most popular works was the long 1958 poem “Tentative Description of a Dinner Given to Promote the Impeachment of President Eisenhower”), George W. Bush (“shredding the Constitution,” “dismantling the New Deal”), gentrification (San Francisco has been “dot.conned”), automobiles (“auto-geddon” inflicted by SUVs), and chain stores (does he have Borders and Barnes & Noble in mind?).

But the grouchiness of San Francisco’s first poet laureate seems perfunctory, even cheerful. City Lights is open until midnight seven days a week; books are shelved under ideological categories (look under “Stolen Continents” for American history). It retains what Ferlinghetti says San Francisco itself had when he arrived in 1951, “an island mentality, a sort of offshore territory.” American life has been good to the man whose A Coney Island of the Mind was the bestselling poetry book in the 1960s and 1970s. A million copies are in print.

It has been famously said that there are no second acts in American life. Actually, there are. And third, fourth, and fifth acts. But Ferlinghetti has happily stayed with his one act, and the world, or at least a minority steeped in literary nostalgia, is still beating a path to his door.

[JUNE 14, 2002]

Buck Owens’s Bakersfield Sound

BAKERSFIELD, CALIFORNIA—Buck Owens came to this city, a hundred miles north of Los Angeles, at the southern end of the prodigiously fertile San Joaquin Valley, to pick cotton, not a guitar. He came for the same reason lots of others came west from Texas and Oklahoma: happiness was the Dust Bowl in their rearview mirrors.

The Owens family’s rearview mirror was on a 1933 Ford sedan. In 1937, when Buck was eight and John Steinbeck was just beginning to write The Grapes of Wrath, ten Owens family members packed into it and headed west. His parents had been sharecroppers on the southern side of the Red River that separates Texas from Oklahoma. Because the trailer hitch broke in Phoenix, the family lived there for a few years, sometimes traveling to the San Joaquin to pick carrots in Porterville, peaches in Modesto, potatoes and cotton in Bakersfield. During such work, he got the idea that picking a guitar might be more fun.

Which he is doing at seventy-three, in his Crystal Palace nightclub, where he recently began a rollicking hour set with “Okie from Muskogee,” a sixties—actually, an anti-sixties—anthem by another Bakersfield boy, Merle Haggard. (Has there ever been a better name for a country-music singer?)

By sixteen, Owens had begun playing in Phoenix honky-tonks, sometimes with the young Marty Robbins, earning whatever change he could collect by passing a soup bowl. In 1951, he moved to Bakersfield, in Kern County, which produces more oil than Oklahoma, and had plenty of roughnecks to appreciate the country music of rising stars like Bob Wills and Ferlin Husky, who were honing what has come to be called the Bakersfield sound.

That is identified with Owens’s solid-body Fender Telecaster steel guitar. It produces the sharp, twangy, driving, biting sound that seems especially suited to the subjects of what is called “hard country music.” Such music sometimes teeters on the brink of self-caricature, or embraces it (“I was drunk the day my momma got out of prison”), but its essential message is that life is difficult and so are most of the people we meet, including those we marry.

The life that drove many people down Steinbeck’s road to California was hard, and so was the life Owens led chasing stardom. After touring hard—sometimes three hundred nights a year—Owens got off the road in 1980. And he spent too many years associated with the instant kitsch of the television program Hee Haw. As a result, too few fans of country music appreciate how much his Bakersfield sound helped give that music a steely integrity and propel it to the point that Owens could play a much-praised concert in President Johnson’s White House in 1968.

By 1980, however, when the John Travolta movie Urban Cowboy helped make country music fashionable, country music was beginning to lose its edge. More to the point, the Nashville music establishment set out to rub the edge off, to envelop it in a syrup of strings and softening production techniques, the better to appeal to a broader audience that wanted country music that was close kin to soft rock.

But some “new traditionalists” were, and are, having none of that. These include Randy Travis, George Strait, the Dixie Chicks, and Dwight Yoakam—another nifty name for a country singer. Yoakam was born in Kentucky and spent a while in Nashville, but says, “I was drawn to Los Angeles by my earlobes…the country-rock sound, and the Bakersfield sound.”

A related development is the recent emergence of a new category of old-style music called “Americana,” which is the most popular radio format for Internet listeners. Although Americana is not strictly defined, a good sampler is the sound track of the movie O Brother, Where Art Thou?—a mixture of blues, bluegrass, gospel, folk, and mountain music. Sales of that CD, released in 2000, are heading toward 7 million. Americana encompasses new performers, such as Alison Krauss’s Union Station and Nickel Creek, and hardy perennials like Johnny Cash. One song written years ago by Homer Joy probably would qualify as Americana:

Spent some time in San Francisco

Spent a night there in the can

Well, they threw this drunkard in my jail cell

Took fifteen dollars from that man

But I left him my best watch and my house key

Don’t like folks sayin’ that I steal

Then I headed out to Bakersfield.

When Buck Owens is onstage, singing that song, the years fall away. He is as energized by the audience as his guitar is by electricity, and the young man in flight from the cotton fields is present again.

Parts of his life resemble hard-country lyrics (his fourth divorce is not going well), but he is now an icon in the community he first saw when picking cotton. Today, as he drives the streets of Bakersfield, he can steer his pickup truck down Buck Owens Boulevard.

Bakersfield, although prosperous, is still a place where billboards proclaim, tough times never last but tough people do. So does hard-country music, and Buck Owens’s Bakersfield sound.

[DECEMBER 9, 2002]

Andrew Nesbitt: Seventy-nine-Pound Master of Tourette Syndrome

COPPELL, TEXAS—Even in what passes for repose, your basic eleven-year-old boy resembles the former Yugoslavia—a unity of sorts, but with fidgeting and jostling elements. Andrew Nesbitt is like that, only more so, because he has Tourette syndrome.

He also has something to teach us about the power of a little information and a lot of determination. And about how life can illuminate philosophy, which is supposed to do the illuminating.

He is seventy-nine pounds of shortstop and relief pitcher—a closer, no less, which is a high-stress vocation. Stress often triggers Tourette symptoms. Hitting a thrown ball with a round bat is hard enough, and so is throwing the ball over a seventeen-inch-wide plate with the game on the line. Hard enough, even if you do not have an inherited neurological disorder that causes recurrent physical and phonic tics.

The physical tics can include involuntary muscle spasms—blinking, clapping, hopping, and the more or less violent twitching of shoulders and flailing of limbs. The vocalizations are usually grunts, hisses, barks, and other meaningless sounds. Rarely, and not at all in Andrew’s case, there is the compulsive utterance of obscenities.

At the benighted school he attended last year, teachers could not—would not—understand that he did not have a mischievous penchant for bad behavior. They frequently banished him from the classroom to sit in the hall.

When he was younger, his parents had to hold his thrashing head so he could eat. Playing soccer, he sometimes bruised his behind by kicking himself with backward leg spasms. This year, he says, Mrs. Marill Myers, his math and homeroom teacher, “asks me if it’s a tic.” She gives him a jump rope to use to subdue unmanageable energy. Or pauses to briefly rub his back. Not complicated, really.

He was five, standing on a swimming pool diving board, when his mother first saw him jerking his head and shrugging his shoulders oddly. He is bright as a new dime—at ten months he had a fifty-word vocabulary—but his gross and fine motor problems became so bad that in fourth grade hip spasms would throw him out of his desk chair.

A visiting columnist is Andrew’s excuse for taking a break from the work part of a sixth-grader’s day in Coppell Middle School West (math, English—the school stuff) and savoring anticipation of the good parts, such as lunch, baseball, and lacrosse practice. He is dressed conservatively, even formally, as his age cohort understands such matters: red T-shirt reaching almost to his knees and blue shorts that aren’t short—they reach below his knees, toward his white sneakers.

Nowadays, he says, “I sometimes hold the tics in when I’m batting.” Extreme concentration also helps Mike Johnston, a Pittsburgh Pirates reliever, contain his Tourette symptoms: “I’ll sometimes stare at something until my eyes water.” Johnston, who was awakened in a Chicago hotel on an off-day by the thoughtless columnist, chats on the phone with Andrew, who is asking important questions, such as: “Have you ever pitched to A-Rod?” Johnston gets important information from Andrew: cap and jersey sizes. Pirates gear is on the way.

Last year, Andrew came close to exhaustion from dread of teachers’ incomprehension and from some children’s cruelty. This year, Andrew’s teachers and classmates are better informed. What causes his odd behavior may have caused similar behavior by some high-achievers—probably Samuel Johnson, perhaps Mozart. Even more impressive, Jim Eisenreich, formerly of the Twins, Royals, Phillies, Marlins, and Dodgers, has Tourette syndrome, as does Tim Howard, current goalie for Manchester United soccer club, the world’s most famous sports team.

The mind-body dichotomy is a perennial puzzlement for philosophers. Most people say, “I have a body.” Perhaps we should say, “I am a body.” People who say the latter mean that the mind, the soul—whatever we call the basis of individual identity—is a “ghost in the machine,” a mysterious emanation of our physicality. They may be right. But were Andrew given to paddling around in deep philosophic water—if he were, he would not be your basic boy—he might reply:

“No way. Wisdom is encoded in our common language. We all have, to some extent, a complex, sometimes adversarial, relationship with our physical selves. And I more than most people know that it is correct to say, ‘I have a body.’ There is my body, and then there is me, trying to make it behave.”

Let the philosophers contend about the mind-body distinction. If you think Andrew has it wrong, spend a day in his sneakers.

[APRIL 25, 2004]

Simeon Wright’s Grace

ALSIP, ILLINOIS—In a cemetery here, a few miles southwest of Chicago’s city limits, Simeon Wright, sixty-two, a trim, articulate semiretired pipe fitter, and a deacon in the Church of God in Christ, recently attended a ceremony at an unquiet grave. The gravestone has a weatherproof locket with a photograph of a boy, and these carved words:



EMMETT LOUIS TILL

JULY 25, 1941 – AUGUST 28, 1955

Wright participated in a service for the reinterment of the body of the boy with whom Wright, then twelve, shared a bed in the Mississippi home of Wright’s father on the night, fifty years ago, that lit the fuse of the civil rights revolution.

The eulogy delivered at the reinterment—Emmett’s remains had been exhumed as part of the reopened investigation of his murder—was by Wheeler Parker, a barber and minister in nearby Argo, Illinois. On the night of August 28, 1955, Parker, then sixteen, also was sleeping in the Wright home. Two white men, one with a .45-caliber pistol, shone a flashlight in Parker’s face, and one of them said, “Where’s the fat boy from Chicago?”

A few weeks before, Wright’s father, a preacher in the vicinity of Money, Mississippi, had come to Chicago to deliver a eulogy for a former parishioner, one of the hundreds of thousands of black Mississippians of the great migration—an $11, sixteen-hour ride on the Illinois Central to Chicago. A week or so later, Mamie Till—Emmett’s mother—put Emmett on the Illinois Central to visit his great-uncle and his cousin Simeon.

Three days into his visit, at the ramshackle Bryant’s Grocery and Meat Market, Emmett, fourteen, whistled at a white woman, Carolyn Bryant. Three nights later, her husband, Roy, and his half-brother J. W. Milam came for Emmett. Simeon’s father pleaded, “Why not give the boy a whipping and leave it at that?” They beat him to an unrecognizable pulp, knocked out his right eye, shot him, tied a seventy-five-pound cotton gin fan around his neck with barbed wire, and threw his body into the Tallahatchie River.

Mississippi authorities had made the Chicago undertaker agree to keep Emmett’s casket nailed closed, but they met their match in his mother. An estimated fifty thousand Chicagoans saw the body in the open casket. Jet magazine ran a picture. “When people saw what had happened to my son,” Mamie Till said, “men stood up who had never stood up before.” And one woman refused to stand up: sixty-nine days after the acquittal of Emmett’s murderers, and three hundred miles away, Rosa Parks refused to surrender her seat on a bus.

Twenty-six days after murdering Emmett, Bryant and Milam were acquitted by a jury that deliberated just sixty-seven minutes, pausing, one juror said, to “drink a pop,” before embracing the defense argument that the body might not have been Emmett’s. Bryant and Milam later told Look magazine they killed Emmett. They said they took turns smashing him across the head with the .45. But the trial was the first event to turn the gaze of American journalism to the causes of the coming civil rights storm.

A week after the acquittal, Simeon Wright’s father, who testified against Bryant and Milam, left his car at the railroad station and went to Chicago. He never returned to Mississippi.

Others besides Bryant and Milam, both dead, may have been complicit in the killing. But beyond DNA proof that it was Emmett’s body, it is unclear what forensic evidence his remains might provide to the Mississippi district attorney who sought the disinterment.

Martin Luther King came to Chicago in January 1966, but Simeon Wright says: “I didn’t qualify for Dr. King’s march. They told us that if bricks and things were thrown at us, we couldn’t retaliate. I couldn’t do that…. Now I’m back to what he was teaching.”

Wright says friends who only recently discovered his connection with the Till case say, “He’s so easygoing!” He says, “I guess they think I’d be angry all the time. You don’t live long that way.” At the reinterment, he recited the first verse of “Taps,” which concludes: “All is well, safely rest, God is nigh.” He says, “It got a little emotional then.”

Where do they come from, people like Simeon Wright, people of such resilience and grace? From Mississippi and Illinois. And everywhere else. They are all around us.

What has this country done—what can any country do—to deserve such people? Wrong question. They are this country.

[JUNE 19, 2005]