The Master Switch: The Rise and Fall of Information Empires - Tim Wu (2010)

Part III. The Rebels, the Challengers, and the Fall

IN THE SMALL CRACKS of the twentieth century’s empires, challengers were slowly born over the decades of dominance. Interestingly, each of these would come to life as a tiny irrelevancy, a speck off the map. Small-town entrepreneurs invented the community antennas that would become cable television. A failing UHF broadcaster from Atlanta, Ted Turner, pioneered the idea of the cable network. Filmmakers until then excluded from all but the most obscure theaters would remake Hollywood, damaged by television and the antitrust division of the Justice Department. And an impractical, highly abstract academic project became, eventually, the first universal network: the Internet.

Part III tells the story of how information monopolies disintegrate. An industry is dominated by one ruler, an oligarchy or trust of some sort. What forces can break such hegemony?

Chapter 11. The Right Kind of Breakup

Toward the end of World War II, before the atomic bombs were dropped on Hiroshima and Nagasaki, the Sandia National Laboratories were founded in New Mexico. Situated near the better known Los Alamos labs, Sandia was to extend the basic work of the Manhattan Project into more sophisticated weapons development. The labs’ ongoing mission was to serve as the “steward” of the United States’ nuclear arsenal. What may seem surprising is that this top-secret effort should have been overseen not by the Department of Defense or some other government agency but, as late as 1992, by the telephone company. It all began when President Truman wrote a letter to AT&T subsidiary Western Electric. “In my opinion,” Truman wrote in 1949, “you have here an opportunity to render an exceptional service in the national interest.”1

Perhaps no other arrangement more clearly bespeaks the trust and intimacy that existed for decades between the U.S. government and the nation’s great communications empires than the privilege enjoyed by the authorized telephone monopoly. Nor was Sandia Laboratories AT&T’s only contribution to the Cold War. AT&T built a system of towers across the top of Canada and Alaska designed to warn of approaching ICBMs, a secret radio network to provide communications for Air Force One, and at least sixty hardened underground bunkers housing emergency equipment. Indeed, so essential to the common defense did AT&T seem that the Defense Department would intervene forcefully to prevent the company’s breakup by antitrust suit in 1956, citing a “hazard to national security.” Fittingly, during the 1950s, AT&T, for its part, adopted the notably Orwellian slogan “Communications Is the Foundation of Democracy.”2

AT&T’s relationship with the federal government may have been a uniquely intimate entanglement of interests. But, in fact, the blessing of the state, implicit or explicit, has been crucial to every twentieth-century information empire. We have seen how it influenced the course of radio (initially for military reasons) and later of television. In the case of Hollywood, and the government’s decades-long acceptance of that industrial concentration, there may have been no national-security imperative beyond the morale of weary soldiers lifted by celluloid apparitions of Betty Grable. But as long as the studios had friends in Washington, their empire was secure. In every information industry, the government mediated what would have otherwise surely been a more tumultuous course of the Cycle.

Theorists of industrial evolution, Schumpeter foremost, have always understood the alternation of birth and destruction to be a natural inevitability of markets. Nothing, the theory goes, can stop an idea whose time has come. But what if, in an otherwise free-market setting, an industrial entity enjoys a special forbearance or favor of the state? What if, as with AT&T, that favor amounts to its being a virtual organ of government? Can the natural ecology of the market still function, or is industrial creativity arrested? The power of state patronage or sufferance to any degree would seem to be more than any would-be competitor, even one armed with a technical breakthrough, can overcome. Herein lies the greatest complication to Schumpeter’s idea of how capitalism works.

You cannot expect creative destruction to proceed normally in such circumstances: unseating such a monopolist thus becomes less a question of market dynamics than of politics. After the Second World War, the state would twice abandon its habit of tolerance and sponsorship, intervening in communications industries to break up dominant players. In 1984, it would have another run at Bell, this time finishing the job it had aborted in 1956. But even before the first attempt to break up Bell, in 1948, the government would take action against another information empire, forcing the Hollywood studios to sell their theaters and thus precipitating the collapse of the carefully designed studio system.

Both breakups—that of AT&T and that of the studios—would generate significant controversy, at the time and later. For in each case, there would be those who saw the dismemberment as a senseless summary execution of a robust, if restrictive, industry. With Bell particularly, a case in which the Justice Department had deferred action until a combination of the monopoly’s arrogance and technological stagnation made it seem to many ludicrously overdue, there were nonetheless those who would regard—indeed, still do regard—the breakup as the crime of the century. In 1988 two Bell engineer-managers, Raymond Kraus and Al Duerig, would write a book called The Rape of Ma Bell, decrying how “the nation’s largest and most socially minded corporation was defiled and destroyed.” Barry Goldwater, the conservative icon and candidate for president, put it this way: “I fear that the breakup of AT&T is potentially the worst thing to happen to our national interests in telecommunications that will ever occur.”3

The critics have a point: a federal breakup is an act of aggression and arguably punishes success. In the short term, the consequences of the state’s interventions in both communications cases were ugly indeed. Each industry lapsed into an immediate period of chaos and experienced a drop in product quality. The decline of the film industry, which had been so grand and powerful in the 1930s and 1940s, would last into the 1970s. And in the immediate aftermath of the AT&T breakup, consumers saw a drop-off in service quality utterly unexampled since the formation of the Bell system. In fact, the “competitive” industries that replaced the imperial monopolies were often not as efficient or successful as their predecessors, failing to deliver even the fail-safe benefit of competition: lower prices.

Whether sanctioned by the state or not, monopolies represent a special kind of industrial concentration, with special consequences flowing from their dissolution. Often the useful results are delayed and unpredictable, while the negative outcomes are immediate and obvious. Deregulating air travel, for instance, implied a combination of greater choice, lower prices, and, alas, smaller seats, among other downgrades, as one might have more or less foreseen. The breakup of Paramount, by contrast, and the fall of the studio system ushered in something less expected: the collapse of the Production Code system of film censorship. While not the only factor transforming film in the 1960s and 1970s, the end of censorship certainly contributed to an astonishing period of experimentation and innovation. Likewise, the breakup of Bell laid the foundation for every important communications revolution since the 1980s onward. There was no way of knowing that thirty years on we would have an Internet, handheld computers, and social networking, but it is hard to imagine their coming when they did had the company that buried the answering machine remained intact.

The case for industry breakups comes from Thomas Jefferson’s idea that occasional revolutions are important to the health of any system. As he wrote in 1787, “a little rebellion now and then is a good thing, and as necessary in the political world as storms in the physical.… It is a medicine necessary for the sound health of government.”

Let us now evaluate the success of the government’s first breakup of an information empire. It is not a tale to rival the epic of AT&T’s breakup, which we take up in greater detail later. But it is the first crack in the ancien régime of state connivance with information industries and as such a fitting place to start.

THE STUDIOS

By the 1940s the Hollywood studio system had been perfected as a machine for producing, distributing, and exhibiting films at a guaranteed rate of return—if not on every film, on the product in the aggregate. Each of the five major studios had by then undergone full vertical integration, with its own production resources (including not just lots and cameras but actors, directors, and writers as human cogs as well), distribution system, and proprietary theaters. There was much to be said for this setup in terms of efficiency: it was effectively an assembly line for film. Out of the factory came a steady supply of films of reliable quality; on the other hand, like any factory, the studios did not admit a lot of variety in their product. Henry Ford famously refused to issue his Model T car in any color but black, and while Hollywood didn’t go that far, there was a certain sameness, a certain homogeneity to the films produced in the 1930s through the 1950s. That homogeneity was buttressed by the ongoing precensorship under the Production Code, which ensured that films would not stray too far from delivering the “right” messages: marriage was good, divorce bad; police good, gangsters bad—leaving no room for, say, The Godfather, let alone its sequels.

The cornerstone of the studio system was the victory Zukor had won over the large first-run theaters in major cities, together with the ongoing block booking system. In America’s ninety-two largest cities, the studios owned more than 70 percent of the first-run theaters. And though these first-run movie palaces comprised less than 20 percent of all the country’s theaters, they accounted for most of the ticket revenue.4 As the writer Ernest Borneman put it in 1951, “control of first run theaters meant, in effect, control of the screen.”

The man inspired to challenge this system was Thurman Arnold, a Yale law professor turned trustbuster with some rather striking ideas about industrial concentration. Arnold, whose name continues to grace one of Washington, D.C.’s most prestigious law firms (Arnold & Porter), was by today’s standards an antitrust radical, a fundamentalist who believed the law should be enforced as written. In The Folklore of Capitalism (1937), Arnold compared the role of U.S. antitrust law to statutes concerning prostitution: he deemed that both existed more to flatter American moral vanity than to be enforced.5

His language may have been strong, but Arnold had a point. By the time he took over the Antitrust Division in the 1930s, the United States, once a nation of small businesses and farms, was dominated by monopolies and cartels in nearly every industry. As the economist Alfred Chandler famously described it, the American economy was now dominated by the “visible hand” of managerial capitalism.6 This despite the fact that the text of the Sherman Act, the main antitrust law, wasn’t (and isn’t) all that ambiguous. The law explicitly made monopolization and deals in restraint of trade illegal. A nonlawyer can understand this from reading sections one and two of the Act:*

Every contract, combination in the form of trust or otherwise, or conspiracy, in restraint of trade or commerce among the several States, or with foreign nations, is declared to be illegal.

Every person who shall monopolize, or attempt to monopolize … any part of the trade or commerce among the several States, or with foreign nations, shall be deemed guilty of a felony.

Arnold, as soon as he gained Senate confirmation, acted quickly to implement his literalist view of the antitrust laws. His aim was to bring quick, high-visibility lawsuits breaking up cartels whose evils American citizens could easily understand. His first lawsuits were brought against the car industry (GM, Ford, and Chrysler, the “Big Three”); the American Medical Association, which he charged with preventing competition among health plans; and most relevant for us, the film industry. Arnold’s 1938 lawsuit against Hollywood charged twenty-eight separate violations of the Sherman Act and demanded that the film studios “divorce” their first-run theaters. He repeatedly denounced the film industry as “distinctly un-American,” and characterized its structure as a “vertical cartel like the vertical cartels of Hitler’s Germany, Stalin’s Russia.”7

A decade would intervene, with various near-settlements and consent decrees, but the Antitrust Division finally achieved what Arnold wanted. In 1948, the United States Supreme Court agreed with the Justice Department’s petition that Hollywood was an illegal conspiracy in restraint of trade, whose proper remedy lay in uncoupling the studios from the theaters. The Court’s ruling by Justice William O. Douglas readily accepted Arnold’s contention that the first-run theaters were the key to the matter, and with that acceptance disappeared any hope the studios might prevail. The Court ruled that they had undeniably fixed prices and, beginning in 1919 with Zukor’s Paramount, unfairly discriminated against independent theaters by selling films in blocks. There were various other offenses, but that was enough. Over the next several years, every studio would be forced to sell off its theaters.8

For the new information industries of the twentieth century, the Paramount decision was the first experience they would have of the awesome power of the state. The government had induced a paroxysm of creative destruction, seizing an industry by the throat. The infractions were indisputable, but there was nevertheless a degree of arbitrariness in the exercise of state power. Was this, after all, not the same government that had encouraged and supported the broadcast networks and the Bell system in their hegemonic forms? It was indeed, but Thurman Arnold was a different head of the hydra. Stripped of their control over exhibition, the Hollywood studios lost their guaranteed audiences. The business as they knew it would have to be entirely rethought.

In the short term came the chaos of breakup without the economic efficiencies. Robert Crandall, an economist at the Brookings Institution and a critic of the antitrust laws, has argued that the Paramount decree, as it was known, failed to lower the price of theater tickets.* And while there may never be a good time to sustain such a body blow, the action came at an especially bad moment for the studios; the arrival of television and the rise of suburbs after the war cut sharply into film viewership and revenues from the key urban markets. Still, in some sense the Paramount decree may have been just the bitter pill that the already listless studios needed: losing the first-run advantage would force them to reorganize and change the way films were made sooner rather than later. Institutional inertia being what it is, systems are rarely fixed unless they are broken, and this one, against its will, was broken utterly.9

Whatever its immediate consequences, the Paramount decision launched a transformation of American film as cultural institution, throwing the industry back into the open state it had emerged from in the 1920s. As Arnold had hoped, the independence of theaters cleared the way for independent producers, and even for foreign filmmakers, long excluded, to now sell to theaters directly. But the most profound effects of the decree would not emerge for decades. The industry would remain in an odd stasis through the 1950s and into the early 1960s. Eventually, though, as the mode of film production changed, returning to a decentralized style not seen since the 1910s and 1920s, so, too, did the product. After the decree, films were increasingly made one at a time rather than from a mold, according to the vision of a director or producer. “What replaced film production in the dismantled studios was a transformed system,” writes the economist Richard Caves, “with most inputs required to make a film coming together only in a one-shot deal.… the same ideal list of idiosyncratic talents rarely turns up for two different films.”10

With the fall of the studios, perhaps even more decisive than the transformation of production structure was the end of the censorship system. The power of the old Production Code written by Daniel Lord and enforced by Joseph Breen was effectively abrogated when the studios lost control over what the theaters showed.11

A very different type of production was feasible once theaters were free to buy unapproved films and ignore the regime that the studios had enforced in exchange for Breen’s blessing. Producers took the cue, creating darker, countercultural, and controversial works—everything the Code prohibited. The Code itself was still around, but it had lost its bite. In 1966, Jack Valenti, the new, forty-five-year-old head of the MPAA, decided he wanted to “junk it at the first opportune moment.” He noticed something obvious in retrospect: “There was about this stern, forbidding catalogue of ‘Do’s’ and ‘Don’ts’ the odious smell of censorship.”12

Valenti instituted the familiar ratings system (G, PG, R, X) in 1968, and far from marking a return to restraint, it was a license to make films patently unsuitable for children—even to the point of being what is euphemistically called “adult.” At the same time, the freedom to import European films had its own influence on American production. Seeing the popularity of foreign offerings—typically moodier, more cerebral, and erotically explicit—the desperate studios were forced to invest in a new kind of American film. The result is known by film historians as the New Hollywood era, among its emblematic products Bonnie and Clyde, Easy Rider, and Midnight Cowboy, all edgy, defiant affairs announcing a new day for the industry and the culture.*

So great was the range of experimentation in film in the 1970s that for a time, as surprising as it sounds now, X-rated films—that is, pornography—went through “normal” theatrical releases. The most famous example is 1972’s Deep Throat, which played in basically the same kind of theaters and made the same kind of money that a Hollywood blockbuster might today. Here was the medium as far as it could get from the days when the Production Code required preapproval of all films and obliged filmmakers, as a matter of course, to give audiences the “right” answers to all social questions.

Of course not every production of the period, which lasted until the early 1980s, would prove well made or enduring. Nevertheless the freedom to fail and sometimes to offend was extremely salutary for the medium in the era following the age of guaranteed success. What greatness did result came because directors and producers were allowed to experiment and probe the limits of what film could be. Whatever the merits of the individual outcome, the variety of ideas, in style and substance, was the widest it had been since before the 1934 Code.13

Antitrust action rarely takes the promotion of such variety and cultural innovation as one of its goals. The purpose of the statutes is to facilitate competition, not cultural or technological advancement (they were, after all, enacted under Congress’s constitutional authority over interstate commerce). Innovation in an expressive form isn’t ordinarily something one can patent, nor can creativity be satisfactorily quantified. But in considering whether government action was worthwhile, let us not, particularly where information and culture industries are concerned, fall into the trap of looking to results that only econometrics can reveal.

Films are not screwdrivers. As with all information industries, the merits of a breakup cannot be reduced to its effect on consumer prices, which may be slow to decline amid the inefficiencies and chaos of the immediate aftermath. But who would deny there are intangible costs to censorship? It is useful to consider whether Hollywood would be the peerless cultural export that it is were the industry not open to the full variety of sensibilities and ideas a pluralistic society has to offer.

* The argument that the text is ambiguous comes from the idea that the law would make so much illegal that it couldn’t possibly mean what it literally says.

* Of course, there is no knowing whether prices would have risen even higher were the industry still intact but operating under new market pressures.

* It is perhaps difficult to imagine that even without the antitrust action of the Roosevelt administration, Hollywood would not have evolved with the national mood in the 1960s. Changing sensibilities might well have upended the Code. But one shouldn’t underestimate the capacity of an entrenched industry to avoid the risk of innovation, the initial resistance to features providing perhaps the most stunning example in the history of film.