Sun in a Bottle: The Strange History of Fusion and the Science of Wishful Thinking - Charles Seife (2008)


They started at once, and went about among the Lotus-eaters, who did them no hurt, but gave them to eat of the lotus, which was so delicious that those who ate of it left off caring about home, and did not even want to go back and say what had happened to them, but were for staying and munching lotus with Lotus-eaters without thinking further of their return; nevertheless, though they wept bitterly I forced them back to the ships and made them fast under the benches.


Bubble fusion, like cold fusion, imploded under charges of fraud and scientific misconduct. Though both methods still have their supporters, both have now been swept to the fringes of science. Without a spectacular reversal of fortune, that is where they will remain.

Hot fusion now enjoys a monopoly. Mainstream scientists who hope for fusion energy almost unanimously pin their hopes upon inertial confinement fusion or magnetic fusion. Tabletop fusion and muon-catalyzed fusion are not going to lead to energy production. Bubble fusion and cold fusion were delusions. There are no other options.

Despite that monopoly, since the 1990s fusion scientists have had to fight, with increasing desperation, to keep hot-fusion research alive. Now, two multibillion-dollar projects, one in California and one in France, will determine the future of fusion. If the projects succeed, they will allow nations around the world to free themselves from dependence on oil. But if they fail, it is possible that no amount of money will be sufficient to realize mankind’s ambition to bottle the sun.

When it was conceived at the Geneva summit in 1985, the International Thermonuclear Experimental Reactor (ITER) quickly became magnetic fusion’s best hope of achieving breakeven. Europe and Japan joined the Soviet Union and the United States in the effort, and the four parties agreed to pool their resources to build an enormous tokamak. It was to be the most ambitious international scientific project ever attempted.

Not only was ITER supposed to achieve breakeven; it was supposed to attain ignition and sustained burn. In theory, after the reaction was started, the plasma would heat itself and provide fusion energy as long as it had fresh fuel to consume; it would be like a furnace or a boiler, just needing periodic restoking while it provided continuous power. Though ITER would cost $10 billion, it would finally end the half measures of the individual countries’ domestic fusion efforts. The cooperating world powers were confident that they would finally end the research phase of magnetic fusion. They would finally be building essentially a working reactor. After so many disappointments and failed promises, scientists from around the globe would usher in the era of fusion energy. It was a golden vision, but it wouldn’t last.

A decade later, the USSR was no more. The United States was the only superpower left. Japan was in the throes of an economic crisis. Science budgets everywhere were declining, and in the United States the money available for fusion research was plummeting. ITER was in deep trouble.

In truth, ITER’s trouble began at birth. Nobody had ever pulled off an international scientific project of such an enormous scale. Figuring out how to compress and ignite a plasma was only one of the problems that ITER proponents had to solve. Perhaps even trickier was the problem of distribution and containment of pork.

Politicians like to see direct benefits from the money they spend. This means they want cash to flow into the hands of the people who elect them. That is the law of pork-barrel politics—why Congress so regularly funds ridiculous multimillion-dollar projects like useless bridges in Alaska. New Mexico congressmen tend to be munificent to Los Alamos; California senators back Livermore; New Jersey politicians support Princeton. It’s similar in other countries. Politicians always like to spend money to benefit their constituents.

ITER provided a porky dilemma. No matter where the ITER partners put the reactor, three of the four parties were going to have to spend their money on a machine in another country. Even if these partners managed to build much of the equipment domestically, cash (and talent) would have to flow overseas. This isn’t good pork-barrel politics. The country where the reactor would be built would get the lion’s share of the benefits of the project, and the others would see their money flow into the hands of a rival.

Even a decade after the Geneva meeting, nobody had agreed where ITER would be built. Rather than consolidating multiple international efforts into one big project, the need to distribute the pork among the parties led to just the opposite: duplication of effort. There were three centers—one in Germany, one in Japan, and one in the United States—devoted to designing the reactor.

Declining budgets made matters much worse. Fusion scientists in the United States had been making drastic cuts to their research program. They obliterated almost everything that wasn’t part of a tokamak project; the nation put almost all its magnetic fusion eggs in the tokamak basket. Many fusion scientists thought that other configurations (including some new ones like “spheromaks”) might lead to a working reactor faster than a tokamak would. In their view, cutting off research for these alternatives was shortsighted and premature. The tokamak shouldn’t be the only game in town. Thus, they were against ITER. They didn’t want to wager everything on a single enormous tokamak. Moreover, they weren’t alone in their wariness of the international reactor. Even tokamak physicists felt threatened, because the domestic fusion program would have to be gutted in favor of the enormous international collaboration. The already stretched budgets would have to accommodate ITER. Congress would not provide additional funds for more big domestic experiments, and the existing ones would be quickly shut down to cut expenses. Laboratories like Princeton’s would become superfluous without a major machine to experiment with. There would only be one big machine in the world, and it would likely be overseas.

Thus, by the mid-1990s, ITER had a large number of opponents: non-tokamak fusion scientists who resented the single-minded concentration on tokamaks, tokamak physicists who were afraid of having the domestic fusion program shipped overseas, and most important of all, politicians who saw taxpayer money flowing into the hands of other countries’ governments. Everybody, in theory, liked the idea of a huge international fusion effort. In practice, though, it was unpopular, and budgets were still in free fall.

By 1995, the magnetic fusion budget had been hovering around $350 million per year. The President’s Committee of Advisors on Science and Technology (PCAST), an independent panel of experts that counseled the president on all matters scientific, gave Bill Clinton a grave warning about the fusion budget. At $320 million per year, the domestic program would be crippled, and ITER—as planned—would be too expensive to support; it would have to be renegotiated at a lower cost. A demonstration fusion power plant would be at least forty years away. If the budget dropped below $320 million, the consequences were almost too horrible to contemplate. The committee tried to envision a worthwhile fusion program with lower levels of funding but came to the following conclusion:

We find that this cannot be done. Reducing the U.S. fusion R&D program to such a level would leave room for nothing beyond the core program of theory and medium-scale experiments ... no contribution to an international ignition experiment or materials test facility, no [new domestic tokamak], little exploitation of the remaining scientific potential of TFTR, and little sense of progress toward a fusion energy goal. With complete U.S. withdrawal, international fusion collaboration might well collapse—to the great detriment of the prospects for commercializing fusion energy as well as the prospects for future U.S. participation in major scientific and technological collaborations of other kinds.

When Congress passed the 1996 budget, magnetic fusion got about $240 million. It did not take long for things to unravel completely.

In the meantime, the projected costs for ITER were skyrocketing, and scientists raised new doubts about whether it would achieve ignition at all. Despite the rosy picture painted by the design team, some physicists predicted that new instabilities would cool the plasma faster than expected, meaning ITER would fail, just as generations of fusion machines had failed before it. If ITER was going to fail to achieve ignition and sustained burn, then, some physicists began to argue, domestic devices could fail just as well at half the price. The American scientists (as well as their Japanese counterparts, who were also cash-strapped) started talking about scaling ITER back, making it a less ambitious experiment at a lower cost. ITER-Lite, as the plan was known, would cost only $5 billion. However, ITER-Lite would be unable to achieve ignition and sustained burn. It would be just another incremental improvement on existing devices.

Though ITER-Lite was cheaper, it would defeat the main benefit of pooling four countries’ resources. No longer would the countries be leapfrogging over what domestic programs had been able to accomplish on their own. ITER-Lite would not be a great advance over previous designs. It would just be a massive, more expensive version of what everyone else had already built.

In late 1997, Japan asked for a three-year delay in construction. It was a terrible sign, and the designers scrambled to bring down ITER’s costs. Physicists and engineers proposed various versions of ITER-Lite, but without the promise of ignition and sustained burn the troubled project was doomed. The United States decided it wanted out.

In 1998, the House Appropriations Committee noted angrily that “after ten years and a U.S. contribution of $345 million, the partnership has yet to select a site” for ITER, and slashed all funding for the project. (They even questioned whether a tokamak was the best way to achieve fusion energy.) In July, the United States allowed the ITER agreement to expire, refusing to sign an extension that the other parties had signed; in October, the U.S. pulled its scientists out of the ITER work site in Germany. ITER was dead, at least for the United States.

When ITER died, America’s dream of fusion energy was officially deferred. Since the inception of the magnetic fusion effort in the United States, the government had considered it an “energy program”—Congress funded it in hopes of generating energy in the not-too-distant future. As ITER entered its death throes, the Office of Management and Budget changed magnetic fusion research into a “science program.” This meant that the program’s funding was no longer officially tied to the goal of building a fusion power plant. It was just pure research, science for science’s sake. Consequently, it became a lower priority for Congress. An energy program was easy to drum up support for, but pure science was always iffier.

By the turn of the millennium, magnetic fusion was but a shadow of what it had been in the 1980s. The U.S. magnetic fusion budget stabilized at approximately $240 million, which was worth less every year as inflation nibbled away at the value of the dollar. The golden age of magnetic fusion was over in America.

Scientists in Europe, Russia, and Japan struggled to keep the ITER project alive without the United States’ participation. They quickly decided that ITER, as originally envisioned, would be impossible to build. The three parties settled upon an ITER-Lite design. Gone was hope of ignition and sustained burn. Gone was the hope of a great leap toward fusion energy. And without the United States, even a drastically reduced ITER would be decades away.

In the meantime, fusion scientists had to make do with their increasingly obsolete tokamaks. They did their best to put a positive spin on a bad situation. Even as the original plans for ITER were dying, European and Japanese researchers claimed they had finally achieved the long-sought goal of breakeven. It was not as impressive as ignition and sustained burn, but if true, scientists had finally broken the fifty-year-old jinx and gotten more energy out of a controlled fusion reaction than they had put in.

In August 1996 and again in June 1998, researchers at Japan’s JT-60 tokamak insisted that they had achieved “breakeven plasma conditions” and claimed their tokamak was producing 5 watts for every 4 that it consumed. A closer look showed that this wasn’t quite what happened. JT-60 was using a plasma made of deuterium, so the fusion reactions in the plasma were entirely between deuterium and deuterium. These are less energetic than deuterium-tritium reactions. If you really want to get a magnetic fusion reactor producing lots of energy, you will use a mixture of deuterium and tritium as the fuel rather than pure deuterium. JT-60’s “breakeven plasma conditions” did not really mean that the tokamak had reached breakeven. Instead, the JT-60 had reached pressures, temperatures, and confinement times that, according to calculations, would mean breakeven if researchers had used a deuterium-tritium mix rather than just deuterium as fuel. Every time JT-60 reached its “breakeven conditions,” it was still consuming much more energy than it produced. So much for Japan’s claim. What about Europe’s?

JET, the big European tokamak, actually used deuterium-tritium mixtures in attempts to achieve breakeven. In September 1997, scientists loaded such a mixture into the reactor, heated it, compressed it, and . . . and what? What happened? It depends on whom you ask.

Some people insist that JET reached breakeven. Britain’s Parliamentary Office of Science and Technology, for instance, states blandly in a pamphlet that “Breakeven was demonstrated at the JET experiment in the UK in 1997.” This is a myth, just like the myth about JT-60. In truth, JET got 6 watts out for every 10 it put in. It was a record, and a remarkable achievement, but a net loss of 40 percent of energy is not the hallmark of a great power plant. Scientists would claim, after twiddling with the definition of the energy put into the system, that the loss was as little as 10 percent. This might be so, but it still wasn’t breakeven; JET was losing energy, not making it.
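Both claims are easier to judge against the standard figure of merit for fusion experiments, the gain factor Q. The definition below is the conventional one from the fusion literature, not something the chapter itself spells out:

```latex
Q = \frac{P_{\text{fusion out}}}{P_{\text{heating in}}}, \qquad \text{breakeven:}\ Q = 1
```

By this yardstick, JT-60’s “5 watts for every 4” was an extrapolated Q ≈ 1.25 that the deuterium-only machine never actually delivered, while JET’s 6 watts out for every 10 in was a real but sub-breakeven Q = 0.6 (roughly Q = 0.9 under the more charitable accounting of the input energy).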

National magnetic fusion programs are unable to achieve breakeven, let alone ignition and sustained burn. The national tokamaks like JET and JT-60 are reduced to setting lesser records: the highest temperature, the longest confinement, the highest pressure. However, these records are all but meaningless. Without getting beyond breakeven, the dream of a fusion reactor will remain out of reach. All the glowing press releases in the world won’t turn an energy-loss machine into a working fusion reactor.

Laser fusion scientists didn’t suffer nearly as much in the 1990s as their magnetic fusion counterparts. As magnetic fusion budgets sank, laser fusion ones rose, because laser fusion scientists had a secret weapon: nuclear bombs.

Publicly, laser fusion scientists billed their experiments as a way to free the world from its energy problems. What John Emmett, a Livermore laser scientist, declared to Time magazine in 1985 was typical: “Once we crack the problem of fusion, we have an assured source of energy for as long as you want to think about it. It will cease to be a reason for war or an influence on foreign affairs.” Emmett’s optimistic vision was no different from what fusion researchers had been promising since the 1950s. Just like their magnetic fusion counterparts, laser fusion scientists had promised, again and again, unlimited, clean energy. Just like their magnetic fusion counterparts, laser fusion scientists had been disappointed again and again as instabilities and other problems demolished their overly optimistic predictions. Shiva had failed, and by the 1990s, so had Nova. Inertial confinement fusion’s story was paralleling magnetic fusion’s, down to the shattered dreams and broken promises.

Less loudly, though, scientists were pushing laser fusion for a completely different reason. They weren’t really going after unlimited energy: they were pursuing laser fusion as a matter of national security. Without a working laser fusion facility, they argued, America’s nuclear weapons arsenal would be in grave danger. Congress was sold. Even as magnetic fusion scientists were wringing their hands in the mid-1990s, their laser fusion brethren were rolling in money—thanks, in part, to the danger posed by the test ban. On September 23, 1992, the United States detonated its last nuclear bomb, Julin Divider, before ceasing testing altogether. Throughout the 1990s, the world’s nuclear powers were negotiating a permanent ban on nuclear testing. Though a few nations conducted a small number of such tests while the discussions went on, the United States held firm. No nuclear explosions.

Of course, nuclear testing was the way weapons designers evaluated their new warheads; no nuclear testing means no new types of nuclear warheads—more or less. There’s some debate about whether the United States could manufacture slight variants on old weapons designs without resorting to underground detonations. However, it is certain that any sizable design change wouldn’t be considered reliable until it was subjected to a full-scale nuclear test.

It’s not a huge problem if the United States can’t design new nuclear weapons; the ones on hand are sufficient for national security.76 Instead, the test ban presented a more insidious problem. Without periodic nuclear testing, weaponeers argued, they could not be certain that the weapons in the nuclear stockpile would work. Nuclear bombs, like any other machines, decay over time. Their parts age and deteriorate. Since nuclear weapons use exotic radioactive materials, which undergo nuclear decay as well as physical decay, engineers don’t have a firm understanding of how such a device ages. An engineer can mothball a tank or airplane and be certain that it will still function fifty or a hundred years from now. Not so for nuclear warheads. So, to assure the reliability of the nuclear stockpile, engineers would take aged weapons and detonate them to see how well they worked. With a test ban, though, scientists could no longer do this. Many weaponeers insisted there was no way to guarantee that the weapons in the nuclear stockpile would still work in ten or twenty or thirty years.

So what was the government to do? Enter the Science-Based Stockpile Stewardship program. Weapons scientists assured federal officials that with a set of high-tech experimental facilities they could ensure the reliability of the nation’s arsenal. Some facilities would concentrate on the chemical explosives that set off the devices. Some would study how elements like plutonium and uranium respond to shocks. But the jewel in the stockpile stewardship’s crown would be NIF, the National Ignition Facility at the Lawrence Livermore National Laboratory.

NIF is the successor to Nova. According to its designers, NIF, ten times more powerful than Nova, will zap a pellet of deuterium and tritium with 192 laser beams, pouring enough energy into the pellet to achieve breakeven. It will also ignite and have what is called propagating burn: at the center of the pellet, the fuel will begin fusing, and the energy from those fusions will heat the fuel and induce nearby nuclei to fuse. And of course, the fusion will produce more energy than the lasers put in. This is the same promise the designers made with Nova. And Shiva. But while Shiva cost $25 million and Nova cost about $200 million, in the early 1990s NIF was projected to cost more than $600 million. That number increased to more than $1 billion by the time the facility’s construction started in 1997. That was just the beginning.

As late as June 1999, NIF managers swore to the Department of Energy that everything was peachy, that the project, which was scheduled to be finished in 2003, was on budget and on schedule. This was a lie. Within a few months, officials at Livermore had to admit to enormous problems and cost overruns. Some of the issues were simple oversights. The laser facility, for instance, had problems with dust settling on the laser glass. Dust motes would scatter the laser light and burst into flame, etching the glass. To fix this problem, NIF engineers had to start assembling laser components in clean rooms and tote them around by robotic trucks with superclean interiors—at enormous cost. That was just one of the issues that had to be solved with piles of money.

It was as if everything that could possibly go wrong with NIF was, in fact, going wrong. Some of the issues were minor annoyances: a brief delay in construction followed when workers found mammoth bones on the NIF site. Some were major: the glass supplier was having difficulty producing glass pure enough to use in the laser, forcing a revamp of the entire manufacturing process. Some were just bizarre. The head of NIF, Michael Campbell, was forced to resign in 1997 when officials discovered he had lied about earning a PhD from Princeton University.

Some problems were unexpected but easy to deal with, such as an issue with the capacitors, the devices that store the energy used to pump the laser glass. These devices were packed so full of energy that occasionally one would spontaneously vaporize. It would explode, spraying shrapnel around the room. Engineers solved the problem by putting a steel shield around the capacitors; when one exploded, flapper doors would open and the debris would spray toward the floor.

Some problems were more complex. For example, scientists had long since gone from infrared to green to ultraviolet light to reduce the disproportional heating of electrons compared with nuclei, but ultraviolet light at such high intensities was extremely nasty to optics. It would pit anything it came into contact with. The laser would damage itself every time it fired. The solution was less than perfect: at NIF’s full power, the optics will have to be replaced every fifty to one hundred shots or so, an extremely expensive prospect.

Furthermore, scientists were still struggling to deal with the Rayleigh-Taylor instability—the one that turns small imperfections on the surface of the fuel pellet into large mountains and deep valleys, destroying any hope of compressing the fuel to the point of ignition. Not only did scientists have to zap the target very carefully—so that the energy shining on the target was the same intensity on every part of the pellet—they also had to ensure that the pellet was extremely smooth. Even tiny imperfections on its surface would quickly grow and disrupt the collapsing plasma. To have any hope of achieving ignition, NIF’s target pellets—about a millimeter in size—cannot have bumps bigger than fifty nanometers high. It’s a tough task to manufacture such an object and fill it with fuel. Plastics, such as polystyrene, are relatively easy to produce with the required smoothness, but they don’t implode very well when struck with light. Beryllium metal implodes nicely, but it’s hard to make a metal sphere with the required smoothness. It was a really difficult problem that wasn’t getting any easier as NIF scientists worked on it.
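The severity of the smoothness requirement follows from the linear theory of the Rayleigh-Taylor instability; the growth law below is standard plasma physics, not a formula given in the text. A small surface ripple of initial amplitude a₀ and wavenumber k on an interface accelerating at g grows exponentially:

```latex
a(t) \approx a_0\, e^{\gamma t}, \qquad \gamma = \sqrt{A\,k\,g}
```

where A is the Atwood number measuring the density contrast across the interface. Because the growth is exponential, even a fifty-nanometer bump on a millimeter-wide pellet, a relative perturbation of only about 5 × 10⁻⁵, can be amplified through many e-foldings into the mountains and valleys that wreck the implosion.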

The cost of the star-crossed project ballooned from about $1 billion to more than $4 billion; the completion date slipped from 2003 to 2008. Worst of all, even if everything worked perfectly, even if NIF’s lasers delivered the right power on target, nobody knew whether the pellet would ignite and burn. As early as the mid-1990s, outside reviewers, such as the JASON panel of scientists, warned that it was quite unlikely that NIF would achieve breakeven as easily as advertised. The prospects for breakeven grew worse as time passed. By 2000, NIF officials, if pressed, might say that the laser had a fifty-fifty shot of achieving ignition. NIF critics, on the other hand, were much less kind. “From my point of view, the chance that [NIF] reaches ignition is zero,” said Leo Mascheroni, one of NIF’s main detractors. “Not 1%. Those who say 5% are just being generous to be polite.” The truth is probably somewhere in between, but nobody will know for sure until NIF starts doing full-scale experiments with all 192 beams.

If NIF fails to ignite its pellets, and if it fails to reach breakeven, laser fusion experiments will still be absorbing energy rather than producing it; the dream of fusion energy will be just as far away as before.77 Furthermore, analysts argued, NIF wouldn’t be terribly useful for stockpile stewardship without achieving breakeven. And NIF’s contribution to stockpile stewardship is crucial for... what, exactly? It’s hard to say for sure. Assume that NIF achieves ignition. For a brief moment, it compresses, confines, and heats a plasma so that it fuses, the fusion reaction spreads, and it produces more energy than it consumes. How does that translate into assuring the integrity of America’s nuclear stockpile?

At first glance, it is not obvious how it would contribute at all. Most of the problems with aging weapons involve the decay of the plutonium “pits” that start the reaction going. Will the pits work? Are they safe? Can you remanufacture old pits or must you rebuild them from scratch? These issues are relevant only to a bomb’s primary stage, the stage powered by fission, not fusion (except for the slight boost given by the injection of a little fusion fuel at the center of the bomb). The fusion happens in the bomb’s secondary stage, and there doesn’t seem to be nearly as much concern about aging problems with a bomb’s secondary. If the primary is where most of the problems are, what good does it do to study fusion reactions at NIF? NIF’s results would seem to apply mostly to the secondary, not the primary.

Since so much about weapons work is classified, it is hard to see precisely what problems NIF is intended to solve. But some of the people in the know say that NIF has a point. The “JASONs,” for example, argue that NIF does help maintain the stockpile—but not right away. NIF will contribute to science-based stockpile stewardship, the panel wrote in 1996, “but its contribution is almost exclusively to the long-term tasks, not to immediate needs associated with short-term tasks.” That is, NIF will help eventually, but it is not terribly useful in the short term.

What are those long-term tasks? Two years earlier, the JASON panel was a little more explicit. NIF would help a bit with understanding what happens when tritium in a primary’s booster decays. (However, since tritium has a half-life of only twelve years, it stands to reason that weapons designers periodically must replace old tritium in weapons with fresh tritium. This is probably routine by now.) NIF will also help scientists understand the underlying physics and “benchmark” the computer codes—like LASNEX—that simulate imploding and fusing plasma. (But why is this important if you are not designing new weapons? The ones in the stockpile already presumably work just fine, so you presumably don’t need a finer understanding of plasma physics to maintain them.) The JASON members have access to classified information, but even so, their justifications for NIF seem a little thin—at first. And then JASON lists one more contribution that NIF makes to stockpile stewardship: “NIF will contribute to training and retaining expertise in weapons science and engineering, thereby permitting responsible stewardship without further underground tests.” That’s the main reason for NIF.
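The tritium-replacement point is ordinary radioactive-decay arithmetic rather than anything classified. With a half-life of twelve years, the fraction of tritium surviving after t years is

```latex
\frac{N(t)}{N_0} = \left(\frac{1}{2}\right)^{t/12} = e^{-t\ln 2/12}
```

which works out to a loss of roughly 5.6 percent per year. A booster charge left alone for a full half-life would hold only half its original tritium, which is why periodic replacement has to be routine.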

With the moratorium in place, nuclear tests are at an end. New scientists entering the program will never have a chance to design a bomb and test it. They will never have a chance to study a live nuclear explosion. All they have left are computer simulations and experiments that mimic one part of a nuclear explosion. NIF would be the only facility that mimics the explosion of a secondary; it would give young scientists a chance to study secondary physics without ever seeing a nuclear test. And that’s the point of NIF. NIF is essentially a training ground for weapons scientists. As old ones retire and new ones grow up without ever having seen a nuclear test, NIF is a way to give them some level of experience so that America doesn’t lose its nuclear expertise.

NIF isn’t truly about energy. It is not about keeping our stockpile safe, at least not directly. It is about keeping the United States’ weapons community going in the absence of nuclear tests. However, it is contributing next to nothing to the stockpile stewardship program at the moment, and the program is heading toward a crisis. Weaponeers are complaining that the United States is increasingly unable to vouch for its nuclear arsenal, and the government seems to be slowly slouching toward a resumption of nuclear detonations.

A number of ominous signs suggest that nuclear testing might begin again before too long. The debate in the early 2000s about the new Robust Nuclear Earth Penetrator warhead was an indication that the government was thinking beyond the test ban; before deploying the weapon, it almost certainly would need a test. Even though Congress strangled that program, it has blessed the Department of Energy’s campaign to design yet another warhead. The Reliable Replacement Warhead (RRW), as it is called, is supposed to obviate the need for nuclear testing because it would be a hardier device less susceptible to aging. It would be able to assure the reliability of the nation’s nuclear arsenal for decades without nuclear tests. The only problem is that the RRW would probably require a few nuclear tests before anyone was convinced of its reliability in the first place. It’s a paradox: to maintain the nuclear test ban, the United States might have to resume testing.

A debate is also ongoing about shortening the time it will take to prepare the Nevada nuclear test site for a resumption of underground tests. President George W. Bush tried to make the site ready to resume testing within eighteen months, rather than maintain the previous twenty-four-month lead time. But going to a higher level of readiness would announce to the world that the nation was moving toward ending the moratorium, and could hamstring American attempts to stem the proliferation of nuclear weapons around the world. Year after year, the president put money for eighteen-month readiness in the budget; year after year, Congress took it out. Even without the cash, though, the National Nuclear Security Administration, the organization inside the Department of Energy responsible for nuclear weapons, lists eighteen-month test-site readiness as an integral part of the stockpile stewardship program.

The stockpile stewardship program will soon reach a crisis point. Will the federal government be able to assure the reliability of the stockpile without testing nuclear weapons as the program originally promised? Or will it fail, forcing a resumption of testing, breaching the moratorium in place for over a decade? The move toward renewed nuclear testing is happening now, and NIF, if it helps with stockpile stewardship at all, will do so indirectly and in the distant future. The nontesting regime might well be in tatters by the time scientists get any benefit from the multibillion-dollar machine supposedly designed to uphold it.

NIF is the state of the art in laser fusion, yet it is a deeply troubled project. It is vastly more expensive than originally projected. Even if it works perfectly, it won’t keep the country’s nuclear arsenal working or the nontesting policy alive. For a decade, experts have questioned whether it would be sufficiently powerful to achieve ignition and breakeven—and if the history of laser fusion is any guide, NIF, like Nova, will fail to reach its goal. Yet NIF marches on. Laser fusion scientists won’t give up their decades-old dream to put a star in a bottle. And if they fail, as it appears they will, after spending more than $4 billion, there is little hope that they can sucker the government into building yet another bigger and better laser machine.

In 2002, five years after the United States abruptly left the ITER project, fusion scientists were about to get a serious case of déjà vu.

The American departure shook the ITER collaboration—and branded the United States as an unreliable partner when it came to international science—but the project limped along. Russia, Europe, and Japan continued designing an international fusion reactor. The plans they came up with were much less ambitious than the original ITER. The plasma in the reactor would span twelve meters rather than sixteen meters. It would not achieve ignition and sustained burn—the plasma would never be fusing enough to keep itself warm—but if all went well, the reactor would be able to keep a plasma confined for up to an hour and produce ten times as much power as it consumed. (It would finally achieve breakeven—for real, this time.) It would cost half as much as the original ITER: $5 billion, rather than $10 billion.78

The American magnetic fusion program, in the meantime, was in ruins. TFTR, the big domestic tokamak, had been shut down in 1997 to make room for ITER, leaving just a few lesser machines in Boston and in San Diego. Princeton, once home of the $100 million giants, was reduced to working on a tiny, $25 million spherical torus. Plans existed for larger machines, such as billion-dollar tokamaks, but they were just dreams; there was no chance they would be built. The United States was rapidly retreating from the cutting edge of magnetic fusion. Instead of getting a robust domestic program along with an enormous international reactor, American fusion scientists had neither. By 2002, with slim pickings at home, those scientists began to eye the slimmed-down ITER project, argued that many of the design flaws of the original machine had been fixed, and asked to rejoin the collaboration. At a cost of only about $1 billion, they argued, the United States could become an ITER partner again. The request worked its way up the food chain—from the scientists to a fusion advisory panel, to the head of the Department of Energy’s Office of Science, to the secretary of energy, to the president. The answer was yes.

In early 2003, President Bush announced that the United States was back in the collaboration. The Americans would rejoin ITER.79

Even though the machine’s design had been revamped and the collaboration had expanded—China, South Korea, and Canada had joined in—the same problems that haunted the first incarnation of ITER remained. For one thing, the partners were still fighting over where the machine would be built.

Japan and Europe were the main contenders. Each attacked the other’s proposal. Japan complained that the proposed European site in the south of France was too far from a port. The French argued that the Japanese site was prone to earthquakes. Most scientists in the United States understandably seemed to prefer a laboratory a short drive from the French Riviera to one near a dismal brackish lake in the north of Japan, but the United States officially backed the Japanese site. Some Europeans hinted, darkly, that American support of Japan over France was political payback for France’s criticism of the Iraq war. The Japanese accused the Europeans of circulating a nasty anonymous memo to the ITER parties that faulted the Japanese choice of site. China and Russia backed France. Canada pulled out of the collaboration entirely. Europe threatened to do so as well. In early 2005, two years after the United States had reentered the collaboration, ITER was deadlocked and on the brink of unraveling once again.

Back at the Capitol, Congress once again was getting very annoyed at the delay—and another old debate reopened. American fusion scientists started bickering about whether it was wise to decimate the domestic fusion program to fund an international reactor. The Department of Energy slashed its domestic programs to finance ITER; Congress restored the domestic funds and threatened to completely cut off money for the international reactor. ITER was about to collapse entirely.

Luckily for ITER’s backers, the Japanese blinked just in time. Japan agreed that the French would host the reactor, but in return Europe would pay half the reactor’s cost and would use Japanese companies for many of its manufacturing contracts. Furthermore, Japan would get to host a $600 million facility devoted to researching advanced materials for fusion reactors, materials that could withstand the intense heat and radiation inside a tokamak as well as reduce the amount of radioactive waste when the reactor vessel needed to be replaced. The debate was over. ITER would be sited in Cadarache, France. The American government, for its part, managed to find a way to fund its share: the fusion budget was increased to support ITER as well as the (modest) domestic program. India joined the collaboration. Everything seemed to be hunky-dory again.

On November 21, 2006, representatives of the seven ITER partner states signed the formal agreement. Everybody took the opportunity to wax poetic about what fusion power meant for the future. French president Jacques Chirac bubbled about ITER as a “hand held out to future generations”:

The ambition is huge! To control nuclear fusion. To control the tremendous amount of energy generated at one hundred million degrees and to design sufficiently resistant materials for the purpose. To produce as much energy from a litre of seawater as a litre of oil or a kilo of coal.

It is a glorious vision. Unlimited energy—a tiny star bottled in a magnetic jar—would liberate mankind from the fear of global warming and from the impending energy crisis.

If ITER fails, it will probably mean the end of tokamaks. The likelihood of using magnets to confine and heat a plasma would seem slimmer than ever. However, there’s no reason to assume that ITER, like generations of machines before it, will be a disappointment. If nothing goes wrong, ITER will begin experiments in 2018 or so.80 And if ITER works as planned when scientists turn it on, it will light the way to a fusion reactor. If, miraculously, no more instabilities crop up that prevent scientists from bottling their plasma, fusion energy will be within reach. Scientists would then build a demonstration fusion power plant that would begin operations in 2035 or 2040. After five decades of broken promises, lies, delusions, and self-deception, it will finally be true. Fusion energy will be thirty years away.