The Battery: How Portable Power Sparked a Technological Revolution - Henry Schlesinger (2010)

Chapter 18. Always On

“Lashing out the action, returning the reaction…Battery is here to stay!”

—Metallica

Through the 1980s and 1990s batteries seemed to keep pace with all of the consumer gadgets and gizmos entering the market. New battery chemistries were coming into use, and the technique by which manufacturers rolled battery components tightly to increase density and create the different cells—known in the industry as a jelly roll—allowed them to pack more and more power into smaller packages. A jelly roll D cell can increase electrode surface area to an impressive thirty square inches. For a long while this seemed to work, and consumer electronics settled into four principal battery chemistries: alkaline, nickel-cadmium (NiCd), nickel-metal hydride (NiMH), and lithium-ion (Li-ion).
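
To make the geometry concrete, here is a rough back-of-the-envelope sketch (in Python) comparing the electrode area of a simple bobbin-style D cell with that of an unrolled jelly-roll strip. The strip and can dimensions are illustrative assumptions, not figures from the text.

import math

IN_PER_CM = 1 / 2.54  # centimeters to inches

# Bobbin-style D cell: the working electrode area is roughly the inner
# wall of the can, a cylinder about 3.3 cm across and 5.5 cm tall (assumed).
bobbin_area_sq_in = math.pi * 3.3 * 5.5 * IN_PER_CM ** 2

# Jelly-roll D cell: the same can holds a long, thin electrode strip,
# assumed here to unroll to roughly 40 cm by 5 cm.
jelly_roll_area_sq_in = (40 * 5) * IN_PER_CM ** 2

print(f"bobbin-style electrode area: {bobbin_area_sq_in:.0f} sq in")    # about 9 sq in
print(f"jelly-roll electrode area:   {jelly_roll_area_sq_in:.0f} sq in")  # about 31 sq in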

Electronic devices evolved at an amazing rate during the 1980s and 1990s, driven, at least in part, by Moore’s Law. Battery engineers, still working under the iron rule of Faraday’s Law, weren’t as lucky. They found themselves constantly trading off between energy density, longevity, and size. The problem was keeping up as the use of portable products increased.

Alkalines are more efficient than standard dry cells, but are primarily suited to low-drain tasks like television remotes and toys. With devices such as portable music players seeing increased use, what was needed was a rechargeable battery.

The first generation of serious rechargeables included the NiCds, which were popular for a while; however, they were not only toxic but also prone to the notorious memory effect—if you charged them before they were completely discharged, their ability to hold a charge would diminish precipitously. Fortunately for all concerned, they’re in the process of being phased out in favor of the more environmentally friendly NiMH for things like digital cameras and electric razors.

AND THEN THERE’S LITHIUM, WHICH now powers a good portion of our electronics but has always proved problematic. First discovered in 1800 in a Swedish iron mine in the form of petalite ore, or lithium aluminum silicate, lithium came to the attention of scientists in a less than pure state. It took more than a decade before a young chemist, Johan Arfwedson (sometimes spelled Arfvedson), working in the lab of Jöns Jakob Berzelius, famous for figuring out atomic weights and devising the system of chemical notation we use today, managed even to classify it. Because the substance seemed alkaline, he gave it a misleading name derived from lithos (Greek for stone) to distinguish it from salts found in organic matter, like plants. In fact, it was an alkali metal.

Not much happened until Humphry Davy, with his supercharged voltaic pile, and another member of the Royal Institution, W. T. Brande, managed to isolate it through electrolysis. What they found when they applied a good jolt of electricity to lithium chloride was a very reactive silvery metal that was highly flammable and quickly oxidized when exposed to air. The lightest, least dense solid metal, lithium is very much a metal with attitude. It didn’t take long to realize that you couldn’t leave the stuff lying around the lab like, say, lead or copper. You had to store it in oil.

Chemists loved lithium, calling it the “mystery metal,” though its uses seemed limited. It wasn’t until the 1960s that the idea of the lithium battery gained some traction for use in pacemakers. Then, in the 1970s Exxon researchers began working on lithium batteries in earnest, as did Lew Urry, who already had the alkaline battery to his credit at Eveready.

The potential benefits of a lithium battery were obvious—high voltage, high energy density, and a chemical sidestep around the Faraday’s Law limits holding traditional batteries back. The possibility for extended battery life and higher charges was there, but was it worth the effort? Did the consumer market really need a new battery, particularly one made up of this strange, highly flammable metal? After a few initial efforts, America’s leading battery companies abandoned lithium altogether. Transistor radios, flashlights, and all manner of toys ran just fine with the standard batteries. Going lithium would require retooling existing plants or making enormous investments in new ones. And, too, lithium wasn’t the kind of material you wanted stored in a warehouse. For battery manufacturers it was an easy decision: let someone else do it.

It was around that time that the government, the army, the navy, and even NASA stepped in by funding research while the FAA established safety guidelines. If high output and long life weren’t absolute necessities for consumer products, they were certainly welcomed for things like emergency locators for planes and satellites and on the battlefield. Unfortunately, lithium had not gotten any easier to work with over the years. Several fires and at least one death were reported. Progress was made, but the batteries remained a highly specialized product. That is, until Sony and the Asahi Chemical Company entered the picture.

Picking up the research where American scientists left off, they brought the first lithium-ion (Li-ion) battery to market in the early 1990s. Very much the right battery at the right time, Li-ion batteries presented a major technological advance. Not only did they not contain lithium in its dangerous metallic form—just the ions—they also relied on a solid-state chemical reaction, which meant very little self-discharge; the cells could sit around for a long time before going “bad.” Engineers began calling the new configuration a “rocking chair” battery for the way the lithium ions rocked back and forth between the two electrodes. Li-ion batteries seemed a perfect match for the new age of portability that very rapidly evolved from the AA-powered Walkman to laptops, cell phones, iPods, and PDAs. Lightweight and moldable to fit most devices, they also don’t suffer from the dreaded memory effect. Although they generally don’t last beyond three years, neither do most of the products they power.

In a little more than a decade, extended life and higher charges had progressed from a nice luxury to a decisive issue. Sony was soon joined by South Korean manufacturers and even Chinese companies as the Far East became the center of Li-ion battery manufacturing.

What is interesting is that American companies were not “beaten” in the rechargeable battery market by low wages; rather, they seem to have decided not to participate aggressively on a large scale. Some place the blame on the vertical integration of the Asian electronics manufacturers or on the low profit margins compared to primary batteries.

This state of affairs has caused no little concern among electronics manufacturers in the United States. While America continues to act as an innovator of the new technologies, there is some question as to how long that will last. Asian manufacturers have not only perfected the manufacturing processes but are also funneling significant amounts of capital into their R&D programs. All of this has happened quietly, in large part because batteries are increasingly black-boxed, hidden away from the consumer, without brand names or the benefit of advertising. Still integral but unseen, batteries are on the verge of some very large technological advances.