Death by Black Hole: And Other Cosmic Quandaries - Neil deGrasse Tyson (2014)

SECTION 3. WAYS AND MEANS OF NATURE

Chapter 19. FIRE AND ICE

When Cole Porter composed “Too Darn Hot” for his 1948 Broadway musical Kiss Me Kate, the temperature he was bemoaning was surely no higher than the mid-nineties. No harm in taking Porter’s lyrics as an authoritative source on the upper temperature limit for comfortable lovemaking. Combine that with what a cold shower does to most people’s erotic urges, and you now have a pretty good estimate of how narrow the comfort zone is for the unclothed human body: a range of about 30 degrees Fahrenheit, with room temperature just about in the middle.

The universe is a whole other story. How does a temperature of 100,000,000,000,000,000,000,000,000,000,000 degrees grab you? That’s a hundred thousand billion billion billion degrees. It also happens to be the temperature of the universe a teeny fraction of a second after the big bang—a time when all the energy and matter and space that would turn into planets, petunias, and particle physicists was an expanding fiery ball of quark-gluon plasma. Nothing you’d call a thing could exist until there was a multibillion-fold cooling of the cosmos.

As the laws of thermodynamics decree, within about one second after the big bang, the expanding fireball had cooled to 10 billion degrees and ballooned from something smaller than an atom to a cosmic colossus about a thousand times the size of our solar system. By the time three minutes had passed, the universe was a balmy billion degrees and was already hard at work making the simplest atomic nuclei. Expansion is the handmaiden to cooling, and the two have continued, unabated, ever since.

Today the average temperature of the universe is 2.73 degrees Kelvin. All the temperatures mentioned so far, aside from the ones that involve the human libido, are stated in degrees Kelvin. The Kelvin degree, known simply as the kelvin, was conceived to be the same temperature interval as the Celsius degree, but the Kelvin scale has no negative numbers. Zero is zero, period. In fact, to quash all doubts, zero on the Kelvin scale is dubbed absolute zero.

The Scottish engineer and physicist William Thomson, later and better known as Lord Kelvin, first articulated the idea of a coldest possible temperature in 1848. Laboratory experiments haven’t gotten there yet. As a matter of principle, they never will, although they’ve come awfully close. The unarguably cold temperature of 0.0000000005 K (or 500 picokelvins, as metric mavens would say) was artfully achieved in 2003 in the lab of Wolfgang Ketterle, a physicist at MIT.

Outside the laboratory, cosmic phenomena span a staggering range of temperatures. Among the hottest places in the universe today is the core of a blue supergiant star during the hours of its collapse. Just before it explodes as a supernova, creating drastic neighborhood-warming effects, its temperature hits 100 billion K. Compare that with the Sun’s core: a mere 15 million K.

Surfaces are much cooler. The skin of a blue supergiant checks in at about 25,000 K—hot enough, of course, to glow blue. Our Sun registers 6,000 K—hot enough to glow white, and hot enough to melt and then vaporize anything in the periodic table of elements. The surface of Venus is 740 K, hot enough to fry the electronics normally used to drive space probes.

Considerably further down the scale is the freezing point of water, 273.15 K, which looks downright warm compared with the 60 K surface of Neptune, nearly 3 billion miles from the Sun. Colder still is Triton, one of Neptune’s moons. Its icy nitrogen surface sinks to 40 K, making it the coldest place in the solar system this side of Pluto.

Where do Earth-beings fit in? The average body temperature of humans (traditionally 98.6 degrees F) registers slightly above 310 on the Kelvin scale. Officially recorded surface temperatures on Earth range from a summer high of 331 K (136 F, at Al ‘Aziziyah, Libya, in 1922) to a winter low of 184 K (-129 F, at Base Vostok, Antarctica, in 1983). But people can’t survive unassisted at those extremes. We suffer hyperthermia in the Sahara if we don’t have shelter from the heat, and hypothermia in the Arctic if we don’t have boatloads of clothing and caravans of food. Meanwhile, Earth-dwelling extremophile microorganisms, both thermophilic (heat-loving) and psychrophilic (cold-loving), are variously adapted to temperatures that would fry us or freeze us. Viable yeast, wearing no clothes at all, has been discovered in 3-million-year-old Siberian permafrost. A species of bacterium locked in Alaskan permafrost for 32,000 years woke up and started swimming as soon as its medium melted. And at this very moment, assorted species of archaea and bacteria are living out their lives in boiling mud, bubbling hot springs, and undersea volcanoes.

Even complex organisms can survive in similarly astonishing circumstances. When provoked, the itsy-bitsy invertebrates known as tardigrades can suspend their metabolism. In that state, they can survive temperatures of 424 K (303 degrees F) for several minutes and 73 K (-328 degrees F) for days on end, making them hardy enough to endure being stranded on Neptune. So the next time you need space travelers with the “right stuff,” you might want to choose yeast and tardigrades, and leave your astronauts, cosmonauts, and taikonauts* at home.

IT’S COMMON TO confuse temperature with heat. Heat is the total energy of all the motions of all the molecules in your substance of choice. It so happens that, within the mixture, the range of energies is large: some molecules move quickly, others move slowly. Temperature simply measures their average energy. For example, a cup of freshly brewed coffee may have a higher temperature than a heated swimming pool, but all the water in the pool holds vastly more heat than does the lone cup of coffee. If you rudely pour your 200-degree coffee into the 100-degree pool, the pool won’t suddenly become 150 degrees. And whereas two people in a bed are a source of twice as much heat as one person in a bed, the average temperatures of their two bodies—98.6 and 98.6—do not normally add up to an undercover oven whose temperature is 197.2 degrees.
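For readers who like to check the arithmetic, here is a minimal sketch in Python of why the pool barely notices the coffee. The masses assigned to the cup and the pool are illustrative assumptions; the point is that, for the same substance, the final temperature is a mass-weighted average, not a sum.

```python
# Why pouring 200-degree coffee into a 100-degree pool doesn't make a
# 150-degree pool: with the same substance (water) on both sides, the
# final temperature is a mass-weighted average, so the enormous mass of
# the pool dominates. The masses below are illustrative assumptions.

coffee_mass_kg = 0.25          # one cup of coffee (assumed)
pool_mass_kg = 500_000.0       # a modest heated pool (assumed)

coffee_temp_f = 200.0
pool_temp_f = 100.0

# Heat is extensive: total thermal energy scales with mass.
# Temperature is intensive: mixing gives a weighted average, not a sum.
final_temp_f = (coffee_mass_kg * coffee_temp_f + pool_mass_kg * pool_temp_f) / (
    coffee_mass_kg + pool_mass_kg
)

print(f"Final pool temperature: {final_temp_f:.5f} F")
# -> barely above 100 F; the cup's heat is negligible next to the pool's.
```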

Scientists in the seventeenth and eighteenth centuries considered heat to be closely linked with combustion. And combustion, as they understood it, happened when phlogiston, a hypothetical earthlike substance characterized mainly by its combustibility, was removed from an object. Burn a log in the fireplace, air carries off the phlogiston, and the dephlogisticated log reveals itself as a pile of ashes.

By the late eighteenth century the French chemist Antoine-Laurent Lavoisier had replaced phlogiston theory with caloric theory. Lavoisier classified heat, which he called caloric, as one of the chemical elements, and contended that it was an invisible, tasteless, odorless, weightless fluid that passed between objects through combustion or rubbing. The concept of heat was not fully understood until the nineteenth century, the peak of the industrial revolution, when the broader concept of energy took shape within the new branch of physics called thermodynamics.

ALTHOUGH HEAT as a scientific idea posed plenty of challenges to brilliant minds, both scientists and nonscientists have intuitively grasped the concept of temperature for millennia. Hot things have a high temperature. Cold things have a low temperature. Thermometers confirm the connection.

Although Galileo is often credited with the invention of the thermometer, the earliest such device may have been built by the first-century A.D. inventor Heron of Alexandria. Heron’s book Pneumatica includes a description of a “thermoscope,” a device that showed the change in the volume of a gas as it was heated or cooled. Like many other ancient texts, Pneumatica was translated into Latin during the Renaissance. Galileo read it in 1594 and, as he later did when he learned of the newly invented telescope, he immediately constructed a better thermoscope. Several of his contemporaries did the same.

For a thermometer, scale is crucial. There’s a curious tradition, beginning early in the eighteenth century, of calibrating the temperature units in such a way that common phenomena get assigned fraction-friendly numbers with many divisors. Isaac Newton proposed a scale from zero (melting snow) to 12 (the human body); 12 is, of course, evenly divisible by 2, 3, 4, and 6. The Danish astronomer Ole Rømer offered a scale from zero to 60 (60 being divisible by 2, 3, 4, 5, 6, 10, 12, 15, 20, and 30). On Rømer’s scale, zero was the lowest temperature he could achieve with a mixture of ice, salt, and water; 60 was the boiling point of water.

In 1724 a German instrument maker named Daniel Gabriel Fahrenheit (who developed the mercury thermometer in 1714) came up with a more precise scale, splitting each degree of Rømer’s into four equal parts. On the new scale, water boiled at 240 degrees and froze at 30, and human body temperature was about 90. After further adjustments, the span from zero to body temperature became 96 degrees, another winner in the divisibility department (its divisors are 2, 3, 4, 6, 8, 12, 16, 24, 32, and 48). The freezing point of water became 32 degrees. Still further tuning and standardization saddled fans of the Fahrenheit scale with a body temperature that isn’t a round number, and a boiling point of 212 degrees.
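As a sketch of the scale arithmetic in that history: the snippet below multiplies Rømer-style readings by four and lists the divisors of 96. The Rømer values themselves are back-computed from the Fahrenheit figures quoted above, so treat them as assumptions for illustration.

```python
# Sketch of the scale arithmetic described above.
# Fahrenheit's early scale split each Rømer degree into four parts,
# i.e., multiplied Rømer readings by 4. The Rømer values here are
# back-computed from the Fahrenheit numbers in the text.

romer_points = {"water freezes": 7.5, "body temperature": 22.5, "water boils": 60.0}
early_fahrenheit = {name: 4 * deg for name, deg in romer_points.items()}
print(early_fahrenheit)
# -> {'water freezes': 30.0, 'body temperature': 90.0, 'water boils': 240.0}

# 96, the later zero-to-body-temperature span, earns its divisibility praise:
divisors_of_96 = [d for d in range(2, 96) if 96 % d == 0]
print(divisors_of_96)
# -> [2, 3, 4, 6, 8, 12, 16, 24, 32, 48]
```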

Following a different path, in 1742 the Swedish astronomer Anders Celsius proposed a decimal-friendly centigrade scale for temperature. He set the freezing point at 100 and the boiling point at zero. This was not the first or last time an astronomer labeled a scale backward. Somebody, quite possibly the chap who manufactured Celsius’s thermometers, did the world a favor and reversed the numbering, giving us the now-familiar Celsius scale.

The number zero seems to have a crippling effect on some people’s comprehension. One night a couple of decades ago, while I was on winter break from graduate school and was staying at my parents’ house north of New York City, I turned on the radio to listen to classical music. A frigid Canadian air mass was advancing on the Northeast, and the announcer, between movements of George Frideric Handel’s Water Music, continually tracked the descending outdoor temperature: “Five degrees Fahrenheit.” “Four degrees.” “Three degrees.” Finally, sounding distressed, he announced, “If this keeps up, pretty soon there’ll be no temperature left!”

In part to avoid such embarrassing examples of innumeracy, the international community of scientists uses the Kelvin temperature scale, which puts zero in the right place: at the absolute bottom. Any other location for zero is arbitrary and does not lend itself to play-by-play arithmetic commentary.

Several of Kelvin’s predecessors, by measuring the shrinking volume of a gas as it cooled, had established -273.15 degrees Celsius (-459.67 degrees F) as the temperature at which the molecules of any substance have the least possible energy. Other experiments showed that -273.15 C is the temperature at which a gas, when kept at constant pressure, would drop to zero volume. Since there is no such thing as a gas with zero volume, -273.15 C became the unattainable lower limit of the Kelvin scale. And what better term to use for it than “absolute zero”?
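A short sketch of that extrapolation, assuming an ideal gas at constant pressure and a handful of made-up sample temperatures: fit a straight line to volume versus Celsius temperature and ask where the volume would reach zero.

```python
# Sketch of the extrapolation behind absolute zero: at constant pressure,
# an ideal gas's volume falls linearly with temperature. Fit a line to a
# few volume measurements (simulated here from the ideal-gas relation,
# at assumed sample temperatures) and ask where the volume hits zero.

import numpy as np

def ideal_gas_volume(temp_celsius, volume_at_0c=1.0):
    # Charles's law at constant pressure: volume proportional to absolute temperature.
    return volume_at_0c * (temp_celsius + 273.15) / 273.15

temps_c = np.array([0.0, 25.0, 50.0, 75.0, 100.0])    # assumed sample points
volumes = ideal_gas_volume(temps_c)

slope, intercept = np.polyfit(temps_c, volumes, 1)     # straight-line fit
zero_volume_temp = -intercept / slope                  # where the line hits V = 0

print(f"Extrapolated zero-volume temperature: {zero_volume_temp:.2f} C")
# -> about -273.15 C, the unattainable floor that the Kelvin scale calls zero.
```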

THE UNIVERSE AS a whole acts somewhat like a gas. If you force a gas to expand, it cools. Back when the universe was a mere half-million years old, the cosmic temperature was about 3,000 K. Today it is less than 3 K. Inexorably expanding toward thermal oblivion, the present-day universe is a thousand times larger, and a thousand times cooler, than the infant universe.
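Those figures follow a simple rule of thumb: the background temperature runs inversely with the size of the universe. A minimal sketch, using the numbers quoted above:

```python
# Sketch of the expansion-cooling rule implied above: the background
# temperature scales inversely with the size of the universe, so growing
# a thousandfold since the roughly 3,000 K era leaves about 3 K today,
# and each future doubling in size halves the temperature again.

def background_temp(relative_size, temp_now_kelvin=2.73):
    # relative_size = 1.0 is today; 0.001 is the universe at one-thousandth
    # its present size; 2.0 is a future universe twice as large.
    return temp_now_kelvin / relative_size

print(background_temp(1 / 1000))   # ~2,730 K: a thousand times smaller, a thousand times hotter
print(background_temp(1.0))        # 2.73 K: today
print(background_temp(2.0))        # ~1.37 K: after the universe doubles in size
print(background_temp(4.0))        # ~0.68 K: after it doubles again
```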

On Earth, you normally measure temperatures by cramming a thermometer into a creature’s orifice or letting the thermometer touch an object in some other, less intrusive way. This form of direct contact enables the moving molecules within the thermometer to reach the same average energy as the molecules in the object. When a thermometer sits idle in the air instead of performing its labors inside a rib roast, it’s the average speed of the colliding air molecules that tells the thermometer what temperature to register.

Speaking of air, at a given time and place on Earth the air temperature in full sunlight is basically the same as the air temperature under a nearby tree. What the shade does is shield you from the Sun’s radiant energy, nearly all of which passes unabsorbed through the atmosphere and lands on your skin, making you feel hotter than the air would by itself. But in empty space, where there is no air, there are no moving molecules to trigger a thermometer reading. So the question “What is the temperature of space?” has no obvious meaning. With nothing touching it, the thermometer can only register the radiant energy from all the light, from all sources, that lands upon it.

On the daytime side of our airless Moon, a thermometer would register 400 K (260 degrees F). Move a few feet into the shadow of a boulder, or journey to the Moon’s night side, and the thermometer would instantly drop to 40 K (-390 degrees F). To survive a lunar day without wearing a temperature-controlled space suit, you would have to do pirouettes, alternately baking and then cooling all sides of your body, just to maintain a comfortable temperature.

WHEN THE GOING gets really cold and you want to absorb maximum radiant energy, wear something dark rather than reflective. The same holds for a thermometer. Rather than debate how to dress it in space, assume the thermometer can be made perfectly absorbent. If you now place it in the middle of nowhere, such as halfway between the Milky Way and the Andromeda galaxy, far from all obvious sources of radiation, the thermometer will settle at 2.73 K, the current background temperature of the universe.

A recent consensus among cosmologists holds that the universe will expand forever and ever. By the time the cosmos doubles in size, its temperature will drop by half. By the time it doubles again, its temperature will halve once more. With the passage of trillions of years, all the remaining gas will have been used to make stars, and all the stars will have exhausted their thermonuclear fuels. Meanwhile, the temperature of the expanding universe will continue to descend, approaching ever closer to absolute zero.