Death by Black Hole: And Other Cosmic Quandaries - Neil deGrasse Tyson (2014)


Chapter 12. SPEED LIMITS

Including the space shuttle and Superman, a few things in life travel faster than a speeding bullet. But nothing moves faster than the speed of light in a vacuum. Nothing. Yet as fast as light moves, its speed is decidedly not infinite. Because light has a speed, astrophysicists know that looking out in space is the same as looking back in time. And with a good estimate for the speed of light, we can come close to a reasonable estimate for the age of the universe.

These concepts are not exclusively cosmic. True, when you flick on a wall switch, you don’t have to wait around for the light to reach the floor. Some morning while you’re eating breakfast and you need something new to think about, though, you might want to ponder the fact that you see your kids across the table not as they are but as they once were, about three nanoseconds ago. Doesn’t sound like much, but stick the kids in the nearby Andromeda galaxy, and by the time you see them spoon their Cheerios they will have aged more than 2 million years.
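For readers who like to check the arithmetic, the three-nanosecond figure is just distance divided by the speed of light. A quick sketch, assuming a one-meter table (the table width is an illustrative choice, not a figure from the text):

```python
# How stale is the view of your kids across the breakfast table?
# Light travel time = distance / speed of light.

C = 299_792_458      # speed of light in a vacuum, m/s
TABLE_WIDTH_M = 1.0  # assumed width of the breakfast table, meters

table_delay_ns = TABLE_WIDTH_M / C * 1e9  # convert seconds to nanoseconds
print(f"Across the table: {table_delay_ns:.2f} nanoseconds")  # ~3.34 ns
```

A light-year is defined as the distance light travels in a year, so the Andromeda half of the claim needs no computation at all: a galaxy roughly 2.5 million light-years away is seen roughly 2.5 million years in the past.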

Minus its decimal places, the speed of light through the vacuum of space, in Americanized units, is 186,282 miles per second—a quantity that took centuries of hard work to measure with such high precision. Long before the methods and tools of science reached maturity, however, deep thinkers had thought about the nature of light: Is light a property of the perceiving eye or an emanation from an object? Is it a bundle of particles or a wave? Does it travel or simply appear? If it travels, how fast and how far?

IN THE MID-FIFTH century B.C. a forward-thinking Greek philosopher, poet, and scientist named Empedocles of Acragas wondered if light might travel at a measurable speed. But the world had to wait for Galileo, a champion of the empirical approach to the acquisition of knowledge, to illuminate the question through experiment.

He describes the steps in his book Dialogues Concerning Two New Sciences, published in 1638. In the dark of night, two people, each holding a lantern whose light can be rapidly covered and uncovered, stand far apart from each other, but in full view. The first person briefly flashes his lantern. The instant the second person sees the light, he flashes his own lantern. Having done the experiment just once, at a distance of less than a mile, Galileo writes:

I have not been able to ascertain with certainty whether the appearance of the opposite light was instantaneous or not; but if not instantaneous it is extraordinarily rapid—I should call it momentary. (p. 43)

Fact is, Galileo’s reasoning was sound, but he stood much too close to his assistant to time the passage of a light beam, particularly with the imprecise clocks of his day.

A few decades later the Danish astronomer Ole Rømer diminished the speculation by observing the orbit of Io, the innermost moon of Jupiter. Ever since January 1610, when Galileo and his brand-new telescope first caught sight of Jupiter’s four brightest and largest satellites, astronomers had been tracking the Jovian moons as they circled their huge host planet. Years of observations had shown that, for Io, the average duration of one orbit—an easily timed interval from the moon’s disappearance behind Jupiter, through its reemergence, to the beginning of its next disappearance—was just about 42.5 hours. What Rømer discovered was that when Earth was closest to Jupiter, Io disappeared about 11 minutes earlier than expected, and when Earth was farthest from Jupiter, Io disappeared about 11 minutes later.

Rømer reasoned that Io’s orbital behavior was not likely to be influenced by the position of Earth relative to Jupiter, and so surely the speed of light was to blame for any unexpected variations. The 22-minute range must correspond to the time needed for light to travel across the diameter of Earth’s orbit. From that assumption, Rømer derived a speed of light of about 130,000 miles a second. That’s within 30 percent of the correct answer—not bad for a first-ever estimate, and a good deal more accurate than Galileo’s “If not instantaneous….”
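Rømer's arithmetic is easy to redo. A sketch using the modern value for Earth's orbital diameter (an assumption on my part; Rømer's lower answer of about 130,000 miles per second reflects the cruder seventeenth-century estimate of the Earth-Sun distance):

```python
# Rømer's logic: the 22-minute range in Io's timings is the time light
# needs to cross the diameter of Earth's orbit.

EARTH_SUN_MILES = 92.96e6             # modern mean Earth-Sun distance, miles
orbit_diameter = 2 * EARTH_SUN_MILES  # ~186 million miles
delay_seconds = 22 * 60               # the 22-minute timing range, in seconds

speed = orbit_diameter / delay_seconds
print(f"Implied speed of light: {speed:,.0f} miles per second")  # ~141,000
```

With the correct orbital diameter the same reasoning lands within a few percent of the true value, which shows how sound the method itself was.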

James Bradley, the third Astronomer Royal of Great Britain, laid to rest nearly all remaining doubts that the speed of light was finite. In 1725 Bradley systematically observed the star Gamma Draconis and noticed a seasonal shift in the star’s position on the sky. It took him three years to figure it out, but he eventually credited the shift to the combination of Earth’s continuous orbital movement and the finite speed of light. Thus did Bradley discover what is known as the aberration of starlight.

Imagine an analogy: It’s a rainy day, and you’re sitting inside a car stuck in dense traffic. You’re bored, and so (of course) you hold a big test tube out the window to collect raindrops. If there’s no wind, the rain falls vertically; to collect as much water as possible, you hold the test tube in a vertical position. The raindrops enter at the top and fall straight to the bottom.

Finally the traffic clears, and your car hits the speed limit again. You know from experience that the vertically falling rain will now leave diagonal streaks on the car’s side windows. To capture the raindrops efficiently, you must now tip the test tube to the angle that matches the rain streaks on the windows. The faster the car moves, the larger the angle.

In this analogy, the moving Earth is the moving car, the telescope is the test tube, and incoming starlight, because it does not move instantaneously, can be likened to the falling rain. So to catch the light of a star, you’ll have to adjust the angle of the telescope—aim it at a point that’s slightly different from the actual position of the star on the sky. Bradley’s observation may seem a bit esoteric, but he was the first to confirm—through direct measurement rather than by inference—two major astronomical ideas: that light has a finite speed and that Earth is in orbit around the Sun. He also improved on the accuracy of light’s measured speed, giving 187,000 miles per second.
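The tilt angle in the rain analogy is arctan(v/c), where v is the speed of the car—or, for the telescope, Earth's orbital speed. A sketch with the modern orbital speed of about 29.8 km/s (a standard value, not quoted in the text):

```python
import math

# The aberration angle: how far you must tip the telescope, given
# Earth's orbital speed and the speed of light.

C_KM_S = 299_792.458  # speed of light, km/s
V_ORBIT_KM_S = 29.8   # Earth's mean orbital speed, km/s

angle_rad = math.atan(V_ORBIT_KM_S / C_KM_S)
angle_arcsec = math.degrees(angle_rad) * 3600  # degrees to arcseconds
print(f"Aberration angle: {angle_arcsec:.1f} arcseconds")  # ~20.5
```

Twenty arcseconds is tiny—about one percent of the width of the full Moon—which is why the effect waited until 1725 to be noticed.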

BY THE LATE nineteenth century, physicists were keenly aware that light—just like sound—propagates in waves, and they presumed that if traveling sound waves need a medium (such as air) in which to vibrate, then light waves need a medium too. How else could a wave move through the vacuum of space? This mystical medium was named the “luminiferous ether,” and the physicist Albert A. Michelson, working with chemist Edward W. Morley, took on the task of detecting it.

Earlier, Michelson had invented an apparatus known as an interferometer. One version of this device splits a beam of light and sends the two parts off at right angles. Each part bounces off a mirror and returns to the beam splitter, which recombines the two beams for analysis. The precision of the interferometer enables the experimenter to make extremely fine measurements of any differences in the speeds of the two light beams: the perfect device for detecting the ether. Michelson and Morley thought that if they aligned one beam with the direction of Earth’s motion and made the other transverse to it, the first beam’s speed would combine with Earth’s motion through the ether, while the second beam’s speed would remain unaffected.
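Under the ether hypothesis, the beam aligned with Earth's motion should lag the transverse beam by roughly L(v/c)²/c to first order, shifting the interference fringes when the apparatus is rotated. A sketch of the expected effect, using typical textbook values for the 1887 apparatus (the arm length and wavelength below are assumptions, not figures from the text):

```python
# Predicted Michelson-Morley fringe shift under the ether hypothesis.

C = 3.0e8            # speed of light, m/s
V = 3.0e4            # Earth's orbital speed through the supposed ether, m/s
L = 11.0             # effective arm length after multiple reflections, m
WAVELENGTH = 5.9e-7  # sodium light, m

beta2 = (V / C) ** 2
# Rotating the apparatus 90 degrees swaps the roles of the two arms,
# doubling the observable effect: shift = 2 * L * beta^2 / wavelength.
fringe_shift = 2 * L * beta2 / WAVELENGTH
print(f"Expected fringe shift: {fringe_shift:.2f} fringes")  # about 0.37
```

The apparatus could resolve shifts far smaller than this prediction, which is why the null result carried such weight.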

Turns out, M & M got a null result. Going in two different directions made no difference to the speed of either light beam; they returned to the beam splitter at exactly the same time. Earth’s motion through the ether simply had no effect on the measured speed of light. Embarrassing. If the ether was supposed to enable the transmission of light, yet it couldn’t be detected, maybe the ether didn’t exist at all. Light turned out to be self-propagating: neither medium nor magic was needed to move a beam from one position to another in the vacuum. Thus, with a swiftness approaching the speed of light itself, the luminiferous ether entered the graveyard of discredited scientific ideas.

And thanks to his ingenuity, Michelson also further refined the value for the speed of light, to 186,400 miles per second.

BEGINNING IN 1905, investigations into the behavior of light got positively spooky. That year, Einstein published his special theory of relativity, in which he ratcheted up M & M’s null result to an audacious level. The speed of light in empty space, he declared, is a universal constant, no matter the speed of the light-emitting source or the speed of the person doing the measuring.

What if Einstein is right? For one thing, if you’re in a spacecraft traveling at half the speed of light and you shine a light beam straight ahead of the spacecraft, you and I and everybody else in the universe who measures the beam’s speed will find it to be 186,282 miles per second. Not only that, even if you shine the light out the back, top, or sides of your spacecraft, we will all continue to measure the same speed.


Common sense says that if you fire a bullet straight ahead from the front of a moving train, the bullet’s ground speed is the speed of the bullet plus the speed of the train. And if you fire the bullet straight backward from the back of the train, the bullet’s ground speed will be its own minus that of the train. All that is true for bullets, but not, according to Einstein, for light.
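Special relativity replaces the common-sense addition rule with a formula of its own: two collinear speeds u and v combine as (u + v)/(1 + uv/c²). For bullets the correction is immeasurably small, but light fired from the half-light-speed spacecraft still comes out at exactly c. A sketch:

```python
# Einstein's velocity-addition rule versus the common-sense one.

C = 299_792_458.0  # speed of light, m/s

def combine(u, v):
    """Relativistic addition of two collinear speeds u and v."""
    return (u + v) / (1 + u * v / C**2)

ship = 0.5 * C  # spacecraft at half the speed of light
print(f"Beam from the ship: {combine(ship, C):,.0f} m/s")  # still 299,792,458
print(f"Bullet (400 m/s) from the ship: {combine(ship, 400.0):,.0f} m/s")
```

The bullet's combined speed falls about 100 m/s short of the naive sum—utterly undetectable for trains and bullets, which is why common sense never noticed.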

Einstein was right, of course, and the implications are staggering. If everyone, everywhere and at all times, is to measure the same speed for the beam from your imaginary spacecraft, a number of things have to happen. First of all, as the speed of your spacecraft increases, the length of everything—you, your measuring devices, your spacecraft—shortens in the direction of motion, as seen by everyone else. Furthermore, your own time slows down exactly enough so that when you haul out your newly shortened yardstick, you are guaranteed to be duped into measuring the same old constant value for the speed of light. What we have here is a cosmic conspiracy of the highest order.
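The conspiracy can be put in numbers. At speed v, lengths along the direction of motion contract—and clocks slow—by the factor √(1 − v²/c²), as seen by an outside observer. A sketch for the half-light-speed spacecraft:

```python
import math

# The Lorentz contraction factor sqrt(1 - (v/c)^2) for a given
# fraction of light speed.

def contraction_factor(beta):
    """Length-contraction (and time-dilation) factor for v = beta * c."""
    return math.sqrt(1 - beta**2)

factor = contraction_factor(0.5)  # spacecraft at half the speed of light
print(f"At 0.5c, a 1-meter yardstick measures {factor:.3f} m")  # 0.866
```

At half the speed of light the effect is already 13 percent; at everyday speeds the factor differs from 1 by less than a part in a trillion.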

IMPROVED METHODS OF measuring soon added decimal place upon decimal place to the speed of light. Indeed, physicists got so good at the game that they eventually dealt themselves out of it.

Units of speed always combine units of length and time—50 miles per hour, for instance, or 800 meters per second. When Einstein began his work on special relativity, the definition of the second was coming along nicely, but definitions of the meter were completely clunky. As of 1791, the meter was defined as one ten-millionth the distance from the North Pole to the equator along the line of longitude that passes through Paris. And after earlier efforts to make this work, in 1889 the meter was redefined as the length of a prototype bar made of platinum-iridium alloy, stored at the International Bureau of Weights and Measures in Sèvres, France, and measured at the temperature at which ice melts. In 1960, the basis for defining the meter shifted again, and the exactitude increased further: 1,650,763.73 wavelengths, in a vacuum, of light emitted by the unperturbed atomic energy-level transition 2p10 to 5d5 of the krypton-86 isotope. Obvious, when you think about it.

Eventually it became clear to all concerned that the speed of light could be measured far more precisely than could the length of the meter. So in 1983 the General Conference on Weights and Measures decided to define—not measure, but define—the speed of light at the latest, best value: 299,792,458 meters per second. In other words, the definition of the meter was now forced into units of the speed of light, turning the meter into exactly 1/299,792,458 of the distance light travels in one second in a vacuum. And so tomorrow, anyone who measures the speed of light even more precisely than the 1983 value will be adjusting the length of the meter, not the speed of light itself.
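The successive definitions can be checked against one another. The krypton-86 line chosen in 1960 has a vacuum wavelength of about 605.780 nanometers (a standard value, not quoted in the text), so the stated count of wavelengths should multiply out to one meter:

```python
# Consistency check on the 1960 definition of the meter:
# 1,650,763.73 wavelengths of the krypton-86 line should equal 1 meter.

KR86_WAVELENGTH_M = 605.780_210e-9  # krypton-86 orange line, vacuum, meters
N_WAVELENGTHS = 1_650_763.73        # the count fixed by the 1960 definition

meter_1960 = N_WAVELENGTHS * KR86_WAVELENGTH_M
print(f"1960 meter from the krypton line: {meter_1960:.6f} m")  # 1.000000
```

Since 1983 no such check is needed in one direction: the meter is, by definition, the distance light covers in 1/299,792,458 of a second, so any sharper measurement refines the meter, not the speed.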

Don’t worry, though. Any refinements in the speed of light will be too small to show up in your school ruler. If you’re an average European guy, you’ll still be slightly less than 1.8 meters tall. And if you’re an American, you’ll still be getting the same bad gas mileage in your SUV.

THE SPEED OF LIGHT may be astrophysically sacred, but it’s not immutable. In all transparent substances—air, water, glass, and especially diamonds—light travels more slowly than it does in a vacuum.

But the speed of light in a vacuum is a constant, and for a quantity to be truly constant it must remain unchanged, regardless of how, when, where, or why it is measured. The light-speed police take nothing for granted, though, and in the past several years they have sought evidence of change in the 13.7 billion years since the big bang. In particular, they’ve been measuring the so-called fine-structure constant, which is a combination of the speed of light in a vacuum and several other physical constants, including Planck’s constant, pi, and the charge of an electron.

This derived constant is a measure of the small shifts in the energy levels of atoms, which affect the spectra of stars and galaxies. Since the universe is a giant time machine, in which one can see the distant past by looking at distant objects, any change in the value of the fine-structure constant with time would reveal itself in observations of the cosmos. For cogent reasons, physicists don’t expect Planck’s constant or the charge of an electron to vary, and pi will certainly keep its value—which leaves only the speed of light to blame if discrepancies arise.
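In SI units the fine-structure constant works out to α = e²/(4πε₀ℏc), a dimensionless number near 1/137. A sketch using CODATA values (the vacuum permittivity ε₀ appears in the SI form of the formula, though the text lists only the headline ingredients):

```python
import math

# The fine-structure constant from its SI ingredients:
# alpha = e^2 / (4 * pi * eps0 * hbar * c).

E_CHARGE = 1.602_176_634e-19  # elementary charge, C (exact since 2019)
EPS0 = 8.854_187_8128e-12     # vacuum permittivity, F/m
HBAR = 1.054_571_817e-34      # reduced Planck constant, J*s
C = 299_792_458               # speed of light, m/s

alpha = E_CHARGE**2 / (4 * math.pi * EPS0 * HBAR * C)
print(f"alpha = {alpha:.6e}  (1/alpha = {1/alpha:.3f})")  # 1/alpha ~ 137.036
```

Because α is dimensionless, a genuine change in it over cosmic time could not be blamed on a change of units—which is exactly why it is the quantity the light-speed police monitor.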

One of the ways astrophysicists calculate the age of the universe assumes that the speed of light has always been the same, so a variation in the speed of light anywhere in the cosmos is not just of passing interest. But as of January 2006, physicists’ measurements show no evidence for a change in the fine-structure constant across time or across space.