
Bad Astronomy: Misconceptions and Misuses Revealed, from Astrology to the Moon Landing “Hoax” - Philip Plait (2002)

Part I. Bad Astronomy Begins at Home

Chapter 3. Idiom's Delight: Bad Astronomy in Everyday Language


One of the reasons I loved astronomy when I was a kid was because of the big numbers involved. Even the nearest astronomical object, the Moon, was 400,000 kilometers away! I would cloister myself in my room with a pencil and paper, and painstakingly convert that number into all kinds of different units like feet, inches, centimeters, and millimeters. It was fun, even though it branded me as a geek. That's all changed, of course. As an adult I use a computer to be a geek a million times faster than I ever could when I was a kid.

The fun really was in the big numbers. Unfortunately, the numbers get too big too fast. Venus, the nearest planet to the Earth, never gets closer than 42 million kilometers from us. The Sun is 150,000,000 (150 million) kilometers away on an average day, and Pluto is about 6,000,000,000 (6 billion) kilometers away. The nearest star to the Sun that we know of, Proxima Centauri, is a whopping 40,000,000,000,000 (40 trillion) kilometers away! Try converting that to centimeters. You'll need a lot of zeros.

There is a way around using such unwieldy numbers. Compare these two measurements: (1) I am 17,780,000,000 Angstroms tall. (2) I am 1.78 meters tall. Clearly (2) is a much better way to express my height. An Angstrom is a truly dinky unit: 100 million of them would fit across a single centimeter. Angstroms are used to measure the sizes of atoms and the wavelengths of light, and they are too awkward to use for anything else.
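As a quick back-of-the-envelope check of that conversion (a sketch, assuming only the standard definition that one Angstrom is 10^-10 meters):

```python
# One Angstrom is 1e-10 meters by definition, so 100 million of them
# (1e8) fit across a single centimeter, as claimed above.
ANGSTROM_M = 1e-10

height_m = 1.78
height_angstroms = height_m / ANGSTROM_M

print(f"{height_m} m is {height_angstroms:,.0f} Angstroms")
print(f"Angstroms per centimeter: {0.01 / ANGSTROM_M:,.0f}")
```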

The point is that you can make things easy on yourself if you change your unit to something appropriate for the distances involved. In astronomy, though, there aren't many units that are big enough. But there is one that's pretty convenient: light. Light travels very fast, so fast that no one could accurately measure its speed until the nineteenth century. We now know it travels about 300,000 kilometers every second. That's nearly a million times the speed of sound! No wonder no one could measure it for so long.

So, astronomers use light itself as a big unit. It took the Apollo astronauts 3 days to go to the Moon in their slowpoke capsule, but it takes a beam of light just 1.3 seconds to make the same trip. So we say the Moon is 1.3 light-seconds away. Light from the Sun takes about 8 minutes to reach us; the Sun is 8 light-minutes away. Distant Pluto is about 6 light-hours away.
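Those light-travel times follow directly from dividing distance by speed. A minimal sketch, using the rounded figures quoted earlier in the chapter (c of roughly 300,000 km/s):

```python
# Light-travel time = distance / speed of light.
C_KM_S = 300_000  # speed of light in km/s, rounded as in the text

distances_km = {
    "Moon": 400_000,          # ~1.3 light-seconds
    "Sun": 150_000_000,       # ~8 light-minutes
    "Pluto": 6_000_000_000,   # ~6 light-hours
}

for body, d_km in distances_km.items():
    t = d_km / C_KM_S  # seconds
    print(f"{body}: {t:.1f} s = {t / 60:.2f} min = {t / 3600:.2f} h")
```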

A light-minute or light-hour may be useful for solar system work, but it's small potatoes on the scale of our Galaxy; light simply doesn't travel far enough in a minute. For galactic work you need the light-year, the distance a beam of light travels in one year. It's equal to about 10 trillion kilometers, which is a long way. Proxima Centauri is 4.2 light-years away; the light leaving a presidential inauguration won't reach Proxima Centauri until after the president's four-year term has ended!
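The 10-trillion-kilometer figure is easy to verify (a sketch, again assuming c of about 300,000 km/s and a 365.25-day year):

```python
C_KM_S = 300_000
SECONDS_PER_YEAR = 365.25 * 24 * 3600

light_year_km = C_KM_S * SECONDS_PER_YEAR
print(f"1 light-year is about {light_year_km:.2e} km")  # ~9.5 trillion km

# Proxima Centauri's 4.2 light-years, in kilometers:
print(f"Proxima Centauri: about {4.2 * light_year_km:.2e} km")  # ~40 trillion km
```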

The light-year is the standard yardstick of astronomers. The problem is that pesky word "year." If you're not familiar with the term, you might think it's a time unit like an hour or a day. Worse, since it's an astronomical term, people think it's a really long time, like it's a lot of years. It isn't. It's a distance.

That doesn't stop its misuse. The phrase "light-years ahead" is a common advertising slogan used to represent how advanced a product is, as if it's way ahead of its time.

I can picture some advertising executive meeting with his team, telling them that saying their product is "years more advanced than the competition" just doesn't cut it. One member of the ad team timidly raises a hand and says, "How about if we say 'light-years' instead?"

It sounds good, I'll admit. But it's wrong. And more bad astronomy is born.

Worse, one Internet service provider even claims it's "light-years faster than a regular connection." They're using it as a speed!

Not surprisingly, Hollywood is a real offender here. In the first Star Wars movie, for example, Han Solo brags to Obi Wan Kenobi and Luke Skywalker that he could make the Kessel Run in "less than twelve parsecs." Like a light-year, a parsec is another unit of distance used by astronomers; it's equal to 3.26 light-years (that may sound like a silly unit, but it's actually based on an angular measure using the size of the Earth's orbit). Han's claim is like runners saying that they run a 10-kilometer race in 8 kilometers! It doesn't make sense. Astute fans of Star Wars may notice that Obi Wan gets a pained look on his face when Han says that line. Maybe he is wincing at his pilot's braggadocio; I choose to think Obi Wan knows his units.
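That parenthetical definition of the parsec can be made concrete with a little trigonometry: it's the distance at which the radius of Earth's orbit (one astronomical unit) appears one arcsecond across. A sketch, using standard rounded values for the AU and the light-year rather than figures from the text:

```python
import math

AU_KM = 1.496e8          # Earth-Sun distance, ~150 million km
LIGHT_YEAR_KM = 9.46e12  # ~10 trillion km

one_arcsec = math.radians(1 / 3600)        # one arcsecond, in radians
parsec_km = AU_KM / math.tan(one_arcsec)   # distance at which 1 AU spans 1 arcsec

print(f"1 parsec is about {parsec_km:.3e} km")
print(f"         or about {parsec_km / LIGHT_YEAR_KM:.2f} light-years")  # ~3.26
```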


If you go far from the lights of a city on a clear night and wait long enough, chances are you'll see a shooting star. The proper name for it is a meteor. Of course, meteors aren't stars at all. They are tiny bits of gravel or dust that have evaporated off the surface of comets during their long voyages around the Sun. Some are the shrapnel from collisions between asteroids. Most of them are very small; an average one is about the size of a grain of sand.

While they are out in space, these specks are called meteoroids. They orbit the Sun as the Earth does, and sometimes their paths cross ours. When one does, the little piece of flotsam enters our atmosphere, and the enormous pressure generated by its passage through our air heats it so much that it glows. That glow is what we call a meteor. If a piece survives all the way to the ground, it's called a meteorite.

These three names cause a lot of confusion. A meteoroid glows as a meteor when it moves through the air, and it becomes a meteorite when it hits the ground. I got into an argument once with a friend about what to call meteors during various parts of their travel. I said they are meteorites when they hit the ground. He asked, "What if they hit a house and stop on the second floor?" I countered that the house is in direct contact with the Earth, so it's still a meteorite. He rebutted by asking, "What if it hit an airplane and stopped?"

I had to scratch my head over that one. Is it a meteorite when the plane lands? What if the plane crashes? At this point we decided we were being silly, and decided to just go outside and look for meteors. That may have saved our friendship.

Anyway, meteors start off in space and then fall to the Earth. They appear dramatically, flashing into our view, and burn out suddenly as they descend through the atmosphere toward the ground, sometimes leaving a long trail of glowing ash behind them. They start off bright, then fade away.

Enter bad astronomy. I was reading a major metropolitan newspaper one day and was amused when it referred to a Russian official's "meteoric rise" in the political structure of that country. Of course, the reporter meant that the official appeared out of nowhere and made a quick, brilliant rise to the top of his heap. The real meaning of the phrase, however, is just the opposite: were we to be literal, the official would have made a sudden, eye-catching appearance in the political arena and then quickly burned himself out as he descended the ranks. He may have left a trail behind him, and even made quite an impact in the end!


I had the misfortune one morning to wake up to the radio playing the song "Dream Weaver." I'll admit I used to love that song when I was a kid, but as a friend of mine likes to say, "We are not responsible for songs we liked when we were 15 years old." Anyway, as the tired, hackneyed verses went on, one in particular caught my ear: "Fly me away to the bright side of the moon, and meet me on the other side."

Of course, there is a bright side of the Moon, and you can go to it. But if you sit still, you can only be there for two weeks, max. The bright side, and therefore the dark side as well, is not a fixed place, but appears to move as the Moon rotates.

Seen from the surface of the Earth, the Moon does not appear to rotate. It seems to show the same face to us all the time. Actually it does spin; it's just that it spins once for every time it goes around the Earth. Its rotation teams up with its revolution in such a way that it always shows that one face to us. We call that face the near side of the Moon. The other side, the one we never see, is called the far side. The far side of the Moon has only been seen by probes or by astronauts who have actually orbited the Moon. Since it's remote and not well known, the far side of the Moon has become synonymous with something terribly far away or unexplored.

The problem is, people confuse the far side with the dark side. You almost never hear the phrase "far side of the Moon." It's always the "dark side of the Moon." This phrase isn't really wrong, but it is inaccurate.

Like the Earth, the Moon spins. The Earth spins once every 24 hours, so that someone standing on its surface sees the Sun go up and down once a day. As seen from outside the Earth, that person is on the dark side of the Earth when he or she is on the half that is facing away from the Sun. But the dark side is not a permanent feature! Wait a few hours, and the Earth spins enough to bring that person back into the sunlight. He or she is now on the bright side of the Earth. No part of the Earth is on the dark side forever.

The same goes for the Moon, except its day is about 29 of our Earth days long. Someone on the Moon will see the Sun set two weeks after it rises! Since half the Moon is in sunlight and half in darkness, there is technically a dark side to the Moon, but it changes as the Moon rotates. Except near the poles, a single point on the Moon is in sunlight for two weeks, then in darkness for two weeks.
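The 29-day figure deserves a closer look, because the Moon actually spins once every 27.3 days relative to the stars; the extra couple of days come from the Moon circling the Sun along with the Earth, so the Sun takes a little longer to return to the same spot in the lunar sky. A sketch of the arithmetic (the 27.32-day sidereal period is a standard value, not a figure from the text):

```python
SIDEREAL_SPIN_DAYS = 27.32  # one rotation relative to the stars
YEAR_DAYS = 365.25

# Sunrise-to-sunrise on the Moon: the spin rate minus the rate at which
# the Moon (with the Earth) moves around the Sun.
lunar_solar_day = 1 / (1 / SIDEREAL_SPIN_DAYS - 1 / YEAR_DAYS)

print(f"Lunar day: about {lunar_solar_day:.1f} Earth days")           # ~29.5
print(f"Daylight at one spot: about {lunar_solar_day / 2:.1f} days")  # ~two weeks
```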

You can see that the dark side of the Moon is simply the night side of the Moon. It is no more a fixed feature than the night side of the Earth. Sometimes the far side is the dark side, but sometimes it's the bright side. It just depends on when you look.

One of the best-selling music albums of all time is Pink Floyd's Dark Side of the Moon. It may be popular, but astronomically it's in eclipse.

Incidentally, at the end of that album there is a quiet voiceover: "There is no dark side of the moon. As a matter of fact, it's all dark." In a sense that line is correct: the Moon is actually very dark, reflecting less than 10 percent of the sunlight that hits it. That makes it about as dark as slate! The reason it looks so bright is that it sits in full sunlight, so there's a lot of light hitting it. Ironically, even though six Apollo missions landed on the near side of the Moon, they explored only the tiniest fraction of the surface. In essence, even the near side of the Moon is largely unexplored, and it's still very far away.

Now, to be honest, there may be a part of the Moon that's always dark. Near the poles there are deep craters with raised rims around them. In that region the Sun always sits near the horizon, just as it does at the poles on Earth. Since the craters on the Moon can be deep, the Sun may always be hidden by the rim of the crater, and sunlight never reaches the bottom. There is tantalizing evidence of ice at the bottom of such craters, untouched by the warming rays of the Sun. If it's real, there are two major implications. One is that the ice can be used by lunar colonists for air and water, eliminating the need to carry those along from Earth. That would save a vast amount of money, fuel, and effort.

The other implication is that the phrase "dark side of the Moon" actually has a limited truth to it, at least as far as the dark crater bottoms go! Maybe I need to start a "Not-So-Bad Astronomy" web site.


Sometimes the advertising executives we discussed earlier aren't satisfied with being "light-years ahead" of their competitors. They come up with a product so revolutionary that it leaves the others in the dust. It's more than light-years ahead, it's a whole new product. How to describe it?

Sometimes they say it's a "quantum leap" ahead of the others. But how big a leap is that, really?

The nature of matter has been a mystery for thousands of years (and really, it still is). Contrary to our modern bias that ancient people were not as smart as we are now, the ancient Greeks theorized about the existence of atoms. The thinker Democritus deduced that if you split a rock in half, then do it again, and again, and again, eventually you might come to a point where you simply cannot split it any more. That tiniest part he called an atom, meaning "indivisible."

This knowledge was interesting but of no fundamental meaning until thousands of years later. The advent of better technology let us investigate these tiny atoms. At first, it was thought that the atom looked like a solid little ball, but experiments soon showed that there were two separate parts: a nucleus in the middle made of particles called protons and neutrons, and an outer part containing particles called electrons. One model had the atom looking like a miniature solar system, with the nucleus acting like the Sun and the electrons orbiting like little planets.

This model sparked a flurry of science-fiction stories in which the solar system itself was just an atom in a greater universe of matter. This concept was really just a model, not designed to be a true picture of reality. Nevertheless, the idea still persists today in much of the public's mind.

However, the model turned out to be incorrect. At the very beginning of the twentieth century, a new physics was born. It was called quantum mechanics, and it postulated a horde of weird theories. One of them is that electrons are not free to orbit as they wish but instead are confined to specific distances from the nucleus. These distances are like steps in a staircase. You can be on the bottom step, or on the second or third step, but you can't be on the second-and-a-half step; there isn't any such place. If you are on the bottom step and try to get to the second, either you have enough energy to get there or you stay put.

So it goes for electrons. They stick to their specific orbit unless they get enough energy to jump to the next one. If even 99 percent of the energy needed to jump comes their way, they cannot do it. They need exactly the right amount to move to that next step, that next level. This jump became known as a quantum leap.

In reality, a quantum leap is a teeny-tiny jump. The distances are fantastically small, measured as billionths of a centimeter or less.

So you might conclude that an ad bragging about a product being a quantum leap over other products is silly, since it means it's ahead by only a few billionths of a centimeter!
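For a sense of scale, the simple Bohr model of hydrogen (an idealization; real electrons occupy fuzzy orbitals, not circular orbits) puts the electron's orbit radius at n-squared times the Bohr radius, which is about 5.29 x 10^-9 centimeters. A sketch of the jump from the first level to the second:

```python
A0_CM = 5.29e-9  # Bohr radius, in centimeters

# In the Bohr model, the orbit radius scales as n squared.
r1 = 1**2 * A0_CM  # ground state
r2 = 2**2 * A0_CM  # first excited state

jump_cm = r2 - r1
print(f"A literal quantum leap: about {jump_cm:.1e} cm")  # ~1.6e-8 cm
```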

You might be surprised to find out that I have no problem with this phrase. I don't think it's bad at all! The actual distance jumped may be small, but only on our scale. To an electron it truly is a quantum leap, a sudden jump from one stage to the next. The phrase itself has nothing to do with the absolute distance the electron moved, but everything to do with its being a major leap forward, skipping the intervening space and landing in a new spot far ahead of where it was.

Sometimes people say that when something is easy, it isn't exactly rocket science. But in this case, maybe it is!