The Grand Design - Stephen Hawking, Leonard Mlodinow (2010)

Chapter 3. What Is Reality?

A FEW YEARS AGO the city council of Monza, Italy, barred pet owners from keeping goldfish in curved goldfish bowls. The measure’s sponsor explained the measure in part by saying that it is cruel to keep a fish in a bowl with curved sides because, gazing out, the fish would have a distorted view of reality. But how do we know we have the true, undistorted picture of reality? Might not we ourselves also be inside some big goldfish bowl and have our vision distorted by an enormous lens? The goldfish’s picture of reality is different from ours, but can we be sure it is less real?

The goldfish view is not the same as our own, but goldfish could still formulate scientific laws governing the motion of the objects they observe outside their bowl. For example, due to the distortion, a freely moving object that we would observe to move in a straight line would be observed by the goldfish to move along a curved path. Nevertheless, the goldfish could formulate scientific laws from their distorted frame of reference that would always hold true and that would enable them to make predictions about the future motion of objects outside the bowl. Their laws would be more complicated than the laws in our frame, but simplicity is a matter of taste. If a goldfish formulated such a theory, we would have to admit the goldfish’s view as a valid picture of reality.

A famous example of different pictures of reality is the model introduced around AD 150 by Ptolemy (ca. 85–ca. 165) to describe the motion of the celestial bodies. Ptolemy published his work in a thirteen-book treatise usually known under its Arabic title, Almagest. The Almagest begins by explaining reasons for thinking that the earth is spherical, motionless, positioned at the center of the universe, and negligibly small in comparison to the distance of the heavens. Despite Aristarchus’s heliocentric model, these beliefs had been held by most educated Greeks at least since the time of Aristotle, who believed for mystical reasons that the earth should be at the center of the universe. In Ptolemy’s model the earth stood still at the center and the planets and the stars moved around it in complicated orbits involving epicycles, like wheels on wheels.

This model seemed natural because we don’t feel the earth under our feet moving (except in earthquakes or moments of passion). Later European learning was based on the Greek sources that had been passed down, so that the ideas of Aristotle and Ptolemy became the basis for much of Western thought. Ptolemy’s model of the cosmos was adopted by the Catholic Church and held as official doctrine for fourteen hundred years. It was not until 1543 that an alternative model was put forward by Copernicus in his book De revolutionibus orbium coelestium (On the Revolutions of the Celestial Spheres), published only in the year of his death (though he had worked on his theory for several decades).

Copernicus, like Aristarchus some seventeen centuries earlier, described a world in which the sun was at rest and the planets revolved around it in circular orbits. Though the idea wasn’t new, its revival was met with passionate resistance. The Copernican model was held to contradict the Bible, which was interpreted as saying that the planets moved around the earth, even though the Bible never clearly stated that. In fact, at the time the Bible was written people believed the earth was flat. The Copernican model led to a furious debate as to whether the earth was at rest, culminating in Galileo’s trial for heresy in 1633 for advocating the Copernican model, and for thinking “that one may hold and defend as probable an opinion after it has been declared and defined contrary to the Holy Scripture.” He was found guilty, confined to house arrest for the rest of his life, and forced to recant. He is said to have muttered under his breath “Eppur si muove,” “But still it moves.” In 1992 the Roman Catholic Church finally acknowledged that it had been wrong to condemn Galileo.

So which is real, the Ptolemaic or Copernican system? Although it is not uncommon for people to say that Copernicus proved Ptolemy wrong, that is not true. As in the case of our normal view versus that of the goldfish, one can use either picture as a model of the universe, for our observations of the heavens can be explained by assuming either the earth or the sun to be at rest. Despite its role in philosophical debates over the nature of our universe, the real advantage of the Copernican system is simply that the equations of motion are much simpler in the frame of reference in which the sun is at rest.

A different kind of alternative reality occurs in the science fiction film The Matrix, in which the human race is unknowingly living in a simulated virtual reality created by intelligent computers to keep them pacified and content while the computers suck their bioelectrical energy (whatever that is). Maybe this is not so far-fetched, because many people prefer to spend their time in the simulated reality of websites such as Second Life. How do we know we are not just characters in a computer-generated soap opera? If we lived in a synthetic imaginary world, events would not necessarily have any logic or consistency or obey any laws. The aliens in control might find it more interesting or amusing to see our reactions, for example, if the full moon split in half, or everyone in the world on a diet developed an uncontrollable craving for banana cream pie. But if the aliens did enforce consistent laws, there is no way we could tell there was another reality behind the simulated one. It would be easy to call the world the aliens live in the “real” one and the synthetic world a “false” one. But if—like us—the beings in the simulated world could not gaze into their universe from the outside, there would be no reason for them to doubt their own pictures of reality. This is a modern version of the idea that we are all figments of someone else’s dream.

These examples bring us to a conclusion that will be important in this book: There is no picture- or theory-independent concept of reality. Instead we will adopt a view that we will call model-dependent realism: the idea that a physical theory or world picture is a model (generally of a mathematical nature) and a set of rules that connect the elements of the model to observations. This provides a framework with which to interpret modern science.

Philosophers from Plato onward have argued over the years about the nature of reality. Classical science is based on the belief that there exists a real external world whose properties are definite and independent of the observer who perceives them. According to classical science, certain objects exist and have physical properties, such as speed and mass, that have well-defined values. In this view our theories are attempts to describe those objects and their properties, and our measurements and perceptions correspond to them. Both observer and observed are parts of a world that has an objective existence, and any distinction between them has no meaningful significance. In other words, if you see a herd of zebras fighting for a spot in the parking garage, it is because there really is a herd of zebras fighting for a spot in the parking garage. All other observers who look will measure the same properties, and the herd will have those properties whether anyone observes them or not. In philosophy that belief is called realism.

Though realism may be a tempting viewpoint, as we’ll see later, what we know about modern physics makes it a difficult one to defend. For example, according to the principles of quantum physics, which is an accurate description of nature, a particle has neither a definite position nor a definite velocity unless and until those quantities are measured by an observer. It is therefore not correct to say that a measurement gives a certain result because the quantity being measured had that value at the time of the measurement. In fact, in some cases individual objects don’t even have an independent existence but rather exist only as part of an ensemble of many. And if a theory called the holographic principle proves correct, we and our four-dimensional world may be shadows on the boundary of a larger, five-dimensional space-time. In that case, our status in the universe is analogous to that of the goldfish.

Strict realists often argue that the proof that scientific theories represent reality lies in their success. But different theories can successfully describe the same phenomenon through disparate conceptual frameworks. In fact, many scientific theories that had proven successful were later replaced by other, equally successful theories based on wholly new concepts of reality.

Traditionally those who didn’t accept realism have been called anti-realists. Anti-realists suppose a distinction between empirical knowledge and theoretical knowledge. They typically argue that observation and experiment are meaningful but that theories are no more than useful instruments that do not embody any deeper truths underlying the observed phenomena. Some anti-realists have even wanted to restrict science to things that can be observed. For that reason, many in the nineteenth century rejected the idea of atoms on the grounds that we would never see one. George Berkeley (1685-1753) even went as far as to say that nothing exists except the mind and its ideas. When a friend remarked to English author and lexicographer Dr. Samuel Johnson (1709-1784) that Berkeley’s claim could not possibly be refuted, Johnson is said to have responded by walking over to a large stone, kicking it, and proclaiming, “I refute it thus.” Of course the pain Dr. Johnson experienced in his foot was also an idea in his mind, so he wasn’t really refuting Berkeley’s ideas. But his act did illustrate the view of philosopher David Hume (1711-1776), who wrote that although we have no rational grounds for believing in an objective reality, we also have no choice but to act as if it is true.

Model-dependent realism short-circuits all this argument and discussion between the realist and anti-realist schools of thought.

According to model-dependent realism, it is pointless to ask whether a model is real, only whether it agrees with observation. If there are two models that both agree with observation, like the goldfish’s picture and ours, then one cannot say that one is more real than another. One can use whichever model is more convenient in the situation under consideration. For example, if one were inside the bowl, the goldfish’s picture would be useful, but for those outside, it would be very awkward to describe events from a distant galaxy in the frame of a bowl on earth, especially because the bowl would be moving as the earth orbits the sun and spins on its axis.

We make models in science, but we also make them in everyday life. Model-dependent realism applies not only to scientific models but also to the conscious and subconscious mental models we all create in order to interpret and understand the everyday world. There is no way to remove the observer—us—from our perception of the world, which is created through our sensory processing and through the way we think and reason. Our perception—and hence the observations upon which our theories are based—is not direct, but rather is shaped by a kind of lens, the interpretive structure of our human brains.

Model-dependent realism corresponds to the way we perceive objects. In vision, one’s brain receives a series of signals down the optic nerve. Those signals do not constitute the sort of image you would accept on your television. There is a blind spot where the optic nerve attaches to the retina, and the only part of your field of vision with good resolution is a narrow area of about 1 degree of visual angle around the retina’s center, an area the width of your thumb when held at arm’s length. And so the raw data sent to the brain are like a badly pixelated picture with a hole in it. Fortunately, the human brain processes that data, combining the input from both eyes, filling in gaps on the assumption that the visual properties of neighboring locations are similar and interpolating. Moreover, it reads a two-dimensional array of data from the retina and creates from it the impression of three-dimensional space. The brain, in other words, builds a mental picture or model.

The brain is so good at model building that if people are fitted with glasses that turn the images in their eyes upside down, their brains, after a time, change the model so that they again see things the right way up. If the glasses are then removed, they see the world upside down for a while, then again adapt. This shows that what one means when one says “I see a chair” is merely that one has used the light scattered by the chair to build a mental image or model of the chair. If the model is upside down, with luck one’s brain will correct it before one tries to sit on the chair.

Another problem that model-dependent realism solves, or at least avoids, is the meaning of existence. How do I know that a table still exists if I go out of the room and can’t see it? What does it mean to say that things we can’t see, such as electrons or quarks—the particles that are said to make up the proton and neutron—exist? One could have a model in which the table disappears when I leave the room and reappears in the same position when I come back, but that would be awkward, and what if something happened when I was out, like the ceiling falling in? How, under the table-disappears-when-I-leave-the-room model, could I account for the fact that the next time I enter, the table reappears broken, under the debris of the ceiling? The model in which the table stays put is much simpler and agrees with observation. That is all one can ask.

In the case of subatomic particles that we can’t see, electrons are a useful model that explains observations like tracks in a cloud chamber and the spots of light on a television tube, as well as many other phenomena. It is said that the electron was discovered in 1897 by British physicist J. J. Thomson at the Cavendish Laboratory at Cambridge University. He was experimenting with currents of electricity inside empty glass tubes, a phenomenon known as cathode rays. His experiments led him to the bold conclusion that the mysterious rays were composed of minuscule “corpuscles” that were material constituents of atoms, which were then thought to be the indivisible fundamental unit of matter. Thomson did not “see” an electron, nor was his speculation directly or unambiguously demonstrated by his experiments. But the model has proved crucial in applications from fundamental science to engineering, and today all physicists believe in electrons, even though you cannot see them.

Quarks, which we also cannot see, are a model to explain the properties of the protons and neutrons in the nucleus of an atom. Though protons and neutrons are said to be made of quarks, we will never observe a quark because the binding force between quarks increases with separation, and hence isolated, free quarks cannot exist in nature. Instead, they always occur in groups of three (protons and neutrons), or in pairings of a quark and an anti-quark (pi mesons), and behave as if they were joined by rubber bands.

The question of whether it makes sense to say quarks really exist if you can never isolate one was a controversial issue in the years after the quark model was first proposed. The idea that certain particles were made of different combinations of a few sub-subnuclear particles provided an organizing principle that yielded a simple and attractive explanation for their properties. But although physicists were accustomed to accepting particles that were only inferred to exist from statistical blips in data pertaining to the scattering of other particles, the idea of assigning reality to a particle that might be, in principle, unobservable was too much for many physicists. Over the years, however, as the quark model led to more and more correct predictions, that opposition faded. It is certainly possible that some alien beings with seventeen arms, infrared eyes, and a habit of blowing clotted cream out their ears would make the same experimental observations that we do, but describe them without quarks. Nevertheless, according to model-dependent realism, quarks exist in a model that agrees with our observations of how subnuclear particles behave.

Model-dependent realism can provide a framework to discuss questions such as: If the world was created a finite time ago, what happened before that? An early Christian philosopher, St. Augustine (354-430), said that the answer was not that God was preparing hell for people who ask such questions, but that time was a property of the world that God created and that time did not exist before the creation, which he believed had occurred not that long ago. That is one possible model, which is favored by those who maintain that the account given in Genesis is literally true even though the world contains fossil and other evidence that makes it look much older. (Were they put there to fool us?) One can also have a different model, in which time continues back 13.7 billion years to the big bang. The model that explains the most about our present observations, including the historical and geological evidence, is the best representation we have of the past. The second model can explain the fossil and radioactive records and the fact that we receive light from galaxies millions of light-years from us, and so this model—the big bang theory—is more useful than the first. Still, neither model can be said to be more real than the other.

Some people support a model in which time goes back even further than the big bang. It is not yet clear whether a model in which time continued back beyond the big bang would be better at explaining present observations because it seems the laws of the evolution of the universe may break down at the big bang. If they do, it would make no sense to create a model that encompasses time before the big bang, because what existed then would have no observable consequences for the present, and so we might as well stick with the idea that the big bang was the creation of the world.

A model is a good model if it:

1. Is elegant

2. Contains few arbitrary or adjustable elements

3. Agrees with and explains all existing observations

4. Makes detailed predictions about future observations that can disprove or falsify the model if they are not borne out.

For example, Aristotle’s theory that the world was made of four elements, earth, air, fire, and water, and that objects acted to fulfill their purpose was elegant and didn’t contain adjustable elements. But in many cases it didn’t make definite predictions, and when it did, the predictions weren’t always in agreement with observation. One of these predictions was that heavier objects should fall faster because their purpose is to fall. Nobody seemed to have thought that it was important to test this until Galileo. There is a story that he tested it by dropping weights from the Leaning Tower of Pisa. This is probably apocryphal, but we do know he rolled different weights down an inclined plane and observed that they all gathered speed at the same rate, contrary to Aristotle’s prediction.

The above criteria are obviously subjective. Elegance, for example, is not something easily measured, but it is highly prized among scientists because laws of nature are meant to economically compress a number of particular cases into one simple formula. Elegance refers to the form of a theory, but it is closely related to a lack of adjustable elements, since a theory jammed with fudge factors is not very elegant. To paraphrase Einstein, a theory should be as simple as possible, but not simpler. Ptolemy added epicycles to the circular orbits of the heavenly bodies in order that his model might accurately describe their motion. The model could have been made more accurate by adding epicycles to the epicycles, or even epicycles to those. Though added complexity could make the model more accurate, scientists view a model that is contorted to match a specific set of observations as unsatisfying, more of a catalog of data than a theory likely to embody any useful principle.

We’ll see in Chapter 5 that many people view the “standard model,” which describes the interactions of the elementary particles of nature, as inelegant. That model is far more successful than Ptolemy’s epicycles. It predicted the existence of several new particles before they were observed, and described the outcome of numerous experiments over several decades to great precision. But it contains dozens of adjustable parameters whose values must be fixed to match observations, rather than being determined by the theory itself.

As for the fourth point, scientists are always impressed when new and stunning predictions prove correct. On the other hand, when a model is found lacking, a common reaction is to say the experiment was wrong. If that doesn’t prove to be the case, people still often don’t abandon the model but instead attempt to save it through modifications. Although physicists are indeed tenacious in their attempts to rescue theories they admire, the tendency to modify a theory fades to the degree that the alterations become artificial or cumbersome, and therefore “inelegant.”

If the modifications needed to accommodate new observations become too baroque, it signals the need for a new model. One example of an old model that gave way under the weight of new observations was the idea of a static universe. In the 1920s, most physicists believed that the universe was static, or unchanging in size. Then, in 1929, Edwin Hubble published his observations showing that the universe is expanding. But Hubble did not directly observe the universe expanding. He observed the light emitted by galaxies. That light carries a characteristic signature, or spectrum, based on each galaxy’s composition, which changes by a known amount if the galaxy is moving relative to us. Therefore, by analyzing the spectra of distant galaxies, Hubble was able to determine their velocities. He had expected to find as many galaxies moving away from us as moving toward us. Instead he found that nearly all galaxies were moving away from us, and the farther away they were, the faster they were moving. Hubble concluded that the universe is expanding, but others, trying to hold on to the earlier model, attempted to explain his observations within the context of the static universe. For example, Caltech physicist Fritz Zwicky suggested that for some yet unknown reason light might slowly lose energy as it travels great distances. This decrease in energy would correspond to a change in the light’s spectrum, which Zwicky suggested could mimic Hubble’s observations. For decades after Hubble, many scientists continued to hold on to the steady-state theory. But the most natural model was Hubble’s, that of an expanding universe, and it has come to be the accepted one.
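The spectral-shift reasoning described above can be made concrete with a small calculation. For speeds well below the speed of light, the Doppler formula v ≈ c × (Δλ/λ) relates the fractional shift in a spectral line’s wavelength to the recession velocity. The sketch below uses illustrative numbers, not Hubble’s actual data:

```python
# Recession velocity of a galaxy from the Doppler shift of a spectral line.
# Illustrative values only; not Hubble's actual measurements.

C = 299_792.458  # speed of light, km/s

def recession_velocity(rest_wavelength_nm, observed_wavelength_nm):
    """Non-relativistic Doppler: v ~ c * (delta lambda / lambda)."""
    shift = observed_wavelength_nm - rest_wavelength_nm
    return C * shift / rest_wavelength_nm

# A hydrogen line emitted at 656.3 nm but observed at 662.9 nm
# implies the galaxy is receding at roughly 3,000 km/s.
v = recession_velocity(656.3, 662.9)
print(f"{v:.0f} km/s")
```

A positive result (wavelengths stretched toward the red) means the galaxy is moving away; what Hubble found was that nearly every galaxy’s spectrum was shifted this way, and by more the farther away the galaxy was.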

In our quest to find the laws that govern the universe we have formulated a number of theories or models, such as the four-element theory, the Ptolemaic model, the phlogiston theory, the big bang theory, and so on. With each theory or model, our concepts of reality and of the fundamental constituents of the universe have changed. For example, consider the theory of light. Newton thought that light was made up of little particles or corpuscles. This would explain why light travels in straight lines, and Newton also used it to explain why light is bent or refracted when it passes from one medium to another, such as from air to glass or air to water.

The corpuscle theory could not, however, be used to explain a phenomenon that Newton himself observed, which is known as Newton’s rings. Place a lens on a flat reflecting plate and illuminate it with light of a single color, such as a sodium light. Looking down from above, one will see a series of light and dark rings centered on where the lens touches the surface. This would be difficult to explain with the particle theory of light, but it can be accounted for in the wave theory.

According to the wave theory of light, the light and dark rings are caused by a phenomenon called interference. A wave, such as a water wave, consists of a series of crests and troughs. When waves collide, if those crests and troughs happen to correspond, they reinforce each other, yielding a larger wave. That is called constructive interference. In that case the waves are said to be “in phase.” At the other extreme, when the waves meet, the crests of one wave might coincide with the troughs of the other. In that case the waves cancel each other and are said to be “out of phase.” That situation is called destructive interference.

In Newton’s rings the bright rings are located at distances from the center where the separation between the lens and the reflecting plate is such that the wave reflected from the lens differs from the wave reflected from the plate by an integral (1, 2, 3,…) number of wavelengths, creating constructive interference. (A wavelength is the distance between one crest or trough of a wave and the next.) The dark rings, on the other hand, are located at distances from the center where the separation between the two reflected waves is a half-integral (½, 1½, 2½,…) number of wavelengths, causing destructive interference—the wave reflected from the lens cancels the wave reflected from the plate.
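The ring positions can be sketched from the geometry. For a lens of radius of curvature R resting on a flat plate, the air gap at distance r from the contact point is approximately t = r²/(2R), so the wave reflected from the plate travels an extra distance 2t. Applying the simplified condition above (2t equal to a whole number m of wavelengths) gives bright-ring radii r = √(mλR). This sketch ignores the phase flip on reflection, which in a fuller treatment swaps the bright and dark rings; the numbers are illustrative:

```python
import math

def bright_ring_radius(m, wavelength_m, lens_radius_m):
    """Radius of the m-th bright ring under the simplified condition
    2t = m * wavelength, with air-gap thickness t = r**2 / (2R)."""
    return math.sqrt(m * wavelength_m * lens_radius_m)

# Sodium light (~589 nm) on a lens with a 1-meter radius of curvature:
for m in range(1, 4):
    r_mm = bright_ring_radius(m, 589e-9, 1.0) * 1e3
    print(f"ring {m}: {r_mm:.2f} mm")
```

Because the radii grow as √m, the rings crowd closer together farther from the center, which is the characteristic look of the pattern Newton observed.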

In the nineteenth century, this was taken as confirming the wave theory of light and showing that the particle theory was wrong. However, early in the twentieth century Einstein showed that the photoelectric effect (now used in television and digital cameras) could be explained by a particle or quantum of light striking an atom and knocking out an electron. Thus light behaves as both particle and wave.

The concept of waves probably entered human thought because people watched the ocean, or a puddle after a pebble fell into it. In fact, if you have ever dropped two pebbles into a puddle, you have probably seen interference at work, as in the picture above. Other liquids were observed to behave in a similar fashion, except perhaps wine if you’ve had too much. The idea of particles was familiar from rocks, pebbles, and sand. But this wave/particle duality—the idea that an object could be described as either a particle or a wave—is as foreign to everyday experience as is the idea that you can drink a chunk of sandstone.

Dualities like this—situations in which two very different theories accurately describe the same phenomenon—are consistent with model-dependent realism. Each theory can describe and explain certain properties, and neither theory can be said to be better or more real than the other. Regarding the laws that govern the universe, what we can say is this: There seems to be no single mathematical model or theory that can describe every aspect of the universe. Instead, as mentioned in the opening chapter, there seems to be the network of theories called M-theory. Each theory in the M-theory network is good at describing phenomena within a certain range. Wherever their ranges overlap, the various theories in the network agree, so they can all be said to be parts of the same theory. But no single theory within the network can describe every aspect of the universe—all the forces of nature, the particles that feel those forces, and the framework of space and time in which it all plays out. Though this situation does not fulfill the traditional physicists’ dream of a single unified theory, it is acceptable within the framework of model-dependent realism.

We will discuss duality and M-theory further in Chapter 5, but before that we turn to a fundamental principle upon which our modern view of nature is based: quantum theory, and in particular, the approach to quantum theory called alternative histories. In that view, the universe does not have just a single existence or history, but rather every possible version of the universe exists simultaneously in what is called a quantum superposition. That may sound as outrageous as the theory in which the table disappears whenever we leave the room, but in this case the theory has passed every experimental test to which it has ever been subjected.