Human Universe - Brian Cox, Andrew Cohen (2014)
WHO ARE WE?
But why, some say, the moon?
Why choose this as our goal?
And they may well ask why climb the highest mountain?
Why, 35 years ago, fly the Atlantic? …
We choose to go to the moon.
President John F. Kennedy
Astronaut John Young was once asked how he would feel if his epitaph read ‘John Young: The Ultimate Explorer’. Young smiled, and in a test pilot’s drawl replied, ‘I’d feel sorry for the guy who wrote it’. Young was, and still is, a hero of mine. My first vivid memory of live space exploration was watching Space Shuttle Columbia climb on a tower of bright vapour into a blue Cape sky on 12 April 1981. It was midday in Manchester, the Easter holidays, and I was 13 years old. Because of a two-day launch delay, Columbia’s test flight took place precisely 20 years to the day after Yuri Gagarin made his black-and-white voyage into orbit on 12 April 1961, but Young and his co-pilot, Bob Crippen, in their orange spacesuits, were astronauts from the colour age, the future – as distant from the Russian hero as gleaming white-winged Columbia was from Vostok 1. Equidistant from both was Apollo, which Young flew to the Moon. Twice. It was the age of optimism, the age of wonder, the golden age when the ape went into space. When unflappable aviator Young, whose pulse rate did not increase during the launch of NASA’s only manned spacecraft ever to have flown without an unmanned test flight, piloted Columbia back for a flawless manual landing at Edwards Air Force Base two days later, he turned to Crippen and said ‘We’re not too far away – the human race isn’t – from going to the stars’.
In 2014 the stars feel further away than they did in 1981; the International Space Station is a wonderful piece of engineering that has allowed us to learn how to live and work in near-Earth orbit, but it is no closer to the stars than Columbia. Its construction is no mean achievement; one of the most important things to realise about engineering at the edge is that the only way to learn is to actually do it. You can’t think your way into space; you have to fly there. But I can’t help but feel, in the words of Billy Bragg, that the space race is over and we’ve all grown up too soon.
It was different in Gagarin’s day. Nobody is born to be a spaceman. We’re apes, honed by natural selection to operate in the Great Rift Valley. Gagarin’s father was a carpenter and his mother was a milkmaid. Both worked on a collective farm. Gagarin’s first job at the age of 16 was in a steel mill, but after showing an aptitude for flight as an air cadet he joined the military at the age of 21 and was posted to the First Chkalovsk Air Force Pilots School in Orenburg. Rising through the ranks, he made a name for himself as a skilled and intelligent aviator, and in early 1960 he was chosen along with 19 other elite pilots for the newly established space programme. Standing just 5 feet 2 inches tall, Gagarin had the right stuff and was perfect for the tiny Vostok spacecraft, whose single-seat crew compartment was only 2.3m in external diameter. After a year of training, Nikolai Kamanin, head of the cosmonaut programme, chose Gagarin ahead of his rival, Gherman Titov, just four days before the flight. The history books are filled with the names of great men and women whose presence in the collective memory of humanity was assured by the slimmest of margins. Gagarin, alongside Armstrong, will be remembered for as long as there are humans in the cosmos; the name of the equally brilliant Titov, Russia’s second cosmonaut, has faded away.
Gagarin’s flight was a true journey into the unknown. Strapped on top of the Vostok-K rocket, which flew 13 times and made it into space on 11 occasions, the 27-year-old performed like a true test pilot. Despite a two-hour delay during which every component of the spacecraft hatch was taken apart and rebuilt while Gagarin remained strapped into his seat, his heart rate was recorded at 64 beats per minute just before launch. This is not to say that Gagarin wasn’t fully aware of what he was about to do. Before boarding, Gagarin made one of the great speeches of the age.
‘Dear friends, both known and unknown to me, fellow Russians, and people of all countries and continents, in a few minutes a mighty spaceship will carry me into the far-away expanses of space. What can I say to you in these last minutes before the start? At this instant, the whole of my life seems to be condensed into one wonderful moment. Everything I have experienced and done till now has been in preparation for this moment. You must realise that it is hard to express my feeling now that the test for which we have been training long and passionately is at hand. I don’t have to tell you what I felt when it was suggested that I should make this flight, the first in history. Was it joy? No, it was something more than that. Pride? No, it was not just pride. I felt great happiness. To be the first to enter the cosmos, to engage single-handed in an unprecedented duel with nature – could anyone dream of anything greater than that? But immediately after that I thought of the tremendous responsibility I bore: to be the first to do what generations of people had dreamed of; to be the first to pave the way into space for mankind. This responsibility is not toward one person, not toward a few dozen, not toward a group. It is a responsibility toward all mankind – toward its present and its future. Am I happy as I set off on this space flight? Of course I’m happy. After all, in all times and epochs the greatest happiness for man has been to take part in new discoveries. It is a matter of minutes now before the start. I say to you, “Until we meet again”, dear friends, just as people say to each other when setting out on a long journey. I would like very much to embrace you all, people known and unknown to me, close friends and strangers alike. See you soon!’
It’s too easy to attach trite labels to human actions – magnificent, horrific and everything in between – based on a simplified view of their causes. One can argue that the rockets carried aloft the egos of the superpowers alongside the astronauts, and this is surely right. But Gagarin spoke these words, and I challenge anyone to read them and not detect sincerity. All our actions mask a morass of motivations, worthy and less so, and the greatest human adventures are no less noble for that.
At 9.07am local time, Gagarin blasted off from Baikonur Cosmodrome in Kazakhstan, as every Russian cosmonaut has done since. Within 10 minutes, he was orbiting Earth at an altitude of 380 kilometres. His route took him across the Siberian wastes and the Pacific Ocean above the Hawaiian islands, past the tip of South America and into the South Atlantic, where he was greeted by a second sunrise before a 42-second de-orbit burn over the Angolan coast dropped Vostok 1 onto a ballistic descent trajectory and an 8-g deceleration inside Earth’s thickening atmosphere. The journey once around his home world took 1 hour and 48 minutes. Gagarin ejected from the capsule 7 kilometres above the ground and, as planned, cosmonaut and spacecraft completed the final descent apart. Gliding back to Earth by parachute, Gagarin landed 280 kilometres away from the intended landing site, near the Russian city of Engels. A farmer and his daughter, the sole witnesses to his historic return, watched a figure in an orange spacesuit and white helmet walk towards them. ‘When they saw me in my space suit and the parachute dragging alongside as I walked, they started to back away in fear,’ recollected Gagarin later. ‘I told them, don’t be afraid, I am a Soviet citizen like you, who has descended from space and I must find a telephone to call Moscow!’
Primates appeared relatively recently in the history of life on Earth. Studies of mitochondrial DNA suggest the Strepsirrhini suborder, containing the ancestors of Madagascar’s lemurs, diverged from our own Haplorhini suborder approximately 64 million years ago, which implies that a common ancestor was present before this time, but not a great deal earlier. The oldest complete primate fossil found to date is that of a tree-dwelling creature known as Archicebus achilles, dated at 55 million years old. Discovered in the fossil beds of central China in 2013, this tiny creature would have been no bigger than a human hand, making it not only the oldest but also one of the smallest known primates.
Our family, known as the Hominidae, or more commonly the great apes, share a common ancestor with Old World monkeys around 25 million years ago, and during the making of Human Universe we filmed a rare species of these distant cousins in the Ethiopian Highlands. The road out of Addis towards the 3000-metre Guassa Plateau is excellent to a point, and then not excellent. The scenery, on the other hand, improves with altitude. Golden grasses illuminated by shifting lambent light through dark clouds cling to near-vertical mountainsides framing pristine villages along the high valley floors. It is fresh, cold and insect-less on the peaks above the Rift; a place to drink tea and eat shiro, a spiced Ethiopian stew of chickpeas and lentils. After a night in the cold but magnificently desolate Guassa community lodge, we set off at dawn to intercept the gelada baboons on their way back to their caves and ledges from early-morning foraging expeditions on the higher slopes.
The gelada baboons are a species of Old World monkey found only in the Ethiopian Highlands. They are the only surviving species of the genus Theropithecus that once thrived across Africa and into Southern Europe and India. The males in particular are powerful, long-haired animals, weighing over 20 kilograms with a bright red flash of skin on their white chests. I was told not to look them in the eye, so I didn’t. Fifty thousand years ago, as our planet emerged from the last ice age, the gelada retreated into the highlands above the Rift where they still live, uniquely amongst extant primates, as graminivores, on a diet made up almost entirely of the tough high-altitude grasses and occasional herbs.
EVOLUTION OF HOMINIDS
These hominid evolutionary trees trace our genetic history as humans back to the Old World monkeys that roamed Earth 25 million years ago. Discoveries of various remains, including those of the famous Australopithecus afarensis skeleton, commonly called Lucy, have helped us piece together an idea of our ancestry. It is believed that around 7 or 8 million years ago we split from the chimpanzees, and the process of evolution into bipedal Homo sapiens began as these apes started to spend more time on the ground than in the trees.
EVOLUTIONARY TREE OF MONKEYS AND PRIMATES
They approach with nonchalant agility in small groups, which reflect the most complex social structure of any non-human primate. Most of the groups I saw contained one or two males and perhaps eight or ten females and their young. These are referred to as reproductive units, and clearly defined hierarchies exist within them. Females usually remain in the same unit for life, but males move between them every four or five years. There are also male-only units of ten or fifteen individuals. These social units are arranged into higher groupings known as bands, herds and communities. The community we encountered numbered several hundred individuals who wandered past in their little tribes, females and young pausing to eat, groom and play whilst the larger males eyed us closely.
Despite the 25-million-year separation in evolutionary time, the gelada are very easy to anthropomorphise, especially from a vantage point amongst them, probably because their behaviour seems reminiscent of our own and their babies are cute. Like us, they spend most of their time on the ground and operate in social groups. Some researchers familiar with the geladas claim they exhibit the most sophisticated communication behaviour of any non-human primate, employing gestures and a range of different vocalisations strung together into sequences communicating reassurance, appeasement, solicitation, aggression and defence. For all their sophistication, however, the gelada are a long way from possessing anything more complex than the simplest of human characteristics and abilities. This is, of course, an utterly obvious observation – they are monkeys! But what isn’t obvious is why. The gelada’s lineage and our own parted from a common ancestor at the same moment, and that self-evident statement leads to a deeper question: what is it that happened to our ancestors during those 25 million years that led us to the stars and left them on the hillsides of the Guassa Plateau eating grass?
LUCY IN THE SKY
I am an aviation geek. I love aircraft. As I set off to film the African scenes for ‘ApeMan SpaceMan’, I noticed that the Ethiopian Airlines Boeing 787 I boarded at London Heathrow, bound for Addis Ababa, registration ET-AOS, was named ‘Lucy’. On the morning of 24 November 1974, Donald Johanson and a team of palaeoanthropologists were searching for bone fragments at a site near the Awash River in Ethiopia. The area was known to be rich in rare hominid fossils, but on that particular morning, Johanson and his graduate student Tom Gray found little to inspire them. As is often the way in science, however, a dash of serendipity, coupled with an experienced scientist who understands how to increase the chances of receiving its benefits, made a seminal contribution to the understanding of human evolution. Johanson shouldn’t even have been there – he had planned to spend time back at the camp updating his field notes – but as they prepared to leave, Johanson decided to wander over to a previously excavated gully and have one last look. Even though they’d surveyed the area before, this time Johanson’s eye was drawn to something lying partially hidden on the slope. Closer inspection revealed it to be an arm bone, and a host of other skeletal fragments – a piece of skull, a thigh bone, vertebrae, ribs and jaw – all emerged from the ground and, crucially, they were all part of a single female skeleton. The find triggered a three-week excavation, during which every last scrap of fossil AL 288-1 was recovered. They named it Lucy, after track 3, side one, of Sgt. Pepper’s Lonely Hearts Club Band, because this was 1974 and they played it a lot on their tape recorder. ‘Home taping kills music’, they used to say back then, but it also names airliners.
Lucy lived 3.2 million years ago in the open savannah of Ethiopia’s Afar Depression. Standing just over 1 metre tall and weighing less than 30 kilograms, she would have looked more like an ape than a human. Her brain was small, about one-third of the size of a modern human’s and not much larger than a chimpanzee’s. The anatomy of her knee, the curve of her spine and the length of her leg bones suggest that Lucy regularly walked upright on two legs, although there are a handful of scientists who would disagree. What is generally agreed upon, however, is that Lucy was a member of the extinct hominin species Australopithecus afarensis, and she was either one of our direct ancestors, or very closely related to them. Her bipedalism was probably an evolutionary adaptation driven by climate change in the Rift. As the number of trees reduced and the landscape became more savannah-like, the arboreal existence of our more distant ancestors became less favoured, and the increasing distances between trees selected for Australopithecus’s upright gait, which made travel across the ground more efficient.
In ‘Who Speaks for Earth?’, the thirteenth chapter of Carl Sagan’s Cosmos, there are two pictures set side by side. One is of footprints covered by volcanic ash 3.7 million years ago near Laetoli, in Tanzania, probably made by an Australopithecus afarensis like Lucy. Some 400,000 kilometres away and 3.7 million years later, another hominin footprint was left in the dust of the Sea of Tranquility. Together, they speak eloquently of our unlikely, magnificent ascent from the Rift Valley to the stars. The remainder of this chapter deals with the 3 million years between Lucy and the Moon. The timescale is ridiculously small: less than a tenth of one per cent of the period of time during which life has existed on Earth. Lucy was little more than an upright chimpanzee; an animal, a genetic survival machine. We bring art, science, literature and meaning to the Earth; we are a world away, and yet separated by the blink of an eye. ‘Our obligation to survive is owed not just to ourselves but also to that Cosmos, ancient and vast, from which we spring,’ wrote Sagan. I’d like to add that we owe it to Lucy as well.
FROM THE NORTH STAR TO THE STARS
Before astrology was consigned to the status of trifling funfair entertainment by science, it was believed that the position of the planets against the distant stars had a profound effect on people’s daily lives. If you don’t know what the stars or planets actually are, this is at least within the bounds of reason, but as our understanding of physics improved, so it became clear that there is no way that the position of a distant planet relative to the fixed stars can have any effect on the behaviour of a human being on the surface of the Earth. The planets can and do affect the Earth’s motion through the solar system over timescales far greater than those of human lifetimes, though, and recent research suggests that long-term changes in Earth’s orientation and orbit may have played a crucial role in hominid evolution.
Polaris is a true giant, almost 50 times the diameter of our sun. It is also a Cepheid variable, one of the valuable standard candles upon which the astronomical distance scale rests. At a distance of only 434 light years, it is both the closest Cepheid and one of the brighter stars in the sky, dominating the constellation Ursa Minor. Polaris also happens to lie almost directly along the line of the Earth’s spin axis, and this special position at the north celestial pole makes it invaluable to navigators. As the Earth spins on its axis, Polaris sits serenely as all the other stars appear to rotate around it. At any point in the northern hemisphere, your latitude is the angle between Polaris and the horizon: zero degrees north at the equator, where Polaris is on the horizon, and 90 degrees north at the Pole, where Polaris is directly overhead. As viewed from Oldham, Lancashire, UK, Polaris sits at an angle of 53.54 degrees above the horizon.
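The navigator’s rule above can be sketched in a few lines. Note one assumption not stated in the text: Polaris actually sits roughly three-quarters of a degree away from the true pole, so a raw sextant reading gives latitude only to within about a degree.

```python
# Estimate northern-hemisphere latitude from the measured altitude of
# Polaris above the horizon. Polaris lies about 0.74 degrees from the
# true celestial pole (approximate value, assumed here), so the raw
# altitude reading carries that uncertainty.

POLARIS_POLE_OFFSET_DEG = 0.74  # approximate angular distance from the pole


def latitude_from_polaris(altitude_deg):
    """Return (best_estimate_deg, uncertainty_deg) for the latitude."""
    if not 0.0 <= altitude_deg <= 90.0:
        raise ValueError("altitude must be between 0 and 90 degrees")
    # To first order, latitude equals the altitude of the celestial pole.
    return altitude_deg, POLARIS_POLE_OFFSET_DEG


# From Oldham, Polaris sits 53.54 degrees above the horizon:
lat, err = latitude_from_polaris(53.54)
print(f"latitude ~ {lat:.2f} deg (+/- {err:.2f} deg)")
```

A more careful navigator would apply a correction table for where Polaris sits on its small daily circle around the pole, but the first-order rule is exactly the one the text describes.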
Christopher Columbus and Ferdinand Magellan relied on Polaris as they crossed the oceans and explored new worlds. Perhaps more surprisingly, on board Apollo 8 Jim Lovell carried a sextant as a back-up navigational device. Designed by the MIT instrument laboratory in Cambridge, Massachusetts, it may not have looked traditional but it operated in exactly the same manner as the one constructed by instrument maker John Bird in 1757. Polaris was one of Apollo’s key navigational stars. It was paired on Lovell’s charts with Gamma Cassiopeiae, which was known in Apollo jargon as ‘Navi’. The name was coined by Gus Grissom on Apollo 1 as a prank – it was his middle name ‘Ivan’ backwards. Two other navigational stars, Gamma Velorum and Iota Ursae Majoris, were named ‘Regor’ after Roger Chaffee and ‘Dnoces’ after Ed White the ‘Second’. Using the stars for navigation might seem hopelessly old-fashioned, but if you think about it for a moment, you’ll realise that there is no other way for a spacecraft in deep space to orient itself than relative to the fixed stars on the celestial sphere.
A spacecraft must regularly reorient itself relative to the stars, but on Earth things feel different because our planet’s rotation and its orbit around the Sun are relatively stable from year to year. There are wobbles on relatively short timescales associated with changes in the speed of Earth’s rotation, and these lead to the insertion of leap seconds to keep our atomic clocks synchronised with the heavens. Between 1972 and 1979, nine leap seconds had to be inserted, whilst none was needed between the beginning of 1999 and the end of 2005. Measured against the accuracy of atomic clocks, Earth’s rotation rate is noticeably erratic.
The largest short-term contribution to changes in Earth’s rotation comes from the gravitational influence of the Moon, which acts to slow the rate of spin, lengthening the day by around 2.3 milliseconds per century through friction between the tidal bulges in the oceans and the rotating solid Earth beneath, but there are also longer-term changes. The most pronounced of these is known as axial precession or, more commonly, the precession of the equinoxes. The Earth spins on its axis like a gyroscope, and because it spins, it bulges out at the equator. Because the Earth isn’t a perfect sphere, the gravitational influence of the Sun and Moon exerts a torque on the Earth that causes its spin axis to sweep around in a circle once every 26,000 years. This is not subtle, because the spin axis itself is tilted at 23 degrees relative to the plane of Earth’s orbit, and precession therefore has a large effect on the night sky, one first documented by the Greek astronomer Hipparchus around 150 BCE. Precession manifests itself as a shift in the position of the celestial pole relative to the fixed stars. There will come a time in the not too distant future when Polaris will no longer sit above the north celestial pole as our spin axis traces out a circle in the sky. In about 3000 years’ time, navigators of the future will rely on Gamma Cephei as a back-up for their GPS systems as they sail across the seas of our planet, and in 8000 years it will be the bright star Deneb. The identity of the North Star has altered many times throughout human history. As the Egyptians finished building the Great Pyramid of Giza in 2560 BCE, Alpha Draconis lay closest to the celestial pole. Two and a half thousand years later, as the Romans did things for us, Kochab, the second-brightest star in Ursa Minor, and its neighbour Pherkad were known as the ‘Guardians of the Pole’. Precession therefore affects navigation, but more importantly it also affects our climate.
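The 26,000-year circuit quoted here implies a drift rate you can check on the back of an envelope: about 50 arcseconds per year, which adds up to roughly 30 degrees of drift in the two millennia or so since Hipparchus. A quick sketch using the text’s round figures:

```python
# Drift of the celestial pole due to axial precession, taking the
# 26,000-year period quoted in the text.

PRECESSION_PERIOD_YR = 26_000

# A full 360-degree circuit in one precession period:
rate_deg_per_yr = 360.0 / PRECESSION_PERIOD_YR
rate_arcsec_per_yr = rate_deg_per_yr * 3600.0

# Time elapsed since Hipparchus documented precession (~150 BCE),
# measured to the book's publication year of 2014:
years_since_hipparchus = 2014 + 150
shift_deg = rate_deg_per_yr * years_since_hipparchus

print(f"drift rate ~ {rate_arcsec_per_yr:.1f} arcsec/yr")
print(f"pole has drifted ~ {shift_deg:.0f} degrees since Hipparchus")
```

That 30-degree accumulated shift is roughly the width of a zodiac constellation, which is why the ‘signs’ of astrology no longer line up with the constellations they were named after.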
The 23-degree tilt of Earth’s spin axis is responsible for the seasons; summer in the northern hemisphere occurs when the North Pole is tilted towards the Sun, leading to constant daylight within the Arctic Circle. Half a year later the geometry is reversed, with the South Pole receiving 24-hour daylight and the southern hemisphere experiencing summer. Precession alone would have no effect on the climate if the Earth’s orbit were a perfect circle, but it isn’t; it is elliptical, with the Sun at one focus. At the turn of the twenty-first century, it happens to be the case that the Earth is at its closest approach to the Sun (known as perihelion) in January, just after the winter solstice when the North Pole is pointing away from the Sun. This makes northern winters slightly milder than they would otherwise be, because the Earth receives a little bit more solar radiation during the northern winter. In around 10,000 years’ time, however, precession will have carried the Earth’s spin axis around by a half-turn, and it will be the North Pole that points towards the Sun at perihelion, making northern hemisphere summers slightly warmer and winters cooler. The more elliptical the Earth’s orbit, the more pronounced this effect becomes.
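The ‘little bit more solar radiation’ at perihelion is easy to quantify: sunlight falls off as the square of distance, so the perihelion/aphelion flux ratio depends only on the orbit’s eccentricity. Taking today’s value of about 0.017 (a standard figure, not quoted in the text), Earth receives roughly 7 per cent more sunlight at closest approach:

```python
# Ratio of solar flux at perihelion vs aphelion for an elliptical orbit.
# Flux scales as 1/r^2; perihelion distance is a(1-e) and aphelion is
# a(1+e), so the semi-major axis a cancels out of the ratio.

ECCENTRICITY = 0.0167  # present-day value for Earth's orbit (approximate)


def perihelion_flux_excess(e):
    """Fractional extra flux at perihelion relative to aphelion."""
    return ((1 + e) / (1 - e)) ** 2 - 1


excess = perihelion_flux_excess(ECCENTRICITY)
print(f"~{100 * excess:.1f}% more sunlight at perihelion than aphelion")
```

The same function shows why the eccentricity cycles matter: at the high end of Earth’s eccentricity range the perihelion excess grows to several times its present value, amplifying the seasonal asymmetry that precession shuffles between the hemispheres.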
This is where things get a little more complicated, but it’s the complication that matters for our story. The planets are significantly further away than the Moon, but also significantly more massive, and their constantly shifting positions induce periodic changes to our orbit over long timescales. Jupiter has the most pronounced effect due to its large mass and relative proximity. The largest of these changes occurs on a timescale of 400,000 years. Picture the Earth’s orbit becoming periodically more elliptical and more circular, stretching back and forth with a period of 400,000 years. This oscillation modulates the effect of precession on the climate; at the times when the Earth’s orbit is at its most elliptical, the changes due to precession will be at their most pronounced. This effect is known as astronomical or orbital forcing of the climate.
There are many such cycles in Earth’s orbit – another important change in the eccentricity of the ellipse occurs every 100,000 years. Furthermore, the tilt of the axis itself swings back and forth between around 22.1 and 24.5 degrees on a 41,000-year cycle. The whole solar system is like a giant bell, ringing with many hundreds of harmonics driven by the gravitational interactions between the Sun, planets and moons.
Over many thousands of years, these shifts in the Earth’s orbit and orientation relative to the Sun have led to dramatic changes in climate, and are certainly one of the key mechanisms that drive the Earth into and out of ice ages. It is perhaps obvious that these long-term shifts in climate should have had an effect on the evolution of life; ice ages present a significant challenge to animals and plants and this will provoke an evolutionary response via natural selection. More surprisingly, recent research has suggested a direct link between precession, the 400,000-year eccentricity cycle, and the evolution of early modern humans.
The Milankovitch theory describes the collective effects of changes in the Earth’s movements upon its climate. The cycles are named after Serbian geophysicist and astronomer Milutin Milankovitch, who worked on the theory during his internment as a prisoner of war in World War One. Milankovitch showed mathematically that variations in the eccentricity, axial tilt and precession of the Earth’s orbit determine climatic patterns on Earth. The Earth’s axis completes one full cycle of precession approximately every 26,000 years. At the same time, the elliptical orbit itself rotates, over a much longer timescale. The combined effect of the two precessions leads to a roughly 21,000-year cycle in the timing of the seasons relative to Earth’s closest approach to the Sun. In addition, the angle between Earth’s rotational axis and the normal to the plane of its orbit (obliquity) oscillates between 22.1 and 24.5 degrees on a 41,000-year cycle. It is currently 23.44 degrees and decreasing.
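The 21,000-year figure follows from simple arithmetic. The spin axis circles one way every ~26,000 years while the orbit’s ellipse rotates the other way over roughly 112,000 years (an approximate value for the apsidal precession period, assumed here; the text says only ‘a much longer timescale’). Because the two turn in opposite senses, their frequencies add:

```python
# Climatic (seasonal) precession period, combining axial precession of
# the spin axis with the slower rotation of the orbit's ellipse
# (apsidal precession). The two motions run in opposite senses, so the
# combined frequency is the sum of the individual frequencies.

AXIAL_PERIOD_YR = 26_000     # from the text
APSIDAL_PERIOD_YR = 112_000  # approximate value, assumed

combined_period = 1.0 / (1.0 / AXIAL_PERIOD_YR + 1.0 / APSIDAL_PERIOD_YR)
print(f"climatic precession period ~ {combined_period:,.0f} years")
```

This is the same reciprocal-addition rule that relates sidereal and synodic periods anywhere in astronomy; plugging in the two periods above lands within a few per cent of the 21,000-year cycle the text quotes.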
THE PRECESSION OF EARTH’S SPIN AXIS
The Earth wobbles like a top on its axis over a 26,000-year cycle. The tilt of the Earth’s axis changes over a 41,000-year interval. The shape of its orbit changes the Earth’s distance from the Sun over a period of 100,000 years.
CLIMATE CHANGE IN THE RIFT VALLEY AND HUMAN EVOLUTION
The Great Rift Valley: evocative words that immediately suggest origins. There are many reasons I love visiting Ethiopia. I love the people. I love the food. I love the high-altitude freshness of Addis. I love the mountains and valleys and high plains. I even loved visiting Erta Ale, the legendary shield volcano at the Afar Triple Junction known as the gateway to hell, although I probably won’t do it again. But I also love an idea. It’s impossible to visit this ancient country and not catch a glimpse in your peripheral vision of a chain of ghosts stretching back ten thousand generations, because it is firmly embedded in popular culture that we came from here. Every one of us is related to someone who lived in Ethiopia hundreds of thousands of years ago. It is the Garden of Eden, the place where humanity began. What popular culture has yet to assimilate, however, is the fortuitous and precarious nature of the ascent of man. When I was growing up I remember talk of ‘the missing link’, that elusive fossil that would tie us definitively to our ape-like ancestors. When I started school, DNA sequencing had not yet been invented, and Lucy hadn’t been unearthed. Today, we have a significantly more complete view of how Australopithecines like Lucy are related to modern humans, and whilst the details are still debated and new evidence is continually updating the standard model of hominin evolution, it is now possible to tell the broad sweep of the story in some detail.
The members of our human evolutionary family are referred to as hominins. The split between hominins and the ancestor of the chimpanzee occurred at some point before 5 million years ago in Africa, and by 4 million years ago, Australopithecus afarensis – Lucy – was present. Their brain size was approximately 500cc, around the same as a chimpanzee and less than one-third of that of a modern human. Around 1.8 million years ago, there was a step change in both brain size and the number of hominin species in the East African Rift. Several species of our genus Homo appeared, including Homo habilis and Homo erectus. They lived for a time alongside other species, including several Australopithecines and a genus known as Paranthropus. There are anthropologists who prefer to classify the Paranthropus species within the genus Australopithecus. I make this point not to be confusing, but to highlight an important fact: the study of hominin evolution is a difficult area, and it is not surprising that there are ongoing debates about the classification of 2-million-year-old fossils and DNA sequences. What is important for our story, however, and what nobody disputes, is that there seems to have been a jump in both brain size and the number of species of hominins in the Rift Valley region around 1.8 million years ago. By around 1.4 million years ago, only one of these species had survived – Homo erectus – with a brain size of 1000cc. The next milestone is the appearance of Homo heidelbergensis around 800,000 years ago. Homo heidelbergensis is generally accepted to be the ancestor of Homo sapiens and the Neanderthals, who lived alongside us in Europe until around 45,000 years ago, and possibly later. Homo heidelbergensis represented another jump in brain size, up to around 1400cc, which is close to that of modern humans.
In the late 1960s and early 1970s, two hominin skulls were found near the Omo River in Ethiopia. Known as Omo 1 and Omo 2, they have been dated, from argon dating of the volcanic sediments at the level where they were found, at 195,000 ± 5,000 years old. These are the earliest fossilised remains to have been identified as Homo sapiens.
The interesting question is what caused these rapid increases in brain size, driving hominin intelligence from the chimpanzee-like capabilities of Australopithecus to modern humans in only a few million years. Again, this is a very active area of research, and there are differences of opinion amongst experts. This is the nature of science at the frontier of knowledge, and this is what makes science exciting and successful. The model we are focusing on is the most widely accepted theory of human evolution. It is known as the recent single origin hypothesis, or more colloquially the ‘Out of Africa’ model, and the dates and locations we have described so far might be referred to as ‘textbook’. There is broad consensus, therefore, about the ‘When?’ and the ‘Where?’ But not ‘Why?’, and it is to ‘Why?’ that we now turn.
There is a trend towards larger brain size over the 4 million years since the emergence of Australopithecus, but the trend is not gradual. There is a large jump around 1.8 million years ago with the emergence of Homo erectus, and another jump just under 1 million years ago with Homo heidelbergensis. The final jump occurs when Homo sapiens emerges 200,000 years ago. The time period around 1.8 million years ago also corresponds to a leap in the number of hominin species present in the Rift Valley; there were at least five or six species living side by side, suggesting that something interesting occurred around this time which may have been responsible for, or was a contributing factor to, the observed increase in brain size, particularly in Homo erectus.
The internal volume of the primate skull increases from 275–500cc in chimpanzees to 1130–1260cc in modern humans. Neanderthals had a brain capacity in the range 1500–1800cc – the largest of any hominids. Recent research indicates that, in primates, whole brain size is a better predictor of cognitive abilities than brain size relative to body size.
The large number of deep-water lakes appearing temporarily in the Rift Valley around 1.8 million years ago indicates that at this time the climate, and in particular the level of rainfall, was varying quickly and violently. Similar climate variation occurs around 1 million years ago and 200,000 years ago, and this appears to be correlated with the increases in hominin brain size. The theory is that rapidly changing climatic conditions in the Rift Valley at these specific times played an important role in driving the increases in brain size.
The selection pressures that may have led to these increases are unclear. Selection for adaptability was probably an important factor, but social factors such as the ability to live in large groups, and intra-species competition as a result of the larger number of species living side by side, particularly around 1.8 million years ago, must also have played a role. Having said that, it does appear that climate variation in the Rift Valley 1.8 million, 1 million and 200,000 years ago could have been a contributing factor to the development of our intelligence. This is known as the Pulse Climate Variability hypothesis.
We can now bring all these threads together to reveal a surprising and, for me, dizzying hypothesis which, if correct, sheds new light on the immensely contingent nature of the existence of our modern civilisation – or, in simpler language, why we are bloody lucky to be here!
The three dates – 1.8 million, 1 million and 200,000 years ago – correspond to the times when the Earth’s orbit was at its most elliptical. As described above, the mechanism by which climate changes due to precession at these times is well understood. The Pulse Climate Variability hypothesis asserts that the unique geology and position of the Great Rift Valley amplified these changes, and that early hominins responded by increasing their brain size. If this is correct, our brains evolved in response to changes in the shape of Earth’s orbit, driven by the precise arrangement of the orbits of the other planets around the Sun, and to precession, driven primarily by the Moon’s gravitational pull on our tilted, spinning planet. Both the Moon and Earth’s axial tilt date back to a collision early in the history of the solar system, and all this is plainly blind luck. Without an inconceivably unlikely set of coincidences, and the way these conspired together to change the climate in one system of valleys in wonderful Ethiopia, we wouldn’t exist.
Homo neanderthalensis has a unique combination of features on its skull that is distinct from fossil and extant ‘anatomically modern’ humans. Modern research involving morphological evidence, direct isotopic dates and fossil mitochondrial DNA from three Neanderthals indicates that the Neanderthals were a separate evolutionary lineage for at least 500,000 years. However, it is unknown when and how Neanderthals’ unique craniofacial features emerged.
If this is correct, then what a response! I held a brain for the cameras at St Paul’s teaching hospital in Addis. It is the most complex single object in the known universe, a most intricate example of emergent complexity assembled over 4 billion years by natural selection operating within the constraints placed upon it by the laws of physics and the particular biochemistry of life on Earth.
The brain contains around 85 billion individual neurons, which is of the same order as the number of stars in an average galaxy. But that doesn’t begin to describe its complexity. Each neuron is thought to make between 10,000 and 100,000 connections to other neurons, making the brain a computer way beyond anything our current technology can simulate. When we do manage to simulate one, I have no doubt that sentience will emerge; consciousness is not magic, it is an emergent property consistent with the known laws of nature. But that doesn’t lessen the wonder one iota. Out of this evolutionary marvel, we emerge. Thoughts, feelings, hopes and dreams exist on Earth because of electrical activity inside a 1.5-kilogram blob of stuff, which hasn’t changed much since the earliest modern humans began the long journey out of Africa.
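A quick order-of-magnitude multiplication – my own back-of-the-envelope sketch, using only the figures quoted above – shows why simulation is so far beyond current technology:

```python
neurons = 85e9                     # neurons in a human brain, as quoted above
conns_low, conns_high = 1e4, 1e5   # connections per neuron, range quoted above

total_low = neurons * conns_low    # ~8.5 × 10^14 synaptic connections
total_high = neurons * conns_high  # ~8.5 × 10^15
print(f"{total_low:.1e} to {total_high:.1e} connections")
```

Somewhere in the region of a million billion connections, in other words, each one switching many times a second.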
If you could travel back in time and bring a newborn baby from 200,000 years ago into the twenty-first century, allowing it to grow up in our modern society with a modern education, it could achieve anything a modern child could. It could even become an astronaut. Which sets up one more question: if the hardware was present 200,000 years ago, then what changed to lift us from the Great Rift Valley into space?
‘AN UNPRECEDENTED DUEL WITH NATURE’
‘The best thing we can do now is just to listen and hope’, said Cliff Michelmore, broadcasting from the BBC’s studios 24 minutes from the expected splashdown of Apollo 13. On 17 April 1970, I was too young to watch the live broadcast, but I’ve seen the recording many times since. Grainy pictures from the deck of the USS Iwo Jima, its flight deck crammed with nervous sailors off the coast of Samoa; Patrick Moore and Geoffrey Pardoe grim-faced in the studios, and James Burke, famously, with fingers crossed behind his back. ‘Apollo control, Houston, we’ve just had loss of signal from Honeysuckle’. Honeysuckle Creek Tracking Station in Canberra, Australia, was the last ground station to contact Apollo 13 before it entered the Earth’s atmosphere on its way home. Signal loss during re-entry is routine high drama on all space missions; the ionisation of the atmosphere caused by the frictional heating of the spacecraft blocks radio signals, typically resulting in radio silence for four minutes. On Apollo 13, six minutes passed in silence. The brilliance of the BBC’s quartet of commentators was in the silence they allowed on the airwaves. The only sound was the static of the NASA feed – a moment of genuine tension. No need for vacuous media babble; nobody could bring themselves to speak. ‘We’ll only know whether that heat shield was damaged by that explosion three days ago when they come out of radio blackout in just over two minutes’ said Burke. Silence. As four minutes pass, Houston reports ‘10 seconds to end of radio blackout’. Silence. Houston: ‘We’ve had a report that Orion 4 aircraft has acquisition of signal.’ ‘They’re through’ says Burke. ‘Let’s not anticipate, because the parachutes may have been damaged.’ ‘Chutes should be out’, murmurs Burke; not broadcasting, just saying. ‘There they are, there they are!’ ‘They’ve made it’ remarks Moore. And then applause. ‘I make it no more than 5 seconds late!’ shouts Burke, ‘No more than 5 seconds late!’
The safe return of Apollo 13 was arguably NASA’s finest hour; 55 hours 54 minutes and 53 seconds into the mission, 320,000 kilometres from Earth, Command Module pilot Jack Swigert switched on a system of stirring fans in the hydrogen and oxygen tanks in the service module, a routine procedure. A piece of Teflon insulation on the fan wiring inside one of the oxygen tanks had been damaged, it was later discovered, by a series of unlikely events that happened on the ground during the preparation of the spacecraft for flight. The exposed wire shorted, the tank exploded, and the side of the service module was blown off, critically damaging the spacecraft’s power supply systems and venting the crew’s oxygen supply out into space.
The Command Module, the only part of the spacecraft capable of surviving a re-entry through the Earth’s atmosphere, was now running on batteries and with a rapidly diminishing oxygen supply that would not keep the astronauts alive long enough to return to Earth. The only option was to shut down the Command Module and retreat to the Lunar Module, effectively using it as a life raft. Lovell later spoke of how he didn’t regret the mission at all. He was robbed of his Moon landing, which must have been doubly frustrating given he’d already flown to the Moon on the historic Apollo 8 mission. But his reaction, revealed in interviews in later life, offers great insight into the character of a test pilot. ‘We were given the situation,’ Lovell explained, ‘to really exercise our skills, and our talents to take a situation which was almost certainly catastrophic, and come home safely. That’s why I thought that 13, of all the flights – including [Apollo] 11 – that 13 exemplified a real test pilot’s flight.’ Both Lovell and Haise have said that the idea of not returning safely to Earth never really came up. ‘There was nothing there that said irrefutably we don’t have a chance.’
Haise was correct, of course, because they did return safely. But they only had enough food and water to sustain two people for a day and a half and had to improvise a carbon dioxide filter to provide them with enough breathable air for the return journey. Locked in the Lunar Module with limited supplies of food and water and temperatures dropping towards freezing, life was far from comfortable. With the Command Module powered down to preserve the sparse battery supplies left after the loss of the fuel cells, the crew had to survive in a hostile environment with limited resources. Like so many outposts of human civilisation throughout history, shortage of water was a primary concern. Water was critical on the Lunar Module for two reasons; as well as being needed to keep the crew hydrated and to rehydrate the food, it also cooled the electrical systems on the spacecraft. Conserving water therefore became a critical part of the plan to return to Earth. Reducing their intake to just one-fifth of a normal human water ration, each of the crew suffered severe dehydration and together they lost 31.5 pounds in weight – nearly fifty per cent more than any other Apollo crew.
Despite the discomfort, setting a new mission trajectory and navigating along it remained the primary challenge. The standard way to make in-flight course corrections on Apollo was to use the Service Module’s main engine, but that engine sat close to the site of the damage, and mission controllers decided that firing it was too great a risk. Instead, the decision was made to use the LM’s descent engine to send them around the far side of the Moon and back to Earth in four and a half days. This is known as a free-return trajectory – a slingshot around the Moon at the correct angle to return directly to Earth. No one knew whether an engine designed for a completely different purpose would perform this function successfully – but they knew that if it failed they would not return.
Five hours after the initial explosion, the LM engine was fired for a 35-second burn, successfully putting the crew onto a free-return trajectory. This solved one problem but raised another. Calculations of the trajectory estimated a return to Earth 153 hours after launch, which would push the key reserves on the craft too low for comfort, so it was decided to speed up the spacecraft with another burn, cutting the total time of the voyage by ten crucial hours. Such were the slim margins on Apollo 13. The main navigation system in the Command Module was out of action, so Lovell had to calculate the correct navigational inputs, while back at base, mission control worked through the same calculations as a cross-check. Lovell also got to use his sextant, which he had played with on Apollo 8, to navigate by the stars for real.
The calculations are preserved as handwritten notes in the Lunar Module Systems Activation Checklist – the checklist Lovell and Haise would have used to fly down to the Moon’s surface. Now useless for its original purpose, it served as waste paper on which Lovell wrote down the instructions to put the ship on course for Earth. Two hours after they rounded the far side of the Moon, the LM engine fired, following Lovell’s handwritten checklist, increasing the speed of the spacecraft by 860 feet per second and buying them ten precious hours.
The most dramatic rescue in the history of human spaceflight stands as a testament to the brilliance of the three test pilots Lovell, Haise and Swigert, and also to the brilliance of the engineers on the ground who simply knew their stuff. NASA’s Apollo engineers were young by today’s standards; the average age of the team in mission control for the splashdown of Apollo 11 was 28 years old. This is one of the reasons why the United States reaped such a colossal economic reward from its investment in Apollo. The generation of scientists and engineers who worked on and were inspired by Apollo went out into the wider economy and delivered a huge investment return; a series of studies, including one by Chase Econometrics, showed that for every dollar invested in Apollo, at least $6 or $7 was returned as increased GDP growth. This should, of course, be bloody obvious – new knowledge grows GDP – but every generation of politicians seems to require re-educating to understand the difference between spending and investment. And while I’m polemicising, let me say that the usual political argument – that public support is needed for such large investments – is drivel. Firstly, the investment in NASA wasn’t that large, never exceeding 4.5 per cent of the Federal budget throughout the lifetime of Apollo. And secondly, it is a politician’s job to lead from the front. Make the case that investment in knowledge, in pushing the boundaries of human capabilities and exploring all frontiers, both physical and intellectual, is the key to the future wealth, prosperity and security of civilisation. Aspire to be Kennedy, not a hand-wringing apologist for intellectual and technological decline.
The nine Apollo flights to the Moon remain the furthest modern humans have explored beyond the Rift Valley in our 200,000-year history. Homo sapiens first left Africa in large numbers 60,000 years ago, so on geological timescales we didn’t hang around. Our ancestors followed waves of earlier hominins. Homo erectus was in South East Asia 1.6 million years ago, and half a million years later Neanderthals had colonised Europe and Homo floresiensis was in Southern Asia. The details of the migration 60,000 years ago are particularly well understood as a result of the combination of genetic, archaeological and linguistic studies. The precision comes in part from tracking sequences of mitochondrial DNA, which is passed down from the mother and not shuffled by sex. This makes it relatively stable and easy to track – changes are caused by mutations alone. The most widely accepted interpretation of the data suggests that a small population of between 1000 and 2500 individuals left East Africa 60,000 years ago and moved north across the Red Sea and through Arabia. The group then split, moving into Southern Europe 43,000 years ago, and travelling through India and into Australia on roughly the same timescale. The crossing into North America, via eastern Russia, was probably later, around 15,000 years ago.
OUT OF AFRICA
Evidence from fossils, ancient artefacts and genetic analyses combine to tell a compelling story of the migration of anatomically modern humans. Two possible routes have been identified for the human exodus out of Africa. A northern route would have taken our ancestors from their base in eastern sub-Saharan Africa across the Sahara desert, then through Sinai and into the Levant. An alternative southern route may have charted a path from Djibouti or Eritrea in the Horn of Africa across the Bab el-Mandeb strait and into Yemen and around the Arabian Peninsula.
These early groups of humans were hunter-gatherers. It has been estimated that the basic social units would have reached a maximum of around 150 individuals. This is known as Dunbar’s number, after the British anthropologist Robin Dunbar, who suggests that the largest social group amongst any given population of primates is related to the size of their brains (specifically the neocortex). Dunbar’s number can be observed today in the size of the average person’s social network, both in the real world and online; our hardware – the brain – has not changed appreciably since the first humans appeared in Africa 200,000 years ago. These social groups would have lived in loosely bound tribes, perhaps reaching a size of between one and two thousand individuals, operating within a range of around 100 kilometres. Populations would stabilise, perhaps in response to social factors, but also as a result of increased mortality rates caused by parasitic diseases and diminishing per-capita resource availability, before fragmenting and spreading. In this fashion, the rate of progression of our ancestors across the globe has been estimated to have been around 0.5 kilometres per year, or 15 kilometres per generation. Population density did not rise significantly beyond these levels until these proto-societies shifted from a hunter-gatherer lifestyle to agriculture around 12,000 years ago. This shift was the trigger for the development of civilisation: the most important single step, following the migration out of Africa, in the journey from apeman to spaceman.
FARMING: THE BEDROCK OF CIVILISATION
There are many competing theories as to the reason for the domestication of crops, but many note the correlation between the first evidence of agriculture and the beginning of the current inter-glacial period known as the Holocene, 12,000 years ago. In the Fertile Crescent around modern-day Jordan and Syria, people known as the Natufians were beginning to settle into larger communities, perhaps because of the relatively benign climate. The area would have been forested and rich in wild cereals, fruits and nuts, rather than the austere desert of today. One theory is that a brief 1000-year cold period known as the Younger Dryas, beginning around 10,800 BCE, triggered drier conditions in the region, forcing the Natufians to begin cultivating the previously abundant wild crops on which they had come to rely. Whatever the reason, it is generally agreed that the foundational crops of modern agriculture, including wheat, barley, peas and lentils, were all to be found in the Fertile Crescent by 9000 BCE, and by 8000 BCE the banks of the Nile were being cultivated.
At approximately the same time, evidence of farming can be found in Asia’s Indus Valley, in China and in Mesoamerica. This suggests that there was no single environmental or developmental cause for agriculture, because it appeared independently at many sites across the world. Rather, our large brains and relatively large social groups were ready to take up the challenge when the need arose.
Once agriculture was established, larger numbers of people could live together, taking advantage of the more stable food supply. The freedom from continual hunting and gathering would have introduced a new aspect to human life – free time – and it was used to great effect. Some of the earliest farmers settled in a place known as Beidha in modern-day Jordan around 7000 BCE. Living in round, stone-built houses, they grew barley and wheat and kept domesticated goats, engaged in ritual and ceremony and buried their dead. Importantly, each of these activities was carried out in specific areas of the settlements: the beginning of ‘town planning’. By the second century BCE a Semitic people known as the Nabataeans lived around Beidha. They employed new technologies to increase the reliability of farming and constructed walled agricultural terraces on the hillsides around the village to collect and store water. Animal husbandry was also expanding, with the domestication of cows, pigs, donkeys and horses. Even previously dangerous animals were coerced into living with humans; there is evidence that the Nabataeans kept dogs. As the great empires of Egypt, Greece and Rome prospered, the Nabataeans remained partially nomadic, driving their camel trains across the desert along the long-established trade routes between North Africa and India and the great cities of the Mediterranean. But then, around 150 BCE, they decided to try something different. A few kilometres south of Beidha, in a narrow gorge naturally formed in the soft sandstone rock, they built the city of Petra.
Today tourists stream through a magnificent passageway lined with buildings carved out of the desert rocks and known as the Siq, but 2000 years ago the great and good of Mesopotamia, Rome and Egypt would have walked this route into this jewel of late antiquity.
The grandeur of the buildings is still overwhelming; they stand not as great architecture for their time, but as simply great, with no caveat. The most famous is called Al Khazneh, which means ‘the Treasury’, because of the carved urn above the entrance which, Bedouin legend has it, contains the treasure of a Pharaoh. Monumental architecture is a common feature in the rise of human civilisation. It is a statement of power and grandeur to impress and cow outsiders, but it also serves an internal purpose, cementing the position of the rulers in the hierarchy and therefore providing the stability and security on which civilisation rests. Over time, a virtuous circle emerges: the buildings help the civilisation prosper, and the more prosperous the civilisation, the more impressive the buildings become.
Petra’s wealth was derived from its location. Built within a natural gorge, the area is prone to flash floods, which provided precious water in a landscape that was arid by the time the Nabataeans began to build. The city also sits at the fulcrum of the ancient nomadic trade routes along which wood, spices, incense and dyes were transported from Africa and India and into the great Mediterranean civilisation beyond. The appetite of the Greeks and Romans for exotic goods was insatiable; black pepper alone fetched 40 times its own weight in gold in a Roman market. Petra, because of its strategic location, controlled all that trade and taxed it. Today, 1500 years after the city was abandoned, it is still a magnificent site – an overused but entirely accurate statement. Talk to an archaeologist, however, and you quickly realise how much more impressive it would have been in its hey-day. The hillsides running down the valley from the carved tombs are scattered with rocks, but closer inspection reveals them to be bricks, the remains of houses, temples and palaces. Everything from Al Khazneh to the houses would have been covered in white plaster and painted in bright colours which would have appeared resplendent against the monochrome desert sands.
To build on this scale required a huge labour force; Petra was home to at least thirty thousand people living in a few square kilometres of desert. Such a population density required technological innovation on a metropolitan scale, and the Nabataeans, perhaps more than any other civilisation in antiquity, were masters of fluid engineering. Virtually every drop of rainwater that fell on the surrounding hillsides was captured in grooves and stored in giant reservoirs and cisterns. They were better at plumbing than the Romans, who employed the Petran engineers in Rome. Petra had the world’s first pressurised water system, which could deliver 12 million gallons of water a day into the city.
Outside the city, the irrigation system continued out into the surrounding fields, lining the hillsides in still-visible terraces; the Nabataeans didn’t simply build a city, they terra-formed a landscape. I stood and imagined the ancient valley views with some awe; the mountain slopes would have been green with wheat, barley, pulses and vineyards – a desert turned green, feeding this grandest of desert civilisations for six centuries. Whenever I see the ruins of Petra, Rome, Athens or Cairo, I wonder what Earth would be like today if the great civilisations of antiquity had not fallen. I blame the philosophers for not discovering the scientific method earlier and inventing the electric motor. How hard can it be?
Agriculture, then, was fundamentally important to the rise of civilisation because it enabled large numbers of people to live in one place, and gave them access to resources and time, which would have been unavailable to hunter-gatherers. With resources and time comes the division of labour, freeing up a small but important subset of individuals to engage in pursuits other than those necessary for immediate survival. Farmers, stonemasons, priests, soldiers, administrators and artisans emerge, together with a ruling class who begin to direct the construction of monumental architecture, partly for their own selfish ends. And cities like Petra become possible.
Petra was a relative latecomer in the emergence of the cities and civilisations of antiquity. The first great ancient civilisation, the Old Kingdom of Egypt, arose around 2600 BCE along the fertile and farmed banks of the Nile, and precisely the same pattern of agriculture, followed by social stratification, ritual and monumental architecture, can be seen. Present also in the Old Kingdom, and possibly developed there, was the one final vitally important innovation we will soon discuss: the written word.
THE KAZAK ADVENTURE: PART 1
It all seemed so simple when written down on a piece of paper. The BBC prepares something known as a call sheet, which tells a film crew everything they need to know about a trip. Call sheets are very neat; all the timings work beautifully, carefully documenting flights, ground transfers to locations and filming and rest periods, all of course in accordance with health and safety regulations and all that. Things never quite work out the way they’re envisaged back in the office, of course, but filming the return of the Expedition 38 crew from the International Space Station to the Kazak Steppe in March 2014 was the wildest adventure I’ve experienced.
The call sheet said that we would fly into Astana on 8 March, arriving at 1am on the 9th into our hotel. After a leisurely breakfast at 9am, we’d drive to a city called Karaganda, which has a spectacular statue of Yuri Gagarin in the town square. There, we’d meet up with our drivers who, embedded with Roscosmos, the Russian space agency, would drive us out to the landing site the following morning, arriving in time for a ‘hot meal’ and a good rest on the Steppe, ready to film the landing on the morning of the 11th after, of course, a ‘hot breakfast’. We’d then drive back to the airport, hop on a flight, and be home in time for lunch on the 12th. A doddle. Bollocks.
The Steppe of central Kazakhstan in March is a featureless frozen wilderness covering around 800,000 square kilometres of the country’s interior. There are no towns and few roads; just tufts of stunted brown grass and snow fading into an ice-grey leaden sky. In March 2014 temperatures were unseasonably cold, falling below -20°C at night, and it was snowing. Our team had standard 4×4 vehicles, which got stuck in the snow by mid-afternoon the day before the landing, even though we’d set off three hours earlier than the 6am officially sanctioned health and safety call time because of the weather. This was problematic, because our ‘ApeMan SpaceMan’ film was constructed carefully around this moment – the return of three human beings from space. Over vodka, cold meat and bread, we discussed our options.
We’d been helped along the snowy roads by a Russian team from the Siberian city of Tobolsk in two spectacular 6-wheel-drive vehicles, hand-built by a company called Petrovich. Tobolsk is best known for being the place dissidents were sent during the Soviet era. Tsar Nicholas II and his family enjoyed the Tobolskian hospitality for a year before being transported to Ekaterinburg to be shot. Mendeleev, the inventor of the Periodic Table, was born there, but so was Rasputin. It’s a tough place, and they know how to build tough vehicles. Our guide from Roscosmos managed to radio the Petrovich team, and they agreed that if we could catch them up in the frozen wilderness, they could take two of us out to the landing site. The cameraman and I jumped aboard a pair of snowmobiles, and headed out into the rapidly dimming late-afternoon twilight in search of the men from Siberia. If we hadn’t found them, then presumably you wouldn’t be reading this, but we did.
It was a difficult decision to jump onto the snowmobiles. We didn’t have a satellite phone because they are illegal in Kazakhstan, and nobody spoke English so we couldn’t quite assess the level of difficulty associated with finding these two Siberian needles in a Kazak Steppe. And we didn’t know who the Siberians actually were. It seemed that they were freelancers, hired to take photographs and broadcast live television pictures back from the landing site for the Russian space agency. We also had to decide whether we could make the film with only two people. Much as I spend a lot of time dreaming about jettisoning directors, producers and executives, there is a reason why we usually take a crew of six. Sound is particularly important; you don’t really miss the soundman until he’s not there (our soundman on the series is called Andy, but we always called him soundman – there are too many other things to remember).
As it turned out, the Petrovich crew were a hospitable and professional bunch, although their willingness to spend many days out in the wilderness waiting for the Soyuz – they’d driven down from Siberia and were in no hurry to get home – played on our minds. Approaching midnight on the night before the landing, we received a message from Roscosmos that the landing might be postponed due to the poor weather, and the decision was made to camp out on the Steppe and wait. In the distance, we could make out a small group of farm buildings through the snow, and we headed towards them. In broken English, one of the crew told us that it is a Kazak tradition to welcome travellers into your home, at any time of the day or night, and offer them food. And so we found ourselves inside a farm house that appeared to have heated walls and resembled the inside of an oven, eating a feast of jam, bread, assorted sweets and horse, all washed down with vodka, which the Petrovich crew carried in large crates alongside their satellite broadcasting hardware. It was unforgettable. Human Universe was filmed as a love letter to the human race, and time and again when I’ve found myself immersed unexpectedly in a culture, I’ve been reminded about why it is appropriate to want to write one.
At 4am, soaked in vodka, the call came through. Commander Oleg Kotov, Sergey Ryazansky and Mike Hopkins had climbed aboard the Soyuz and were preparing to depart the International Space Station. I was elated, because I genuinely thought the landing would be called off, and I had no idea what that would have meant, other than waiting for the storms to clear on the Steppe.
At 6.02am Kazak time, the Soyuz TMA-10M, the 199th Soyuz to fly since 1967, undocked from the ISS. This is the point of no return, except in an emergency. Just 2 hours and 28 minutes later, it fired its engine for a pre-programmed burn of 4 minutes and 44 seconds. This reduced the spacecraft’s velocity by 128m/s relative to the Station, which in its orbit that day was travelling at 7358m/s. That number is not arbitrary. It is given by a simple equation which can be derived easily from Newton’s Law of Gravitation and his Second Law of Motion, F=ma. We leave it as an exercise for the reader to show that these two laws of nature can be rearranged to show that the velocity v of any object in a circular orbit a distance r from the centre of the Earth, mass Me, is given by

v = √(GMe/r)

where G is Newton’s gravitational constant.
To derive this result, you need to know that the force required to maintain an object of mass m in a circular orbit is mv²/r.
The Space Station orbits at an altitude of between 330 and 445 kilometres – let’s choose the middle ground of 387 kilometres – this is a back-of-the-envelope calculation. ‘Estimate is the name of the game’, as my old physics teacher used to say at school. The radius of the Earth is 6,378 kilometres, and the mass of the Earth is 5.97219 × 10²⁴ kg. Newton’s gravitational constant is 6.67384 × 10⁻¹¹ m³ kg⁻¹ s⁻². Do the calculation yourself; maths is good for you. With these numbers, v is approximately 7675m/s, which is close enough – the difference is due to the precise altitude of the ISS that day. I love doing little calculations like this. They reveal the immense power of mathematical physics; this really is the orbital velocity of the International Space Station, and it is forced to be so by laws of nature first published by Isaac Newton in 1687. If you’ve never done a calculation like this before, you should feel elated. The biologist Edward O. Wilson called this feeling the Ionian Enchantment, a poetic term he introduced to describe the realisation, credited to Thales of Miletus in around 600 BCE, that the natural world is orderly and simple, and can be described with great economy by a small set of laws. It is nothing short of wonderful that we can calculate the orbital velocity of the International Space Station together in a few lines of a popular book, and this points us neatly towards the story of the last great innovation in the ascent from apeman to spaceman: the written word.
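If you would rather let a computer do the arithmetic, the same back-of-the-envelope calculation takes a few lines of Python (a sketch using the constants quoted above; the 387-kilometre altitude is the assumed mid-range value, so expect a small discrepancy with the ISS’s actual speed on any given day):

```python
import math

G = 6.67384e-11       # Newton's gravitational constant, m^3 kg^-1 s^-2
M_earth = 5.97219e24  # mass of the Earth, kg
r_earth = 6378e3      # radius of the Earth, m
altitude = 387e3      # assumed mid-range altitude of the ISS, m

# Circular orbital velocity: v = sqrt(G * Me / r)
r = r_earth + altitude
v = math.sqrt(G * M_earth / r)
print(f"orbital velocity ≈ {v:.0f} m/s")  # agrees with the ~7675m/s quoted above
```

Try varying the altitude between 330 and 445 kilometres and watch the velocity shift by a few tens of metres per second; that is the whole discrepancy with the 7358m/s figure from the day of the landing.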
INTERMISSION: BEYOND MEMORY
I began my degree at the University of Manchester in 1992, which is when I started doing physics full time. I gained my PhD in 1998, and spent the next eleven years working as a particle physicist at the DESY laboratory in Hamburg, Fermilab in Chicago and CERN in Geneva. In 2009 I began filming Wonders of the Solar System, which slowed down my research a bit. But I’ve been at it now for 22 years, which is almost half my life. In that time, I’ve learnt a lot about how to be a scientist, how to think about scientific problems, how to make measurements of nature, particularly the behaviour of subatomic particles, and how to interpret those measurements to generate new knowledge and make new discoveries. But given all that, there is no way that I would be able to calculate the orbital velocity of the International Space Station from scratch. Given Newton’s laws, it’s trivial. Without them, it would be virtually impossible. Newton’s laws are far from obvious; they took Newton a scientific lifetime to produce, and he was a genius – one of the greatest scientific minds of all time. And even he didn’t start from scratch. He relied heavily on the previous works of Galileo, Euclid and a hundred other philosophers, geometers and mathematicians whose names have been forgotten but whose works remain as cornerstones of our scientific culture. The reason we could run through that simple calculation together is that the thoughts and discoveries of these generations of philosophers, scientists and mathematicians were not lost; they were preserved forever in the written word.
Writing appears to have arisen independently in several different cultures, just as with the development of agriculture, and just as agriculture triggered the birth of civilisation 12,000 years ago, so the emergence of writing supported a rapid increase in the complexity of civilisation. The earliest known system of writing is generally accepted to be cuneiform, the Sumerian system that emerged around 5000 years ago in the cities of Mesopotamia, although it is possible that Egyptian hieroglyphs may predate it. Literally meaning ‘wedge-shaped’, cuneiform comprises a thousand or more symbols created using a stylus made from reed that was pressed into a soft clay tablet. Following cuneiform and hieroglyphs, other forms of script emerged in Greece, China, India and, later, Central America.
Writing seems not to have arisen out of a deep human need to share and record intimate thoughts and lay down knowledge for future generations; that would be far too romantic. Rather, it appears to have served a more practical purpose, revealed in a set of around 150 Nabataean scrolls discovered by archaeologists in 1993. The scrolls date from around 550 CE, in the final period before Petra was abandoned. One of the most intact documents relates to a court case between two priests. It is alleged that one of the priests decided to run away from their shared house, taking a key to one of the upstairs rooms, two wooden beams that presumably held the roof up, six birds and a table. This is probably how writing began; the invention upon which modern human history rests arose, disappointingly, for admin purposes. This is seen not only in the relatively late Nabataean scrolls, but in many of the early texts. Cuneiform developed because of a need to keep track of trade and accounts in the increasingly complex economy of Mesopotamia. Egyptian hieroglyphs may be an exception, as there is a strong ritual component, but there is also evidence of their early use in commerce, administration, trade and law – the foundations of a modern society. Information about the natural world was also recorded; in hieroglyphs we see the cycle of the seasons chronicled, as well as important environmental events. There are also some beautiful early examples of the use of writing to express deeper human desires and feelings that resonate strongly today and show, yet again, that our ancestors had inner lives not too distant from our own. But the oldest surviving papyrus documents from the Old Kingdom are marvellously prosaic. From Dynasty 5, in the reign of Pharaoh Djedkare-Izezi between 2437 and 2393 BCE, can be found an early version of the parrot sketch.
‘As Re, Hathor and all the gods desire that King Izezi should live forever, and ever, I am lodging a complaint through the commissioners concerning a case of collecting a transport-fare.’
And so the letters continue; whilst the tombs are covered in the names of pharaohs and stories of the gods, the people of Egypt were using writing as we do today, and I find it wonderful, reassuring and moving in a funny sort of way to hear ancient voices complaining down the years. Perhaps we humans really will never change. From Dynasty 20, a millennium later during the reigns of Ramesses III and IV between 1182 and 1145 BCE, the complaints continue.
‘The scribe Amennakht, your husband, took a coffin from me saying, “I shall give the calf in exchange for it”, but he hasn’t given it until this day. I mentioned this to Paakhet, who replied “Give me a bed in addition to it, and I will bring you the calf when it is mature”. And I gave him the bed. Neither the coffin nor the bed is yet here to this day. If you are going to give the ox, send it on; but if there is no ox, return the bed and the coffin.’
Alongside the letters, the ritual, the complaints, the admin and the legal documents, there was also a sophisticated literary and storytelling tradition in ancient Egypt, and a powerful appreciation of the value of the written word. Three thousand years ago on the banks of the Nile, during the reign of Queen Twosret, someone wrote a eulogy for the writers:
These sage scribes …
Their names endure for eternity,
Although they are gone, although they
have completed their lifetimes, and all
their people are forgotten.
They did not make for themselves
pyramids of bronze with stelae of iron
They made heirs for themselves
as the writings and Teachings that
they begat … Departing life has made
their names forgotten; Writings alone
make them remembered.
Taken from The Tale of Sinuhe and other Egyptian Poems 1940–1640 BC, Oxford World’s Classics
Writing was the final pivotal moment in our ascent from early agrarian civilisations to the International Space Station, because it frees the acquisition of knowledge from the limits of human memory. The hardware restrictions set down in the Rift Valley 200,000 years ago no longer matter. Writing allows a practically unlimited amount of information to be passed from generation to generation, and to be shared across the world. Knowledge is no longer lost but is always added to; it becomes widespread, accessible and permanent. A little boy from Oldham, Lancashire, can inhabit the mind of Newton, assimilate his lifetime’s work and derive new knowledge from it. Writing created a cultural ratchet, an exponentiation of the known that allowed humanity to innovate and invent way beyond the constraints of a single human brain. We now work together as a single mind spread across the planet and with a memory as long as history. It is this collective effort, enabled by the written word, that carried us, the human race, paragon of animals, from the Great Rift Valley to the stars. I deliberately borrow from Shakespeare; the most precious objects on Earth are not gems or jewels, but ink marks on paper. No single human brain could conceive of Hamlet, Principia Mathematica or Codex Leicester; they were created by and belong to the entire human race, and the library of wonders continues to grow.
THE KAZAKH ADVENTURE: PART 2
The drive from the farm to the Soyuz landing site was agricultural. The Petrovich vehicles work as a pair, dragging each other out of snowdrifts when they get stuck. I wondered through the mildly paranoid haze that descends after 48 hours of wakefulness and 48 shots of vodka (which is not optional if Russian sensibilities are to be respected) what would happen to us if both vehicles got stuck. By dawn, we arrived at the GPS coordinates given to us by Roscosmos, and waited. We knew the precise timings for re-entry, because those are given by physics alone once the de-orbit burn of 4 minutes and 44 seconds occurs. Recall that the Soyuz, along with the Space Station, was in a circular orbit travelling at 7358m/s, and the engine burn slowed it down by precisely 128m/s. This put the Soyuz into an elliptical orbit, which, when the braking effects of the atmosphere are taken into account, put the craft on a collision course with Kazakhstan. It’s quite simple, and it works. In my experience filming with Roscosmos, the words ‘it’s quite simple and it works’ sum up Russia’s successful half a century in space. They don’t do things in as shiny, hi-tech a fashion as the United States; the Soyuz has been flying astronauts into space with minimal design changes since 1967. But today, the Soyuz is the only way to get to and from the ISS, and it is a reliable system. But to my inexperienced eyes, unused to the way the Russians do things, the return of the Expedition 38 crew after six months in space felt like a traction engine rally in Yorkshire arranged by Fred Dibnah. That’s not meant as a criticism, because I’d trust Fred Dibnah to organise a traction engine rally, and I’d trust the Russians to get me back from space. But neither stands on ceremony.
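You can see why a mere 128m/s retro-burn is enough with another back-of-the-envelope calculation, this time using the vis-viva equation for an elliptical orbit. The sketch below takes the pre-burn circular speed from the orbital velocity formula at a 387-kilometre altitude (the same assumption as the earlier calculation, rather than the exact figure for that day), and shows that the post-burn ellipse has its perigee below the Earth’s surface – so even before atmospheric braking is included, the Soyuz cannot stay up:

```python
import math

# Constants as quoted earlier in the chapter
G = 6.67384e-11        # m^3 kg^-1 s^-2
M_earth = 5.97219e24   # kg
R_earth = 6378e3       # m

r = R_earth + 387e3                   # burn happens at ISS altitude
v_circ = math.sqrt(G * M_earth / r)   # circular orbital speed at r
v_after = v_circ - 128.0              # speed after the retro-burn

# Vis-viva: v^2 = GM(2/r - 1/a), so the new semi-major axis is
a = 1.0 / (2.0 / r - v_after**2 / (G * M_earth))

# The burn point becomes the apogee of the new ellipse,
# so the perigee lies on the opposite side of the orbit
r_perigee = 2.0 * a - r
perigee_altitude_km = (r_perigee - R_earth) / 1e3

print(f"Perigee altitude: {perigee_altitude_km:.0f} km")  # negative: below the surface
```

The perigee altitude comes out tens of kilometres below ground level, which is the physics behind ‘it’s quite simple, and it works’: once the burn is done, the atmosphere and Kazakhstan take care of the rest.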
At precisely 9.23am, the Soyuz emerged from the snow-filled skies above the Steppe, swinging from its parachutes, and touched down with a burst of soft landing jets. One of our Petrovich colleagues saw it with his binoculars, and we headed off towards the spaceship in the snow. In one of the most bizarre moments of my life, we arrived, and, without thinking, jumped out and stumbled through the drifts towards the spacecraft. I fumbled around with the microphone for a while (recall that the soundman didn’t make it), and then realised that there were no other vehicles around. A single helicopter had just landed; apart from that, there was only the wind driving gentle flurries across the Steppe.
Minutes later, the support vehicles arrived and Oleg Kotov, Sergey Ryazansky and Mike Hopkins were dragged from the hatch of their Soyuz, wrapped in sleeping bags and put into deckchairs. They looked happy, but knackered, and mildly discombobulated as a parade of Russian army generals in very big hats seized the opportunity for a photo. The Russians don’t overdo things; they just do them. Five times a year men and women make this voyage back to Earth having spent half a year in space, living amongst the stars on the International Space Station. Since the first expedition began on 2 November 2000, the station has been continuously occupied, and I hope that there will never again come a time when every human being is confined to Earth.
I carried in my pocket a reminder of my time in Ethiopia, the small flint we used for filming in the Rift Valley. I imagined a human, my great-great-grandfather, sitting somewhere in the vicinity of what would one day become Addis Ababa, diligently chipping away at the obsidian in my hand, the whole of history away. I set it down in the snow next to the Soyuz, descended from it as I am from him.