The Great Disruption: Why the Climate Crisis Will Bring On the End of Shopping and the Birth of a New World - Paul Gilding (2011)

Chapter 2. The Scream—We Are Their Children’s Children

To understand where we are and where we are going, we first have to understand where we’ve come from. In 2005, when I first wrote about the impending ecological system crash, I called the paper “Scream Crash Boom.”1 In summary it argued that the Scream—the call to action that had been under way since the late 1950s—was coming to an end; the Crash—of the ecosystem and economy—was beginning; and the Boom—a response of extraordinary speed and scale—was not far behind.

The reason I called the first phase the Scream was that it conveyed both the practical notion of warning—seeking to draw attention to a problem—and a healthy dose of fear—evoking the classic image of Edvard Munch’s painting. While many have accused environmentalists of “fearmongering” over the decades, when you see a threat, the right thing to do is to warn those around you. In hindsight, we now see clearly that the fears of the early environmentalists were well-founded indeed. Those who argued we would be okay were, to say the least, overly optimistic about society’s capacity to deal with the threats involved in a timely fashion.

I want to tell the story of the Scream for three reasons. First, we need to understand the full depth and complexity of the issues we are facing. As I will explain, we face a fundamental systemwide challenge that needs fixing from the ground up. This challenge goes to philosophy, science, economics, and personal values. Knowing the history can inform our knowledge of the subtleties and complexities of that challenge, so we are more likely to get the solutions right.

Second, given that most people have seriously focused on this area only recently, we should remember that many in science, business, government, and the community have been focused on it for decades. They have developed a great deal of experience and understanding of what works and what doesn’t. This knowledge can help us decide how to move forward and avoid duplicating effort.

Third, it’s a great story of enormous significance to humanity’s progress.

There are many views as to what signifies the “start” of the Scream or of the environmental movement. While I think this was around the late 1950s, there were certainly many people dedicated to the conservation of nature prior to this.

However, their views on the environment tended to position them as “conservationists,” focused on the protection of nature or wilderness as a separate place, an untouched place. They perceived that humans didn’t live in nature—it was somewhere we went on the weekend if we were lucky. In contrast, the modern environmentalists, who made up the Scream, saw nature as a system of which humans were an intimate and inseparable part.

A notable early exception to this thinking was the American writer Henry David Thoreau (1817-1862). While perhaps best remembered for his retreat to the Walden Woods, Thoreau understood the relationship between humans and nature in a profound way. Thoreau famously recognized that “in wildness is the preservation of the world.” Rather than seeing nature as something to be conserved and valued for its own sake or beauty, he recognized that human society was part of nature and dependent upon it. In Thoreau’s words, “It is in vain to dream of a wildness distant from ourselves.” He sought “to regard man as an inhabitant, or a part and parcel of Nature, rather than a member of society.”2

So while such thinking has been around for a long time, it was in the realm of philosophy rather than mainstream opinion. For me, the start of the Scream is best symbolized by the controversy over pesticide use in America in the late 1950s. While I lay on my cot in Adelaide, Australia, just ten months old, a debate erupted in the United States that would start the slow, multidecade process of reshaping popular thinking. Just weeks before Thanksgiving in 1959, the U.S. government announced it had found dangerous levels of the chemical weed killer aminotriazole in cranberries from Washington and Oregon. The timing of the result could hardly have been more dramatic. Consumers across the country stopped buying cranberries, several areas banned their sale completely, and Thanksgiving meals were largely cranberry-free. Tapping into the popular mood, the group Robert Williams & the Groovers even released a pop song entitled “Cranberry Blues,” urging listeners that “if you want to be sure not to get sick, don’t touch a cranberry with a ten-foot stick!”

This brought the issues around environmental protection into people’s living rooms and kitchens, and so began our awakening to the interconnectedness of life. We began to realize the environment was not just a wild place we visited for spiritual nourishment and recreation, but the place we lived, the source of our food and our physical health, and the foundation of our economy and prosperity.

This controversy led in part to Rachel Carson’s seminal 1962 book, Silent Spring. A serious and well-qualified scientist as well as a bestselling writer, Carson had become an active environmental campaigner in response to the excessive use of pesticides.

Her book gave birth to a way of thinking that put humans in the environment as part of a single system. She also established that scientists could be strong advocates on these issues and that their scientific knowledge gave them credibility to do so.

While many were already debating these issues, Carson’s literary skill helped to inspire many people to join the cause with her powerful metaphor of “the silent spring”:

There was once a town in the heart of America where all life seemed to live in harmony with its surroundings.… Then a strange blight crept over the area and everything began to change.… There was a strange stillness.… The few birds seen anywhere were moribund; they trembled violently and could not fly. It was a spring without voices. On the mornings that had once throbbed with the dawn chorus of scores of bird voices there was now no sound; only silence lay over the fields and woods and marsh.

While Carson’s book and writing were focused primarily on the environmental impacts of pesticide use, the reason for her historical importance was her ability to draw in the deeper implications of this behavior for human society. As she argued in Silent Spring:

The “control of nature” is a phrase conceived in arrogance, born of the Neanderthal age of biology and philosophy, when it was supposed that nature exists for the convenience of man.

The industry reaction to Carson’s Silent Spring was immediate and fierce, led by Monsanto and other chemical giants and backed up by the Department of Agriculture. When threats of lawsuits to prevent publication failed, industry resorted to a public smear campaign in an attempt to counteract Carson.3

These attacks were personal, with clear sexist overtones. Carson was labeled a “hysterical woman” rather than the calm and careful scientist she really was, and her argument was called “emotional” rather than scientific.

There are clear parallels with those who today criticize climate scientists for being “political,” for overstepping their role. In fact, what these contemporary scientists and Carson have in common is that they saw the clear message of science and felt a moral and professional obligation to use their knowledge to passionately advocate for this science to be heard and acted on.

Another parallel to today’s debate is that critics took Carson’s moderate and careful argument to the point of absurdity. An example found in a chemistry industry newsletter argued that Carson’s vision meant “the end of all human progress, reversion to a passive social state devoid of technology, scientific medicine, agriculture, sanitation, or education. It means disease, epidemics, starvation, misery, and suffering incomparable and intolerable to modern man.”4 Of course, it meant nothing of the sort. A common rejoinder to Carson’s work in 1963 and 1964 was to assert that there seemed to be plenty of birds that year, a deliberate manipulation of Carson’s metaphor of the silent spring.5 Again, the parallels to the reception of climate science are clear.

Monsanto even published in its company magazine a widely distributed article called “The Desolate Year,” which parodied Silent Spring by describing a world overrun by insects and pests in the absence of pesticides.6

The industry tactics backfired, however, and public opinion quickly swung firmly behind Carson. Their attacks served only to give more attention to Carson and her bestselling book. Caught in the public storm, President Kennedy ordered his Science Advisory Committee to investigate the claims made by Carson. Within the year, they had returned a report that substantially accepted and agreed with Carson’s findings. Shortly after this, Carson was called to testify before Congress on the issue and was well received.7

Carson continued her work, giving us an analysis that maintains relevance to this day. For example, in a CBS documentary in April 1963 she said:

We still talk in terms of conquest. We still haven’t become mature enough to think of ourselves as only a tiny part of a vast and incredible universe. Man’s attitude towards nature is today critically important simply because we have now acquired a fateful power to destroy nature. But man is part of nature and his war against nature is inevitably a war against himself.

While she died of cancer in 1964, just two years after publishing Silent Spring, Carson was subsequently widely recognized as one of the main inspirations for the modern environmental movement. Her work helped to establish the idea that we needed to control and regulate human behavior and led to crucial developments, including the 1970 establishment of the U.S. Environmental Protection Agency (EPA), which soon acted to ban the pesticide DDT and enforce other controls on the market.

Actions like this enshrined the idea that protection of the environment was an essential part of the regulatory framework within which the market had to operate. Time has further vindicated Carson’s work, and she was posthumously awarded the Presidential Medal of Freedom in 1980.

The 1960s ended with a powerful signal of the risks of inadequate regulation. On June 22, 1969, the Cuyahoga River in Cleveland, Ohio, caught fire when a potent mix of oil and chemicals that had been discharged in the river spontaneously and spectacularly burst into flame. While it wasn’t the first time this had happened, this event received widespread public attention, with Time magazine referring to it as the river where a person “does not drown but decays.”8

From 1970 on, the action started to come thick and fast. Around the world, other countries were tracking similar paths to that of the United States, with many governments acting at the national level. It was already clear to many, however, that these issues couldn’t be addressed just nationally and that a global focus would be needed.

In 1972, two important events occurred. The first was the United Nations Conference on the Human Environment, held in Stockholm. This meeting was chaired by Canadian Maurice Strong, who went on to become a powerful and positive force in corporate sustainability, particularly with the establishment of the Business Council for Sustainable Development (now known as the WBCSD).

While no decisions of great practical significance were made, the conference was a clear indicator of the rapidly increasing political importance of environmental issues in the international community. It laid the foundations for the decades to come, inspiring a series of international government-to-government meetings. These gatherings have become key milestones measuring society’s progress on sustainability, or the lack of it, with a recent example being the Climate Conference in Copenhagen.

The 1972 Stockholm Conference also established various global and regional scientific monitoring processes that helped provide the data scientists now use to measure the changing state of the global ecosystem. And in case you thought climate change was a recent issue, it was addressed at this meeting nearly forty years ago!

The second key event of 1972 was the publication of The Limits to Growth. While commissioned by the Club of Rome, an international group of intellectuals and industrialists, the report was produced by MIT experts who were focused on system dynamics—taking the behavior of systems, rather than environmental issues, as their starting point. What they modeled was the interaction between exponential growth and a world with finite resources.

What The Limits to Growth argued is now obvious to most rational people, but nearly forty years ago it completely challenged the then dominant worldview. It modeled, in twelve possible futures, the consequences of ongoing growth in population and the economy in the context of limited resources, including the limited capacity of the earth to “absorb pollution.” In doing so, it spelled out our true relationship with the world around us.

The computer model World3, at the heart of the report, recognized that human activity interacts with and affects the natural world. Not only are we completely dependent upon this natural world for our survival and prosperity, but in the language of Limits to Growth we are capable of “inducing its collapse.” The report concluded that such a physical collapse was inevitable if observed trends in humanity’s growing ecological footprint continued, and with it would come a dramatic decline in our wealth. Limits to Growth argued that while forward-looking policy could avoid humanity “overshooting” the earth’s limits, delays in political and economic decision making meant this would be challenging. Once the earth was in overshoot, the only options would be to initiate a “managed decline” of our footprint or accept the coming collapse.

The Limits to Growth report quickly gained notoriety: when it was released, attacks on the work were fast and furious and came from many quarters. Famously, Yale economist Henry C. Wallich called it “irresponsible nonsense.”9 Why such a strong response? The book was a fundamental challenge to those who believed the market was a self-correcting system that could continue to grow indefinitely. The ideas in it threatened the global assumption that the consumer capitalism model of the time would inevitably and indefinitely continue its march across the world. It was like a grenade thrown into a glasshouse.

The work was so effectively vilified that it has become accepted wisdom that the book got it wrong. In fact, the book got it close to exactly right.

The most famous and effective attacks centered on one scenario from World3 where nonrenewable resources are depleted without any societal or market response. This was a clearly unrealistic scenario, as explained in the book, but in modeling it is useful to create extreme scenarios for comparison purposes. World3 was in fact used to generate a range of scenarios, many of which—including the “business as usual” scenario—saw collapse by the middle of the twenty-first century.

Despite the lack of rigor in the attacks, they soon became accepted, and for many even today The Limits to Growth simply got it wrong and is lumped in the same category as the earlier Malthusian forecasts of a global famine. Denial is a powerful thing.

In fact, The Limits to Growth has proven to be surprisingly accurate, not just conceptually, as we’ll explore over coming chapters, but numerically as well. In 2008, Graham Turner of Australia’s national science body, the Commonwealth Scientific and Industrial Research Organization, tested the modeling in a paper entitled “A Comparison of ‘The Limits to Growth’ with Thirty Years of Reality.”10

It examined the past thirty years of actual results against the suite of scenarios in the Limits to Growth report and found that changes in industrial production, food production, and pollution up to 2000 compare well with the report’s business-as-usual scenario—called the “World3 standard run.” Interestingly, this scenario includes economic and societal collapse around the middle of the twenty-first century!

Of course, it was never the point of Limits to precisely forecast the future for one hundred years, a clearly impossible task. The objective was actually far simpler—namely, to establish the obvious and commonsense conclusion that if you insist on growing your footprint exponentially within finite limits, this will unavoidably lead to a crash, unless you decide to stop the growth before it is too late.

The fact that the book’s forecasts are broadly on track is a remarkable outcome and a testament to the authors’ technical competence and system insights.

This work clearly indicated that what we were facing was not just an energy crisis, or a population problem, or a climate crisis. Rather, it was a system design problem, with “the system” being our model of consumption-based, quantitative economic growth. This meant a system design change would be needed to solve it. The work sold many millions of copies and along with Silent Spring was one of the defining environmental treatises of all time.

The book also triggered widespread media coverage of these issues. I clearly remember as a thirteen-year-old in 1972, sitting in the morning sunshine on the back veranda of the family home in Australia and being captivated as I read a newspaper series about the future of humanity. It painted a bleak picture of global crises around shortages of resources and food and forecast a society creaking under the burdens of population growth and pollution.

I recognized this was my future and that, if it unfolded as predicted, this would be a very bleak future indeed. Little did I know how deeply these ideas had entered my young mind. This was probably the moment my life’s direction was set.

Some thirty years later I became good friends with one of the authors, Professor Jorgen Randers, when we both joined the faculty of the Cambridge Programme for Sustainability Leadership and taught together on the Prince of Wales’s Business and the Environment Program.

When I discussed with Jorgen recently why and how he became a lifelong environmentalist, he explained that he joined the team that produced World3 and wrote the Limits to Growth report while completing his PhD at MIT. He did so out of intellectual curiosity about system dynamics rather than out of any initial interest in environmental issues. It was only when their analysis showed the consequences of exponential growth that his life changed track. He then became focused on advocacy to prevent what their modeling showed was the otherwise inevitable crash of the global economy and society through pollution and resource depletion.

Nearly forty years later, Jorgen still maintains his passionate advocacy of the need for change, cheerfully lecturing around the world in his thick Norwegian accent and indulging his passion for visiting areas of great biodiversity that he believes will soon be largely gone.

Despite the lack of real action, from 1972 on the environmental movement built strongly. Greenpeace was founded, along with many other environmental organizations, and around the world people engaged in these issues at a broad and deep level. Greenpeace’s arrival was important both practically and symbolically. It symbolized the arrival into the mainstream of global nongovernmental organizations (NGOs)—nonaligned agencies that provide a global check and balance to the behavior of governments and multinational corporations. Greenpeace also provided a practical accountability and monitoring capacity with courageous and daring confrontations, bringing environmentally destructive behavior into the living rooms of ordinary people through its powerful use of the global media.

In the face of growing public concern that was mobilized largely by these groups, strong action by regulators like the various national EPAs and their equivalents during the 1970s saw significant steps taken to address city air quality, water pollution, and other such impacts. As a result, there was considerable improvement in many Western countries, and many incorrectly concluded that the problems had been addressed. Certainly it was good that rivers stopped bursting into flame, but the problems ran much deeper.

Around this time I became an activist, at the age of fifteen, focusing on issues surrounding human rights and various independence struggles, such as that in East Timor. In the mid-1970s, I became very involved in antiapartheid campaigns. I had been heavily influenced in my thinking by the massacre in Soweto, South Africa, where children even younger than me were shot and killed when protesting against not being taught in their own language. The concept of sacrificing your life for your beliefs had a deep influence on my understanding of what it meant to be an activist and how lucky I was to live in Australia.

This led to my first involvement in direct action protests, chaining myself to the gates of the South African embassy in Canberra, Australia, at the age of seventeen. I remember being a very nervous young person taking action that could lead to my being arrested. However, I was acutely aware that with the people I was supporting in South Africa being shot for their beliefs, the risks to me paled by comparison. It was an exciting time for a seventeen-year-old, being interviewed on national radio and TV about the outrageous abuses of human rights in South Africa while I stood there chained to the gates of the embassy’s main entrance. I believed I was making a difference, and it felt good.

I also remember very clearly, though rather embarrassingly, a day in 1977 when the International Whaling Commission was meeting in Canberra. On our way from an antiapartheid protest, we drove past a much larger protest against whaling by Greenpeace and others. The conversation in the car was one of moral outrage that so many people cared about whales more than they cared about people. “Why aren’t they joining our protest, which is about people being oppressed and killed?” we asked. “Who cares about whales when people are dying?”

Looking back, I can see that, like most people at that time, I failed to understand Carson’s argument about the interconnectedness of life and the arrogance of humanity. I saw people as superior and more important beings, from which whales were a separate and an unrelated distraction; I failed to see that protecting ocean life was about protecting the complex system that supported us. I didn’t yet understand that with the whales went the watchers.

I probably should have spent more time reading Henry Thoreau and less time reading Chairman Mao!

My personal head space took a profound shift in 1979 with the birth of my first child, Callan. Even though I was just twenty years old at the time, my span of interest suddenly catapulted way into the future. Many first-time parents say this happens. You realize that along with newfound responsibility is a newfound understanding of the implications of life being handed down to future generations, not just in theory but with your genes being passed on to experience whatever the future holds. Once you cross that line, the future becomes a lot more personal, and so it did for me when Callan was born.

So there was a lot going on in the 1970s. Despite these efforts, the 1980s was characterized mainly by environmental disasters, including some with global impact.

During the night of December 2-3, 1984, the American-owned Union Carbide pesticide plant in Bhopal, India, released tons of toxic gases into the local atmosphere in the world’s worst industrial disaster. Thousands were killed immediately, from the gases or in the panicked stampede to escape. Best estimates suggest that over fifteen thousand people ultimately lost their lives.11 In many ways, the disaster was emblematic of the 1980s. As developed countries raised their own standards, industry in developing countries continued to operate to lower standards—often making products for rich countries. Accidents like that in Bhopal put this issue of Western companies’ behavior in the developing world firmly on the agenda.

On April 26, 1986, the irrelevance of borders to environmental pollution was catapulted into public consciousness. At 1:23 a.m. that day, two explosions occurred at the Chernobyl nuclear plant. A power surge had ruptured the uranium fuel rods, while a steam explosion created a huge fireball, causing the reactor’s dome-shaped roof to be blown off and the contents to erupt outward. Air was sucked into the shattered reactor, igniting flammable carbon monoxide gas that caused a reactor fire that burned for nine days.

The resulting radioactive plume blanketed the nearby city of Pripyat. The cloud moved on to the north and west, contaminating land in neighboring Belarus, then drifted across Eastern Europe and over Scandinavia. While monitoring stations in Scandinavia began reporting abnormally high levels of radioactivity, there was silence from the Soviet authorities. They took three days to acknowledge there had even been an accident.

Many parts of Europe were dramatically affected by radiation poisoning drifting across the continent. Swedish food authorities recommended that hunters eat moose or fish no more than once a month owing to significant levels of radioactive contamination. Mushrooms, berries, and honey from the north of Sweden—where the weather had carried the radiation—could not be sold. In the years following, hundreds of thousands of culled reindeer were rejected in testing due to radiation contamination. Reindeer herding and the sale of reindeer meat largely sustain the indigenous Saami population of northern Scandinavia. The stories of this incident are still told and resonate in Sweden to this day. As well as locking in public skepticism of the safety of nuclear power, people had been given a palpable example of global interconnectedness.

The 1980s also brought one of the world’s most famous oil spills by the world’s least favorite oil company, when the Exxon Valdez spilled 250,000 barrels of oil into the pristine waters of Alaska in 1989. A legal battle followed to hold Exxon accountable for the damage—they had placed in charge of the tanker a known alcoholic, who was drunk and not on the bridge at the time the vessel ran aground. At the initial trial, a jury levied $5 billion in punitive damages against Exxon. With their enormous resources and so much money at stake, Exxon managed to drag the legal process on for decades, until in 2008 the Supreme Court cut punitive damages to just $507 million. That same year, Exxon reported a record profit of over $40 billion.

With the Valdez incident and the corporation’s strident opposition to action on climate change, including actively financing antiscience climate skepticism to this day, ExxonMobil has earned the well-deserved nickname of the Death Star among many environmentalists.12

There was one significant positive development in the 1980s when the world adopted a key global environmental agreement to phase out chlorofluorocarbons (CFCs), which were creating a hole in the ozone layer. This agreement in 1987, supported by the conservative governments of Margaret Thatcher and Ronald Reagan, remains the classic example of denial and delay by industry being followed by decisive global action once denial ends. UN secretary-general Kofi Annan described this agreement as “perhaps the single most successful international agreement to date,” and it remains a shining example of how action can be taken when business and governments decide to do so.

Some years later, as an adviser to the DuPont Company, I heard the inside view on this shift from the executives there. When DuPont’s own scientists came to the conclusion that CFCs were definitely the cause of ozone depletion, the company faced an ugly reality: a whole area of their business was effectively finished. DuPont, despite being accurately targeted by Greenpeace at one stage as the “World’s Biggest Polluter,” has a strong ethical culture. When their scientists confirmed the problem, DuPont agreed to close down that business and cease production, well ahead of what the agreement required. This was a tough decision, as it was not then clear to DuPont whether they could participate in the market for alternative products.

The executives I spoke to were proud of this decisive ethical action by their company. Mind you, at the time the decision wasn’t just about ethics, with DuPont correctly seeing this as a serious business issue. As DuPont’s Joseph Glas said, “When you have $3 billion of CFCs sold worldwide and 70 percent of that is about to be regulated out of existence, there is a tremendous market potential.”

The politics and divisions within the business community around such shifts in direction are often complex and fascinating. So whereas in 1980 DuPont had spearheaded the creation of the Alliance for Responsible CFC Policy, a lobby group fighting against regulation of CFCs, in 1986 with their change of heart they switched sides and lobbied the Reagan administration for action to ban them. DuPont’s efforts culminated in the Montreal Protocol, a treaty President Reagan described as “a monumental achievement.”

Some argued this was primarily about business rather than ethics. The reality is it was both. Mostafa Tolba, executive director of the UN Environment Programme, said, “The difficulties in negotiating the Montreal Protocol had nothing whatever to do with whether the environment was damaged or not. It was all about who was going to gain an edge over who; whether DuPont would have an advantage over the European companies or not.” I can well believe the negotiations at this point had become intensely commercial, with governments supporting their national companies’ positions. U.S. and European companies were racing one another to capture the market for substitutes, but the business decisions involved were complex. DuPont, for example, had to commit to around $500 million of investment, so timing and competitive position would have been critical business questions.

This offers a very good example of the messy reality of business in relation to environmental decision making. There are deeply ethical issues involved, and they have enormous commercial consequences. This reflects the reality of how markets behave. Businesses often have a genuine, principled commitment to ethical behavior, but the evidence suggests it is only when change is profitable and in line with market reward that they shift behavior on a significant scale. This complexity continues today with climate change, where we see constantly shifting positions by companies and industries as they come to accept that change is both necessary and inevitable but then seek to gain commercial advantage by either accelerating or slowing down the transition.

As the CFC debate raged in the mid- to late 1980s, it helped trigger the rise of the corporate sustainability movement. Many companies like DuPont realized that resistance to the emerging world of increased environmental concern was both futile and poor business strategy. Such companies decided to get ahead of the curve and be proactive in pursuing better practices.

The 1980s also saw the spectacular growth of environmental organizations around the world and strong campaigning against corporate pollution, with individual companies targeted rather than just a general push for regulation. This was the birth of campaigns targeting brands, with activists deliberately using a company’s focus on its brand as a point of vulnerability, as they did with Nike over sweatshops. Writer Naomi Klein noted: “Brand image, the source of so much corporate wealth, is also, it turns out, the corporate Achilles’ heel.”13 The more a company is a brand image, the more vulnerable it becomes to activist campaigns targeting that image.

This was also the era when the seriousness of fighting for environmental protection came into sharp focus, with the murder of a Greenpeace activist by a Western government. On July 10, 1985, agents from the French government’s intelligence agency, the Direction Générale de la Sécurité Extérieure, acting with the approval of French president François Mitterrand, bombed the Greenpeace vessel Rainbow Warrior in Auckland, New Zealand. The ship was about to sail for protests against nuclear weapons tests in the South Pacific. The bombing killed crew member Fernando Pereira, photographer and father of two young children.

There had previously been many cases in the developing world where environmental activists were killed by criminal elements or secret police. However, this case, where a Western democratic government murdered an activist in a friendly Western democratic country, was a stark reminder for environmentalists everywhere of what was at stake. It was also evidence of protest groups’ ability to have a significant impact on corporate and national reputation. Relatively small groups could now mobilize public opinion on a large scale with the clever use of the increasingly globalized media.

The Rainbow Warrior bombing and the broader public debate on the prospects for a nuclear war led me to reengage in activism from my then role as a serving member of the Australian military. I had joined the Royal Australian Air Force in 1983.

Prior to that, I had worked as a labor union organizer for a Communist-led trade union, the Builders Labourers Federation, in Sydney. While I felt I was making a contribution to society by protecting workers’ rights and safe working conditions, in what was at that stage a pretty shoddy industry, I soon became disaffected with the ideological obsession of the leadership and their blind support of their political beliefs. There were too many examples where the leadership was focused more on the power and influence of the union than on the interests of the workers. At one stage, I even spent several weeks on a picket line in a dispute with another union over who covered the workers on that site. So I left that role in 1981, and after a year of unemployment (it being quite hard to land a job when your last one was as a labor organizer!), I joined the military.

This was a great surprise to my friends and family, who assumed my political leanings would prevent such a life turn. For me it was a consistent move. I was pursuing a life of making a contribution to society, and I saw the Australian military as doing just that.

While in the military and now with my second child, Asher, born, I became very concerned about the threat of nuclear war. Being in the military naturally led to great interest in matters of national and global security—after all, this was the 1980s, with Ronald Reagan, Star Wars, and a massive global movement against nuclear weapons.

I particularly remember a newspaper story from a science conference at the time reporting that an alarming proportion of teenagers believed there would be literally no future for them, as nuclear war was inevitable. They therefore felt there was no point in working toward a better life. Whether or not that assumption was accurate, the fact that a generation was growing up with such a view was of great concern to me as a young parent.

I believe this period of global focus on the nuclear issue, when many came to understand that we had the capacity to destroy most of life on earth with a nuclear holocaust, was critical to later developments in society’s collective thinking. It provided a deep and direct understanding of the idea of intergenerational impact and that we humans could easily and irreversibly affect the entire planetary system. I think some people today still struggle to believe we really have the power to damage the earth’s environment as a whole. Sure, we could destroy a river here and a forest there, but the planet is so big, surely we couldn’t wreck it all?

The prospects of a nuclear winter—a sudden global cooling triggered by a massive nuclear holocaust coating the planet with fine dust particles—showed that in fact, yes, we could, and with just a few buttons and phone calls. It was a sobering time. We had learned to understand the implications of Rachel Carson’s comment that we had “now acquired a fateful power to destroy nature.”

Motivated by this threat to my children’s future, I was by 1985 still serving in the military but spending my personal time active in waterborne protests conducted by an activist group, the Sydney Peace Squadron, on Sydney Harbour against visits by nuclear-armed warships from the United States and the United Kingdom. At that time, I still enjoyed serving in the military and continue to this day to have great respect for our armed forces.

While the Australian military exists in a clear democratic framework and was surprisingly tolerant of what I did on my own time, we did in the end agree that a long-term career in the military was probably not compatible with a personal life as an antinuclear campaigner, especially since our protests were against allied countries’ ships. After some interesting (!) conversations with military intelligence, who came to check out my threat level, we amicably agreed to part company in 1986. I then committed myself full-time to my antinuclear campaigning; by that stage I was separated from my first wife and living with my two children.

Not being able to afford housing, the children and I occupied an abandoned government-owned house. It was badly dilapidated so we had to first rebuild the roof and put in doors and windows from scrap materials we collected. My income came from supportive activist friends and the government social-welfare payment for single parents. None of this posed a challenge as I was happily pursuing my life’s purpose.

The anti-nuclear weapons movement had a great influence on the environmental debate, as it helped connect the dots on many levels. For example, it exposed the many linkages between the government military and security apparatus and the civilian nuclear power industry. It was perfect fodder for conspiracy stories and for dramatizations like the BBC’s iconic TV series Edge of Darkness, helping a whole generation grow up deeply skeptical about whose interests were being served by government.

After several years as an independent activist, I joined Greenpeace in late 1989 at a time when a great wave of growth had swept the U.S. and European environmental movements. Perhaps driven by the controversy around CFCs and the ozone hole on the back of the antinuclear campaigns, environmentalism had taken off in all Western countries. Membership and influence boomed as public awareness and media coverage exploded.

Companies ducked for cover as consumers railed against irresponsible behavior. This was the time when companies like Nike suddenly and unexpectedly found themselves embroiled in controversy. Nike thought their task was to make trainers and money but now found themselves expected to deal with complex social issues around social equity, workers’ rights in developing countries, and different cultural expectations about the appropriate working age. It was becoming clear that some new competencies were going to be required to make money in the future.

Up until this time, environmental issues had been seen primarily as concerns in developed countries, where public support was high and regulation tightening. As a result, many companies had thought they could operate in developing countries where environmental standards were lax and wages cheap. But as the 1980s progressed, companies found that the globalization they liked because it lowered their costs was also creating a new interconnected world. Activists were joining together as a connected network, with cheap technology enabling anyone to send a message to corporate headquarters via the media. So suddenly behavior anywhere was public everywhere.

The best organization in the world at doing this in the late 1980s was without doubt Greenpeace. I joined them in 1989 to lead the Clean Waters Clean Seas campaign in Australia, which focused on exposing the more outrageous examples of corporate pollution. It was a classic Greenpeace pipe-plugging campaign, with our first direct action being to send divers to plug up the underwater discharge pipes that an oil refinery used to discharge toxic waste into the ocean. In Australia at the time, there was little effective regulation of industrial pollution. Our team secretly took samples from companies’ discharge points that variously went into rivers, creeks, sewers, and oceans. Almost every discharge point we tested had levels of toxic waste way in excess of the legal limits specified in the companies’ license agreements.

These were heady days for Greenpeace, with the media loving the combination of exciting and bold direct actions and the exposure of what we called “illegal toxic waste dumping” and the companies called “discharges temporarily in excess of license limits.” Our political influence skyrocketed, and our direct actions captured the public imagination, firmly positioning us as the environmental good guys against the corporate polluting bad guys.

Most of the companies involved were clueless in their response. An infamous highlight was when the corporate PR guy from BHP, Australia’s biggest company at the time, put his hand across the lens of the TV cameras to prevent them from filming and had the journalists removed from the site by the police. This of course guaranteed sympathetic media for us, with blanket coverage of our protests, including our slogan rebranding BHP as Australia’s “Big Horrible Polluter”! This incident became the classic case study at PR conferences over the next decade in how not to respond to environmental protests.

While our intentions were honorable and the company’s behavior clearly wrong, not to mention illegal, I often grimace in hindsight at the delight I took in confronting corporate leaders on national television and humiliating them with the evidence of their “corporate vandalism.” Many of these were decent people caught by surprise with rapidly changing public expectations.

While most companies’ responses were incredibly naive, one corporate CEO, Dr. Michael Deeley from the chemical giant ICI, called one day and asked if he could come and chat with me (I was by this stage CEO of Greenpeace Australia). It was a surprising move, and I immediately agreed. ICI was a key target of ours, as their Sydney chemical plant was an appalling example of poor environmental practices.

It was a fascinating meeting and started to shift my attitudes to the corporate sector and more broadly to the role of the market. It was a private meeting, and we were both candid about our situations. Deeley explained that while Greenpeace’s campaigns were an issue for him, the much larger challenge was getting his organization to change its attitude toward environmental issues and to give them more priority. He talked about the old guard’s attitude and the complexities of modernizing an old organizational culture.

He was clearly a decent man, and while it didn’t stop us from campaigning hard against ICI over the years that followed, it certainly gave me an important insight into corporate behavior. It also made me think deeply about the dangerous psychology of “demonizing the enemy” as we had been doing to great effect. I understood he was coming to see me to avoid this, in his company’s self-interest, but I started to doubt the ethics of what we were doing as well. I thought perhaps we needed to focus more on attacking the behavior and less on attacking the morality of the people behind it.

While I was still at Greenpeace Australia in 1992, one of the most important historical environmental conferences was held, the Rio Earth Summit. This conference came at a new high point in global political awareness of these issues and was attended by 108 heads of state, including George H. W. Bush. This conference started the process of global climate agreements with the adoption by consensus of a treaty agreeing to prevent dangerous climate change—the UN Framework Convention on Climate Change (UNFCCC).

I have attended many such international meetings, including the Conference of the Parties to the UNFCCC in Kyoto in 1997, which produced the Kyoto Protocol, and the Earth Summit +5 in New York, also in 1997. These events are better understood as “festivals of debate” rather than meetings, with thousands of lobby groups of all persuasions battling for media and political attention on their particular agendas.

They are also important examples of our immature global governance structures. They are generally great gatherings of the elite of environmental decision making, with business, NGOs, and government representatives getting together to lament the lack of progress—like a great collective confessional!

When I attended the Earth Summit +5 review in New York in 1997, a special UN General Assembly meeting, world leaders got up one after the other and gave speeches on how appalling it was that so little progress had been made in the five years since the 1992 Rio Earth Summit. It was a strange thing to witness, as the most powerful people in the world gathered but then behaved as if they were observers of the process and had little power to influence it. Five years later in 2002, the whole process occurred again in Johannesburg at the Earth Summit 2002.

At each of these meetings over two decades, increasingly earnest speeches have been made, I’m sure mostly genuinely felt, about the critical risks that humanity faces and the urgent need for action. From the outside, it looks as though all the important people are in the room and all the power required to change the world is there, ready and able to act. Yet on the inside, what actually happens is that pretty much no one is in charge, because as we’ll discuss later, the system has become so large that no one can be.

Over the fifty years of the Scream, we’ve learned that in reality global change is much more a bottom-up process. Our political leaders, with rare exceptions, respond at best to what they think the politics allows them to do rather than what they feel they should do. As we saw in Copenhagen, even when our political leaders are personally convinced of the need to act, the strategy of keeping one eye on what is politically acceptable at home and the other eye on protecting national economic interest merges with an immature and chaotic global decision-making process to make progress glacial.

Critically for our story, and this is the good news, while little happened over these decades at the upper ends of political power, except for a greater understanding of the challenge, enormous strides were taken in the bottom-up process. Many, many millions have joined the ranks of the passionate and committed people working as activists, scientists, entrepreneurs, policy makers, corporate sustainability champions, and ordinary citizens. They have slowly but surely changed the way we all think, so that today everyone’s an environmentalist.

When the history of environmentalism is written, 2010 will be seen as the point when pretty much everyone got on board and agreed: “Someone should do something!” Now all we have to do is work out who that’s going to be. We’ll return to this at the end of the story.

However, despite the extraordinary levels of activity, the millions of people and billions of dollars focused on the effort, nothing of any real systemwide consequence has happened in response. We have all agreed the science is clear and indicates a major problem. We have fixed this river and that town, we have saved forests here and there, we have banned numerous toxic and dangerous chemicals, and we have become highly knowledgeable, now being able to monitor the total earth system as never before.

But where have we got to in the system as a whole?

That’s the next part of the story. For fifty years we’ve been saying we have to act on these issues or our children’s children will suffer the consequences. Well, we are their children’s children. So what’s going to happen?

To quote Winston Churchill (November 12, 1936):

They go on in strange paradox, decided only to be undecided, resolved to be irresolute, adamant for drift, solid for fluidity, all-powerful to be impotent.… Owing to past neglect, in the face of the plainest warnings, we have entered upon a period of danger. The era of procrastination, of half measures, of soothing and baffling expedients, of delays, is coming to its close. In its place we are entering a period of consequences.… We cannot avoid this period, we are in it now.…