The Demon-Haunted World: Science as a Candle in the Dark - Carl Sagan, Ann Druyan (1997)

Chapter 14. Antiscience

There’s no such thing as objective truth. We make our own truth. There’s no such thing as objective reality. We make our own reality. There are spiritual, mystical, or inner ways of knowing that are superior to our ordinary ways of knowing. If an experience seems real, it is real. If an idea feels right to you, it is right. We are incapable of acquiring knowledge of the true nature of reality. Science itself is irrational or mystical. It’s just another faith or belief system or myth, with no more justification than any other. It doesn’t matter whether beliefs are true or not, as long as they’re meaningful to you.

a summary of New Age beliefs, from Theodore Schick Jr and Lewis Vaughn, How to Think About Weird Things: Critical Thinking for a New Age (Mountain View, CA: Mayfield Publishing Company, 1995)

If the established framework of science is plausibly in error (or arbitrary, or irrelevant, or unpatriotic, or impious, or mainly serving the interests of the powerful), then perhaps we can save ourselves the trouble of understanding what so many people think of as a complex, difficult, highly mathematical, and counterintuitive body of knowledge. Then all the scientists would have their comeuppance. Science envy could be transcended. Those who have pursued other paths to knowledge, those who have secretly harboured beliefs that science has scorned, could now have their place in the Sun.

The rate of change in science is responsible for some of the fire it draws. Just when we’ve finally understood something the scientists are talking about, they tell us it isn’t any longer true. And even if it is, there’s a slew of new things - things we never heard of, things difficult to believe, things with disquieting implications - that they claim to have discovered recently. Scientists can be perceived as toying with us, as wanting to overturn everything, as socially dangerous.

Edward U. Condon was a distinguished American physicist, a pioneer in quantum mechanics, a participant in the development of radar and nuclear weapons in World War II, research director of Corning Glass, director of the National Bureau of Standards, and president of the American Physical Society (as well as, late in his life, professor of physics at the University of Colorado, where he directed a controversial Air Force-funded scientific study of UFOs). He was one of the physicists whose loyalty to the United States was challenged by members of Congress - including Congressman Richard M. Nixon, who called for the revocation of his security clearance - in the late 1940s and early 1950s. The superpatriotic chairman of the House Committee on Un-American Activities (HCUA), Rep. J. Parnell Thomas, would call the physicist ‘Dr Condom’, the ‘weakest link’ in American security, and - at one point - the ‘missing link’. His view on Constitutional guarantees can be gleaned from the following response to a witness’s lawyer: ‘The rights you have are the rights given you by this Committee. We will determine what rights you have and what rights you have not got before the Committee.’

Albert Einstein publicly called on all those summoned before HCUA to refuse to cooperate. In 1948 President Harry Truman, at the Annual Meeting of the American Association for the Advancement of Science and with Condon sitting beside him, denounced Rep. Thomas and HCUA on the grounds that vital scientific research ‘may be made impossible by the creation of an atmosphere in which no man feels safe against the public airing of unfounded rumors, gossip and vilification’. He called HCUA’s activities ‘the most un-American thing we have to contend with today. It is the climate of a totalitarian country.’*

[* But Truman’s responsibility for the witch-hunt atmosphere of the late 1940s and early 1950s is considerable. His 1947 Executive Order 9835 authorized inquiries into the opinions and associates of all federal employees, without the right to confront the accuser or even, in most cases, to know what the accusation was. Those found wanting were fired. His Attorney General, Tom Clark, established a list of ‘subversive’ organizations so wide that at one time it included Consumer’s Union.]

The playwright Arthur Miller wrote The Crucible, about the Salem Witch Trials, in this period. When the drama opened in Europe, Miller was denied a passport by the State Department on the grounds that it was not in the best interests of the United States for him to travel abroad. On opening night in Brussels the play was greeted with tumultuous applause, whereupon the US Ambassador stood up and took a bow. Brought before HCUA, Miller was chastised for the suggestion that Congressional investigations might have something in common with witch trials; he replied, ‘The comparison is inevitable, sir.’ Thomas was shortly afterwards thrown in jail for fraud.

One summer in graduate school I was a student of Condon’s. I remember vividly his account of being brought up before some loyalty review board:

‘Dr Condon, it says here that you have been at the forefront of a revolutionary movement in physics called’ – and here the inquisitor read the words slowly and carefully – ‘quantum mechanics. It strikes this hearing that if you could be at the forefront of one revolutionary movement... you could be at the forefront of another.’

Condon, quick on his feet, replied that the accusation was untrue. He was not a revolutionary in physics. He raised his right hand: ‘I believe in Archimedes’ Principle, formulated in the third century BC. I believe in Kepler’s laws of planetary motion, discovered in the seventeenth century. I believe in Newton’s laws...’ And on he went, invoking the illustrious names of Bernoulli, Fourier, Ampere, Boltzmann and Maxwell. This physicist’s catechism did not gain him much. The tribunal did not appreciate humour in so serious a matter. But the most they were able to pin on Condon, as I recall, was that in high school he had a job delivering a socialist newspaper door-to-door on his bicycle.

Imagine you seriously want to understand what quantum mechanics is about. There is a mathematical underpinning that you must first acquire, mastery of each mathematical subdiscipline leading you to the threshold of the next. In turn you must learn arithmetic, Euclidean geometry, high school algebra, differential and integral calculus, ordinary and partial differential equations, vector calculus, certain special functions of mathematical physics, matrix algebra, and group theory. For most physics students, this might occupy them from, say, third grade to early graduate school - roughly fifteen years. Such a course of study does not actually involve learning any quantum mechanics, but merely establishing the mathematical framework required to approach it deeply.

The job of the popularizer of science, trying to get across some idea of quantum mechanics to a general audience that has not gone through these initiation rites, is daunting. Indeed, there are no successful popularizations of quantum mechanics in my opinion, partly for this reason. These mathematical complexities are compounded by the fact that quantum theory is so resolutely counterintuitive. Common sense is almost useless in approaching it. It’s no good, Richard Feynman once said, asking why it is that way. No one knows why it is that way. That’s just the way it is.

Now suppose we were to approach some obscure religion or New Age doctrine or shamanistic belief system sceptically. We have an open mind; we understand there’s something interesting here; we introduce ourselves to the practitioner and ask for an intelligible summary. Instead we are told that it’s intrinsically too difficult to be explained simply, that it’s replete with ‘mysteries’, but if we’re willing to become acolytes for fifteen years, at the end of that time we might begin to be prepared to consider the subject seriously. Most of us, I think, would say that we simply don’t have the time; and many would suspect that the business about fifteen years just to get to the threshold of understanding is evidence that the whole subject is a bamboozle: if it’s too hard for us to understand, doesn’t it follow that it’s too hard for us to criticize knowledgeably? Then the bamboozle has free rein.

So how is shamanistic or theological or New Age doctrine different from quantum mechanics? The answer is that even if we cannot understand it, we can verify that quantum mechanics works. We can compare the quantitative predictions of quantum theory with the measured wavelengths of spectral lines of the chemical elements, the behaviour of semiconductors and liquid helium, microprocessors, which kinds of molecules form from their constituent atoms, the existence and properties of white dwarf stars, what happens in masers and lasers, and which materials are susceptible to which kinds of magnetism. We don’t have to understand the theory to see what it predicts. We don’t have to be accomplished physicists to read what the experiments reveal. In every one of these instances, as in many others, the predictions of quantum mechanics are strikingly, and to high accuracy, confirmed.

But the shaman tells us that his doctrine is true because it too works - not on arcane matters of mathematical physics but on what really counts: he can cure people. Very well, then, let’s accumulate the statistics on shamanistic cures, and see if they work better than placebos. If they do, let’s willingly grant that there’s something here - even if it’s only that some illnesses are psychogenic, and can be cured or mitigated by the right attitudes and mental states. We can also compare the efficacy of alternative shamanistic systems.
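The comparison proposed here is a standard exercise in statistics. A minimal sketch in Python of a one-sided two-proportion z-test, with entirely hypothetical cure counts for a shamanistic group and a placebo group (the function name and all the numbers are illustrative, not data from any real study):

```python
import math

def two_proportion_z(cures_a, n_a, cures_b, n_b):
    """One-sided two-proportion z-test: does group A cure more often than B?"""
    p_a, p_b = cures_a / n_a, cures_b / n_b
    # Pooled cure rate under the null hypothesis that both groups are identical
    p_pool = (cures_a + cures_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # One-sided p-value from the standard normal survival function
    p_value = 0.5 * (1 - math.erf(z / math.sqrt(2)))
    return z, p_value

# Hypothetical counts: 70/100 recover under the shaman, 55/100 under placebo
z, p = two_proportion_z(70, 100, 55, 100)
print(f"z = {z:.2f}, one-sided p = {p:.4f}")
```

A small p-value would suggest the difference in cure rates is unlikely to be chance alone; a careful study would of course also need randomization and blinding before granting that ‘there’s something here’.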

Whether the shaman grasps why his cures work is another story. In quantum mechanics we have a purported understanding of Nature on the basis of which, step by step and quantitatively, we make predictions about what will happen if a certain experiment, never before attempted, is carried out. If the experiment bears out the prediction - especially if it does so numerically and precisely - we have confidence that we knew what we were doing. There are at best few examples with this character among shamans, priests and New Age gurus.

Another important distinction was suggested in Reason and Nature, the 1931 book by Morris Cohen, a celebrated philosopher of science:

To be sure, the vast majority of people who are untrained can accept the results of science only on authority. But there is obviously an important difference between an establishment that is open and invites every one to come, study its methods, and suggest improvement, and one that regards the questioning of its credentials as due to wickedness of heart, such as [Cardinal] Newman attributed to those who questioned the infallibility of the Bible... Rational science treats its credit notes as always redeemable on demand, while non-rational authoritarianism regards the demand for the redemption of its paper as a disloyal lack of faith.

The myths and folklore of many pre-modern cultures have explanatory or at least mnemonic value. In stories that everyone can appreciate and even witness, they encode the environment. Which constellations are rising or the orientation of the Milky Way on a given day of the year can be remembered by a story about lovers reunited or a canoe negotiating the sacred river. Since recognizing the sky is essential for planting and reaping and following the game, such stories have important practical value. They can also be helpful as psychological projective tests or as reassurances of humanity’s place in the Universe. But that doesn’t mean that the Milky Way really is a river or that a canoe really is traversing it before our eyes.

Quinine comes from an infusion of the bark of a particular tree from the Amazon rain forest. How did pre-modern people ever discover that a tea made from this tree, of all the plants in the forest, would relieve the symptoms of malaria? They must have tried every tree and every plant - roots, stems, bark, leaves - tried chewing on them, mashing them up, making an infusion. This constitutes a massive set of scientific experiments continuing over generations, experiments that moreover could not be duplicated today for reasons of medical ethics. Think of how many bark infusions from other trees must have been useless, or made the patient retch or even die. In such a case, the healer chalks these potential medicines off the list, and moves on to the next. The data of ethnopharmacology may not be systematically or even consciously acquired. By trial and error, though, and carefully remembering what worked, eventually they get there - using the molecular riches of the plant kingdom to accumulate a pharmacopoeia that works. Absolutely essential, life-saving information can be acquired from folk medicine and in no other way. We should be doing much more than we are to mine the treasures in such folk knowledge worldwide.

Likewise for, say, predicting the weather in a valley near the Orinoco: it is perfectly possible that pre-industrial peoples have noted over the millennia regularities, premonitory indications, cause-and-effect relationships at a particular geographic locale of which professors of meteorology and climatology in some distant university are wholly ignorant. But it does not follow that the shamans of such cultures are able to predict the weather in Paris or Tokyo, much less the global climate.

Certain kinds of folk knowledge are valid and priceless. Others are at best metaphors and codifiers. Ethnomedicine, yes; astrophysics, no. It is certainly true that all beliefs and all myths are worthy of a respectful hearing. It is not true that all folk beliefs are equally valid if we’re talking not about an internal mindset, but about understanding the external reality.

For centuries, science has been under a line of attack that, rather than pseudoscience, can be called antiscience. Science, and academic scholarship in general, the contention these days goes, is too subjective. Some even allege it’s entirely subjective, as is, they say, history. History generally is written by the victors to justify their actions, to arouse patriotic fervour, and to suppress the legitimate claims of the vanquished. When no overwhelming victory takes place, each side writes self-promotional accounts of what really happened. English histories castigated the French, and vice versa; US histories until very recently ignored the de facto policies of lebensraum and genocide toward Native Americans; Japanese histories of the events leading to World War II minimize Japanese atrocities, and suggest that their chief purpose was altruistically to free East Asia from European and American colonialism; Poland was invaded in 1939, Nazi historians asserted, because Poland, ruthless and unprovoked, attacked Germany; Soviet historians pretended that the Soviet troops that put down the Hungarian (1956) and Czech (1968) Revolutions were invited in by general acclamation in the invaded nations rather than by Russian stooges; Belgian histories tend to gloss over the atrocities committed when the Congo was a private fiefdom of the King of Belgium; Chinese historians are strangely oblivious of the tens of millions of deaths caused by Mao Zedong’s ‘Great Leap Forward’; that God condones and even advocates slavery was repeatedly argued from the pulpit and in the schools in Christian slave-holding societies, but Christian polities that have freed their slaves are mostly silent on the matter; as brilliant, widely read and sober a historian as Edward Gibbon would not meet with Benjamin Franklin when they found themselves at the same English country inn, because of the late unpleasantness of the American Revolution. 
(Franklin then volunteered source material to Gibbon when he turned, as Franklin was sure he soon would, from the decline and fall of the Roman Empire to the decline and fall of the British Empire. Franklin was right about the British Empire, but his timetable was about two centuries early.)

These histories have traditionally been written by admired academic historians, often pillars of the establishment. Local dissent is given short shrift. Objectivity is sacrificed in the service of higher goals. From this doleful fact, some have gone so far as to conclude that there is no such thing as history, no possibility of reconstructing the actual events; that all we have are biased self-justifications; and that this conclusion stretches from history to all of knowledge, science included.

And yet who would deny that there were actual sequences of historical events, with real causal threads, even if our ability to reconstruct them in their full weave is limited, even if the signal is awash in an ocean of self-congratulatory noise? The danger of subjectivity and prejudice has been apparent from the beginning of history. Thucydides warned against it. Cicero wrote

The first law is that the historian shall never dare to set down what is false; the second, that he shall never dare to conceal the truth; the third, that there shall be no suspicion in his work of either favouritism or prejudice.

Lucian of Samosata, in How History Should Be Written, published in the year 170, urged: ‘The historian should be fearless and incorruptible; a man of independence, loving frankness and truth’.

It is the responsibility of those historians with integrity to try to reconstruct that actual sequence of events, however disappointing or alarming it may be. Historians learn to suppress their natural indignation about affronts to their nations and acknowledge, where appropriate, that their national leaders may have committed atrocious crimes. They may have to dodge outraged patriots as an occupational hazard. They recognize that accounts of events have passed through biased human filters, and that historians themselves have biases. Those who want to know what actually happened will become fully conversant with the views of historians in other, once adversary, nations. All that can be hoped for is a set of successive approximations: by slow steps, and through improving self-knowledge, our understanding of historical events improves.

Something similar is true in science. We have biases; we breathe in the prevailing prejudices from our surroundings like everyone else. Scientists have on occasion given aid and comfort to a variety of noxious doctrines (including the supposed ‘superiority’ of one ethnic group or gender over another from measurements of brain size or skull bumps or IQ tests). Scientists are often reluctant to offend the rich and powerful. Occasionally, a few of them cheat and steal. Some worked - many without a trace of moral regret - for the Nazis. Scientists also exhibit biases connected with human chauvinisms and with our intellectual limitations. As I’ve discussed earlier, scientists are also responsible for deadly technologies - sometimes inventing them on purpose, sometimes being insufficiently cautious about unintended side-effects. But it is also scientists who, in most such cases, have blown the whistle alerting us to the danger.

Scientists make mistakes. Accordingly, it is the job of the scientist to recognize our weakness, to examine the widest range of opinions, to be ruthlessly self-critical. Science is a collective enterprise with the error-correction machinery often running smoothly. It has an overwhelming advantage over history, because in science we can do experiments. If you are unsure of the negotiations leading to the Treaty of Paris in 1814-15, replaying the events is an unavailable option. You can only dig into old records. You cannot even ask questions of the participants. Every one of them is dead.

But for many questions in science, you can rerun the event as many times as you like, examine it in new ways, test a wide range of alternative hypotheses. When new tools are devised, you can perform the experiment again and see what emerges from your improved sensitivity. In those historical sciences where you cannot arrange a rerun, you can examine related cases and begin to recognize their common components. We can’t make stars explode at our convenience, nor can we repeatedly evolve through many trials a mammal from its ancestors. But we can simulate some of the physics of supernova explosions in the laboratory, and we can compare in staggering detail the genetic instructions of mammals and reptiles.

The claim is also sometimes made that science is as arbitrary or irrational as all other claims to knowledge, or that reason itself is an illusion. The American revolutionary Ethan Allen - leader of the Green Mountain Boys in their capture of Fort Ticonderoga - had some words on this subject:

Those who invalidate reason ought seriously to consider whether they argue against reason with or without reason; if with reason, then they establish the principle that they are laboring to dethrone: but if they argue without reason (which, in order to be consistent with themselves they must do), they are out of reach of rational conviction, nor do they deserve a rational argument.

The reader can judge the depth of this argument.

Anyone who witnesses the advance of science first-hand sees an intensely personal undertaking. There are always a few - driven by simple wonder and great integrity, or by frustration with the inadequacies of existing knowledge, or simply upset with themselves for their imagined inability to understand what everyone else can - who proceed to ask the devastating key questions. A few saintly personalities stand out amidst a roiling sea of jealousies, ambition, backbiting, suppression of dissent, and absurd conceits. In some fields, highly productive fields, such behaviour is almost the norm.

I think all that social turmoil and human weakness aids the enterprise of science. There is an established framework in which any scientist can prove another wrong and make sure everyone else knows about it. Even when our motives are base, we keep stumbling on something new.

The American chemistry Nobel laureate Harold C. Urey once confided to me that as he got older (he was then in his seventies), he experienced increasingly concerted efforts to prove him wrong. He described it as ‘the fastest gun in the West’ syndrome: the young man who could outdraw the celebrated old gunslinger would inherit his reputation and the respect paid to him. It was annoying, he grumbled, but it did help direct the young whipper-snappers into important areas of research that they would never have entered on their own.

Being human, scientists also sometimes engage in observational selection: they like to remember those cases when they’ve been right and forget when they’ve been wrong. But in many instances, what is ‘wrong’ is partly right, or stimulates others to find out what’s right. One of the most productive astrophysicists of our time has been Fred Hoyle, responsible for monumental contributions to our understanding of the evolution of stars, the synthesis of the chemical elements, cosmology and much else. Sometimes he’s succeeded by being right before anyone else even understood that there was something that needed explaining. Sometimes he’s succeeded by being wrong - by being so provocative, by suggesting such outrageous alternatives that the observers and experimentalists feel obliged to check them out. The impassioned and concerted effort to ‘prove Fred wrong’ has sometimes failed and sometimes succeeded. In almost every case, it has pushed forward the frontiers of knowledge. Even Hoyle at his most outrageous - for example, proposing that the influenza and HIV viruses are dropped down on Earth from comets, and that interstellar dust grains are bacteria - has led to significant advances in knowledge (although turning up nothing to support those particular notions).

It might be useful for scientists now and again to list some of their mistakes. It might play an instructive role in illuminating and demythologizing the process of science and in enlightening younger scientists. Even Johannes Kepler, Isaac Newton, Charles Darwin, Gregor Mendel and Albert Einstein made serious mistakes. But the scientific enterprise arranges things so that teamwork prevails: what one of us, even the most brilliant among us, misses, another of us, even someone much less celebrated and capable, may detect and rectify.

For myself, I’ve tended in past books to recount some of the occasions when I’ve been right. Let me here mention a few of the cases where I’ve been wrong: at a time when no spacecraft had been to Venus, I thought at first that the atmospheric pressure was several times that on Earth, rather than many tens of times. I thought the clouds of Venus were made mainly of water, when they turn out to be only 25 per cent water. I thought there might be plate tectonics on Mars, when close-up spacecraft observations now show hardly a hint of plate tectonics. I thought the highish infrared temperatures of Titan might be due to a sizeable greenhouse effect there; instead, it turns out, it is caused by a stratospheric temperature inversion. Just before Iraq torched the Kuwaiti oil wells in January 1991, I warned that so much smoke might get so high as to disrupt agriculture in much of South Asia; as events transpired, it was pitch black at noon and the temperatures dropped 4-6°C over the Persian Gulf, but not much smoke reached stratospheric altitudes and Asia was spared. I did not sufficiently stress the uncertainty of the calculations.

Different scientists have different speculative styles, some being much more cautious than others. As long as new ideas are testable and scientists are not overly dogmatic, no harm is done; indeed, considerable progress can be made. In the first four instances I’ve just mentioned where I was wrong, I was trying to understand a distant world from a few clues in the absence of thorough spacecraft investigations. In the natural course of planetary exploration more data come in, and we find an army of old ideas ploughed down by an armamentarium of new facts.

Postmodernists have criticized Kepler’s astronomy because it emerged out of his medieval, monotheistic religious views; Darwin’s evolutionary biology for being motivated by a wish to perpetuate the privileged social class from which he came, or to justify his supposed prior atheism; and so on. Some of these claims are just. Some are not. But why does it matter what biases and emotional predispositions scientists bring to their studies, so long as they are scrupulously honest and other people with different proclivities check their results? Presumably no one would argue that the conservative view on the sum of fourteen and twenty-seven differs from the liberal view, or that the mathematical function that is its own derivative is the exponential in the northern hemisphere but some other function in the southern. Any regular periodic function can be represented to arbitrary accuracy by a Fourier series in Muslim as well as in Hindu mathematics. Non-commutative algebras (where A times B does not equal B times A) are as self-consistent and meaningful for speakers of Indo-European languages as for speakers of Finno-Ugric. Mathematics might be prized or ignored, but it is equally true everywhere - independent of ethnicity, culture, language, religion, ideology.
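Each of the mathematical claims in the paragraph above can be verified mechanically, by anyone, anywhere. A short Python sketch (the particular matrices and the finite-difference step size are arbitrary illustrative choices):

```python
import math

# 1) Arithmetic has no political persuasion: 14 + 27 is 41 everywhere.
assert 14 + 27 == 41

# 2) The exponential is its own derivative: compare a central-difference
#    estimate of d/dx e^x at x = 1 against e^1 itself.
h = 1e-6
x = 1.0
numeric = (math.exp(x + h) - math.exp(x - h)) / (2 * h)
assert abs(numeric - math.exp(x)) < 1e-8

# 3) Matrix algebra is non-commutative: A times B need not equal B times A.
def matmul2(a, b):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[0, 1], [0, 0]]
B = [[0, 0], [1, 0]]
assert matmul2(A, B) != matmul2(B, A)
```

The same assertions pass on any conforming Python interpreter, whatever the language, religion or ideology of its user - which is the point.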

Towards the opposite extreme, there are questions such as whether abstract expressionism can be ‘great’ art, or rap ‘great’ music; whether it’s more important to curb inflation or unemployment; whether French culture is superior to German culture; or whether prohibitions against murder should apply to the nation state. Here the questions are oversimple, or the dichotomies false, or the answers dependent on unspoken assumptions. Here local biases might very well determine the answers.

Where in this subjective continuum, from almost fully independent of cultural norms to almost wholly dependent on them, does science lie? Although issues of bias and cultural chauvinism certainly arise, and although its content is continually being refined, science is clearly much closer to mathematics than it is to fashion. The claim that its findings are in general arbitrary and biased is not merely tendentious, but specious.

The historians Joyce Appleby, Lynn Hunt and Margaret Jacob (in Telling the Truth About History, 1994) criticize Isaac Newton: he is said to have rejected the philosophical position of Descartes because it might challenge conventional religion and lead to social chaos and atheism. Such criticisms amount only to the charge that scientists are human. How Newton was buffeted by the intellectual currents of his time is of course of interest to the historian of ideas; but it has little bearing on the truth of his propositions. For them to be generally accepted, they must convince atheists and theists alike. This is just what happened.

Appleby and her colleagues claim that ‘When Darwin formulated his theory of evolution, he was an atheist and a materialist,’ and suggest that evolution was a product of a purported atheist agenda. They have hopelessly confused cause and effect. Darwin was about to become a minister of the Church of England when the opportunity to sail on HMS Beagle presented itself. His religious ideas, as he himself described them, were at the time highly conventional. He found every one of the Anglican Articles of Faith entirely believable. Through his interrogation of Nature, through science, it slowly dawned on him that at least some of his religion was false. That’s why he changed his religious views.

Appleby and her colleagues are appalled at Darwin’s description of ‘the low morality of savages... their insufficient powers of reasoning... [their] weak power of self-command’, and state that ‘now many people are shocked by his racism’. But there was no racism at all, as far as I can tell, in Darwin’s comment. He was alluding to the inhabitants of Tierra del Fuego, suffering from grinding scarcity in the most barren and Antarctic province of Argentina. When he described a South American woman of African origin who threw herself to her death rather than submit to slavery, he noted that it was only prejudice that kept us from seeing her defiance in the same heroic light as we would a similar act by the proud matron of a noble Roman family. He was himself almost thrown off the Beagle by Captain FitzRoy for his militant opposition to the Captain’s racism. Darwin was head and shoulders above most of his contemporaries in this regard.

But again, even if he was not, how does it affect the truth or falsity of natural selection? Thomas Jefferson and George Washington owned slaves; Albert Einstein and Mohandas Gandhi were imperfect husbands and fathers. The list goes on indefinitely. We are all flawed and creatures of our times. Is it fair to judge us by the unknown standards of the future? Some of the habits of our age will doubtless be considered barbaric by later generations - perhaps for insisting that small children and even infants sleep alone instead of with their parents; or exciting nationalist passions as a means of gaining popular approval and achieving high political office; or allowing bribery and corruption as a way of life; or keeping pets; or eating animals and jailing chimpanzees; or criminalizing the use of euphoriants by adults; or allowing our children to grow up ignorant.

Occasionally, in retrospect, someone stands out. In my book, the English-born American revolutionary Thomas Paine is one such. He was far ahead of his time. He courageously opposed monarchy, aristocracy, racism, slavery, superstition and sexism when all of these constituted the conventional wisdom. He was unswerving in his criticism of conventional religion. He wrote in The Age of Reason: ‘Whenever we read the obscene stories, the voluptuous debaucheries, the cruel and torturous executions, the unrelenting vindictiveness with which more than half the Bible is filled, it would be more consistent that we called it the word of a demon than the word of God. It … has served to corrupt and brutalize mankind.’ At the same time the book exhibited the deepest reverence for a Creator of the Universe whose existence Paine argued was apparent at a glance at the natural world. But condemning much of the Bible while embracing God seemed an impossible position to most of his contemporaries. Christian theologians concluded he was drunk, mad or corrupt. The Jewish scholar David Levi forbade his co-religionists from even touching, much less reading, the book. Paine was made to suffer so much for his views (including being thrown into prison after the French Revolution for being too consistent in his opposition to tyranny) that he became an embittered old man.*

[* Paine was the author of the revolutionary pamphlet ‘Common Sense’. Published on 10 January 1776, it sold over half a million copies in the next few months and stirred many Americans to the cause of independence. He was the author of the three best-selling books of the eighteenth century. Later generations reviled him for his social and religious views. Theodore Roosevelt called him a ‘filthy little atheist’ – despite his profound belief in God. He is probably the most illustrious American revolutionary uncommemorated by a monument in Washington, DC.]

Yes, the Darwinian insight can be turned upside down and grotesquely misused: voracious robber barons may explain their cut-throat practices by an appeal to Social Darwinism; Nazis and other racists may call on ‘survival of the fittest’ to justify genocide. But Darwin did not make John D. Rockefeller or Adolf Hitler. Greed, the Industrial Revolution, the free enterprise system, and corruption of government by the monied are adequate to explain nineteenth-century capitalism. Ethnocentrism, xenophobia, social hierarchies, the long history of anti-Semitism in Germany, the Versailles Treaty, German child-rearing practices, inflation and the Depression seem adequate to explain Hitler’s rise to power. Very likely these or similar events would have transpired with or without Darwin. And modern Darwinism makes it abundantly clear that many less ruthless traits, some not always admired by robber barons and Führers - altruism, general intelligence, compassion - may be the key to survival.

If we could censor Darwin, what other kinds of knowledge could also be censored? Who would do the censoring? Who among us is wise enough to know which information and insights we can safely dispense with, and which will be necessary ten or a hundred or a thousand years into the future? Surely we can exert some discretion on which kinds of machines and products it is safe to develop. We must in any case make such decisions, because we do not have the resources to pursue all possible technologies. But censoring knowledge, telling people what they must think, is the aperture to thought police, authoritarian government, foolish and incompetent decision-making and long-term decline.

Fervid ideologues and authoritarian regimes find it easy and natural to impose their views and suppress the alternatives. Nazi scientists, such as the Nobel laureate physicist Johannes Stark, distinguished fanciful, imaginary ‘Jewish science’, including relativity and quantum mechanics, from realistic, practical ‘Aryan science’. Another example: ‘A new era of the magical explanation of the world is rising,’ said Adolf Hitler, ‘an explanation based on will rather than knowledge. There is no truth, in either the moral or the scientific sense.’

As he described it to me three decades later, in 1922 the American geneticist Hermann J. Muller flew from Berlin to Moscow in a light plane to witness the new Soviet society firsthand. He must have liked what he saw, because - after his discovery that radiation makes mutations (a discovery that would later win him a Nobel Prize) - he moved to Moscow to help establish modern genetics in the Soviet Union. But by the middle 1930s a charlatan named Trofim Lysenko had caught the notice and then the enthusiastic support of Stalin. Lysenko argued that genetics - which he called ‘Mendelism-Weismannism-Morganism’, after some of the founders of the field - had an unacceptable philosophical base, and that philosophically ‘correct’ genetics, genetics that paid proper obeisance to communist dialectical materialism, would yield very different results. In particular, Lysenko’s genetics would permit an additional crop of winter wheat - welcome news to a Soviet economy reeling from Stalin’s forced collectivization of agriculture.

Lysenko’s purported evidence was suspect, there were no experimental controls, and his broad conclusions flew in the face of an immense body of contradictory data. As Lysenko’s power grew, Muller passionately argued that classical Mendelian genetics was in full harmony with dialectical materialism, while Lysenko, who believed in the inheritance of acquired characteristics and denied a material basis of heredity, was an ‘idealist’, or worse. Muller was strongly supported by N.I. Vavilov, erstwhile president of the All-Union Academy of Agricultural Sciences.

In a 1936 address to the Academy of Agricultural Sciences, now presided over by Lysenko, Muller gave a stirring speech that included these words:

If the outstanding practitioners are going to support theories and opinions that are obviously absurd to everyone who knows even a little about genetics - such views as those recently put forward by President Lysenko and those who think as he does - then the choice before us will resemble the choice between witchcraft and medicine, between astrology and astronomy, between alchemy and chemistry.

In a country of arbitrary arrests and police terror, this speech displayed exemplary - many thought foolhardy - integrity and courage. In The Vavilov Affair (1984), the Soviet émigré historian Mark Popovsky describes these words as being accompanied by ‘thunderous applause from the whole hall’ and ‘remembered by everyone still living who took part in the session’.

Three months later, Muller was visited in Moscow by a Western geneticist who expressed astonishment at a widely circulated letter, signed by Muller, that condemned the prevalence of ‘Mendelism-Weismannism-Morganism’ in the West and that urged a boycott of the forthcoming International Congress of Genetics. Having never seen, much less signed, such a letter, an outraged Muller concluded that it was a forgery perpetrated by Lysenko. Muller promptly wrote an angry denunciation of Lysenko to Pravda and mailed a copy to Stalin.

The next day Vavilov came to Muller in a state of some agitation, informing him that he, Muller, had just volunteered to serve in the Spanish Civil War. The letter to Pravda had put Muller’s life in danger. He left Moscow the next day, just evading, so he was later told, the NKVD, the secret police. Vavilov was not so lucky, and perished in 1943 in Siberia.

With the continuing support of Stalin and later of Khrushchev, Lysenko ruthlessly suppressed classical genetics. Soviet school biology texts in the early 1960s had as little about chromosomes and classical genetics as many American school biology texts have about evolution today. But no new crop of winter wheat grew; incantations of the phrase ‘dialectical materialism’ went unheard by the DNA of domesticated plants; Soviet agriculture remained in the doldrums; and today, partly for this reason, Russia - world-class in many other sciences - is still almost hopelessly backward in molecular biology and genetic engineering. Two generations of modern biologists have been lost. Lysenkoism was not overthrown until 1964, in a series of debates and votes at the Soviet Academy of Sciences - one of the few institutions to maintain a degree of independence from the leaders of party and state - in which the nuclear physicist Andrei Sakharov played an outstanding role.

Americans tend to shake their heads in astonishment at the Soviet experience. The idea that some state-endorsed ideology or popular prejudice would hogtie scientific progress seems unthinkable. For two hundred years Americans have prided themselves on being a practical, pragmatic, nonideological people. And yet anthropological and psychological pseudo-science has flourished in the United States - on race, for example. Under the guise of ‘creationism’, a serious effort continues to be made to prevent evolutionary theory - the most powerful integrating idea in all of biology, and essential for other sciences ranging from astronomy to anthropology - from being taught in the schools.

Science is different from many another human enterprise - not, of course, in its practitioners’ being influenced by the culture they grew up in, nor in sometimes being right and sometimes wrong (which are common to every human activity), but in its passion for framing testable hypotheses, in its search for definitive experiments that confirm or deny ideas, in the vigour of its substantive debate, and in its willingness to abandon ideas that have been found wanting. If we were not aware of our own limitations, though, if we were not seeking further data, if we were unwilling to perform controlled experiments, if we did not respect the evidence, we would have very little leverage in our quest for the truth. Through opportunism and timidity we might then be buffeted by every ideological breeze, with nothing of lasting value to hang on to.