Smoking Ears and Screaming Teeth - Trevor Norton (2010)
‘We know not, and cannot know, where safety ends and danger begins’
– Dr Walter Channing
Life is a series of uncontrolled experiments. We call some of these misadventures adolescence, courting and parenthood. From time to time people take unnecessary risks: darting across the road in front of an oncoming truck, swimming with sharks, jumping from a bridge in the expectation that an elastic rope won’t snap, or from a plane in the belief that parachutes always open. Like the pole-vaulter they think only of the leap, never the fall. Pioneers in any field take a leap into the unknown and sometimes it’s a risky venture.
In medicine, animal tests became a prerequisite to reduce the risks to patients from new drugs. This followed a scandal in the United States in 1937. Sulfanilamide was the first ‘wonder drug’ for bacterial infections. In those days there was no requirement for testing new compounds. One brand was sweetened with diethylene glycol, what we now know as anti-freeze. Patients began dying in agony. The (unqualified) chemist of the company swigged a slug of their concoction to demonstrate its safety. Within a day he was dangerously ill. Almost all the 909 litres of elixir that had been distributed were traced and confiscated, but too late to save the hundred and seven recipients who died.
Congress rapidly passed a bill that forced drug companies to prove that a product was safe, and animal testing became the norm. Although we share virtually all our physiology and biochemistry with other mammals, our responses are not always identical. Aspirin is fatal to cats and penicillin kills guinea pigs. Fortunately the biochemists who purified penicillin and carried out the first tests of its effects used mice. Had they chosen guinea pigs, the first antibiotic might never have made it into production.
The regulations governing animal experiments have always been more strict than those concerning experiments on humans, hence Jack Haldane’s quip: ‘To do the sort of things to a dog that are done to the average medical student requires a licence signed in triplicate by two archbishops.’ In an essay entitled ‘On being one’s own rabbit’, Haldane pointed out that rabbits make ‘little serious attempt to cooperate with one’. Worse still, ‘dumb’ animals can’t tell you how they feel, and this can be essential in physiological studies. Nor do animals share all our diseases. They are not susceptible to cholera or yellow fever so all the experimentation for those diseases had to be done on humans.
Experimenting on others comes naturally to us. Cleopatra was reluctant to commit suicide without first establishing which poisons were most rapid and least unpleasant. So, before plumping for an asp-assisted exit, she is said to have tested a range of products on her handmaidens. She dismissed strychnine as it was not only agonising but also left the deceased with a sardonic smile, which was most unflattering.
In the eighteenth and nineteenth centuries the slums of big cities swarmed with destitute citizens who were seen as appropriate subjects for medical experimentation. When smallpox inoculation was introduced to protect recipients from exposure to more virulent strains, inmates of Newgate Prison tested the safety of the procedure in return for a pardon if they survived. The success of this trial led to inoculation becoming a compulsory method of prevention, and the term ‘conscientious objector’ was first used in 1898 to describe those who risked prosecution for refusing to have their children inoculated. When Dr Benjamin Waterhouse introduced vaccination into the USA, he persuaded the Boston Board of Health to conduct a public demonstration. Nineteen volunteers were given cowpox and then, two weeks later, an injection of smallpox. Two ‘control subjects’ received only smallpox. The trials were ‘completely successful’, which presumably means that only the controls died. Small sample sizes (often smaller by the end of the experiment) were a feature of such tests.
In those days few medics had qualms about such experiments because diseases were incubated in the slums from where ‘infectious miasmas’ arose to plague respectable citizens. It seemed appropriate that the poor should be the vehicle by which medicine advanced.
Sick patients were also readily to hand and in constant supply. The terminally ill were ‘obvious candidates’ for drug trials. They had nothing to lose, but also little to gain. If they were in the advanced stages of a fatal disease, even the most wondrous of wonder drugs would be unlikely to help them. It was neither a fair trial of the drug’s capabilities, nor a proper way to care for the dying.
Doctors did not restrict their treatment to curing the disease from which the patient was suffering. Often they ignored the central tenet of medicine – do no harm. The desire to research overruled the duty of care. A doctor confided that he didn’t tell patients they were taking part in experimental trials ‘out of consideration for the patient’. One doctor who was spraying deadly germs into the noses of patients admitted: ‘They thought I was treating them for nasal congestion.’ It was a betrayal of trust: ‘Our patients obeyed us gladly. Our zeal led them to respect and trust us. It never occurred to them to enquire whether this zeal was in the interests of treatment or in the interests of science.’
The zeal of surgeons later became apparent in the race to carry out the first successful heart transplant. Of all the patients receiving hearts up to June 1969, fifty died in less than a month, ninety lasted less than two and a half months and only two survived longer.
Dr Pappworth’s scouring of medical journals revealed that even in the 1950s and 1960s patients in Britain and the USA were regularly subjected to risky procedures that contributed nothing to their cure. Such abuses still occur. From 1998 to 2000 over a hundred children in a Catholic care home in New York were subjects in trials requiring high doses of dangerous drugs. Between 1998 and 2003 the General Medical Council in Britain took action in a dozen cases of fraudulent research by general practitioners. They were giving untested drugs to patients without explaining the dangers involved or even informing them that they were taking part in an experimental trial. At least one of the doctors received £100,000 from a pharmaceutical company for conducting the experiment.
Such conduct flouts the rules that govern medical practice. The Nuremberg Code was drawn up following the atrocities wrought by Nazi doctors on prisoners in concentration camps. At the heart of the Code and of others that followed is the necessity to obtain the patients’ informed consent before subjecting them to any experimental treatment. Even so, in 1954 the secretary of a hospital management board countered a patient’s complaint of being operated on without his permission by claiming: ‘If a patient comes into hospital … he is deemed to agree to receive treatment.’ Although nowadays every university and hospital has an ethics committee to assess all research proposals, much depends on how frank the researcher is with the volunteer about the potential discomfort and risk, and how much of the explanation of the procedures can be understood by the average patient.
Barry Marshall, who swallowed a culture of Helicobacter, gave as the reason for experimenting on himself: ‘I was the only person informed enough to give consent.’ Enoch Hale, the first person to be injected with a medicine, also chose self-experimentation because only ‘professional men can estimate the inconvenience or risk to which they may be subjected’.
All researchers should ask themselves: ‘Would I submit myself to this experiment?’ If not, then the experiment should not be attempted. Dr Chauncey Leake, a distinguished pharmacologist who endured several painful self-experiments, was adamant that pharmacologists developing new drugs ‘have a moral obligation to try such drugs on themselves … before using them experimentally on any other human being’. Enoch Hale stated that for hazardous experiments volunteers should not be used ‘even if they would be willing to undergo them’. Most self-experimenters believed that in all conscience they couldn’t ask even the most plucky volunteer to swallow live parasites or undergo a painful or dangerous procedure.
Where there is danger, someone must go first. Self-experimenters have greater protection than the non-scientist because of their detailed knowledge of what they are doing. Physiologists and medics are better able to read their symptoms and recognise warning signs during their experiments. Self-experimenters also have a strong incentive to alleviate the stress of any procedure. As John Stapp said of his rocket-propelled sled experiments: ‘You can design harnesses and restraints that are far better after you ride with one of your mistakes.’
Scientific researchers are obsessives. Dr William Bean measured the growth of his fingernails daily for thirty years to correlate it with his health. Another American doctor cracked the finger joints on his left hand for fifty years to see if it accelerated arthritis. When this degree of determination is directed at more serious matters it may drive the experimenter to take unwarranted risks. Thomas Brittingham, who injected himself with blood from patients with cancer and leukaemia, became addicted to self-experimentation. He confessed that at the time he had not considered the impact on his family of his possible death.
Auguste Piccard cautioned against impetuosity. He stressed that every potential danger had to be anticipated and every risk assessed. Only then should the scientist proceed.
In reality it is often the unimagined risks that bite. Every self-experiment carries some danger; otherwise it would be unnecessary. Jack Haldane put these dangers into perspective: ‘Others make experiments which are apparently dangerous, but really perfectly safe provided the theory on which they are based is sound. I have occasionally made experiments of this kind and if I had died in the course of one, I should, while dying, have regarded myself not as a martyr, but as a fool.’ Roald Amundsen, the polar explorer, put it more succinctly: ‘Adventure is just bad planning.’
Many self-experimenters think that it’s what the other guy does that’s dangerous. Jack Haldane wrote: ‘Experiments in which one stakes one’s life on the correctness of one’s biochemistry are far safer than those of an airplane designer who is prepared to fall a thousand feet if his aerodynamics are incorrect. They are also more likely to benefit humanity.’
Perhaps no action can be entirely altruistic, but the research of these experimenters comes close. Whatever their ambition and ego, they were as likely to be criticised as praised for their efforts. Werner Forssmann, the first to catheterise the heart, was sacked for his pioneering spirit. A few achieved fame, but how many of the pioneers in this book are known to the public? The Curies became famous for their research, not for their courage in continuing to work with radioactive chemicals after they knew they were dangerous. Self-experimenters are not masochistic or suicidal: they are the bold shock troops of research.
It is ethically preferable for John Scott Haldane and his colleagues to have tested the effects of poison gases on themselves, compared to some of the alternatives. In 1943 Australian soldiers volunteered for experiments with mustard gas. They were warned in advance that there was ‘a small risk of burns and blisters’. In the event they suffered severe burns all over the body, violent vomiting and headaches, loss of nails and teeth, and developed chronic lung problems. Some died. In the 1950s and early 1960s British servicemen who believed they were involved in research into the common cold were actually exposed to mustard gas and sarin, a deadly nerve gas. Several almost died and one did. The verdict of an inquest held over fifty years later was ‘unlawful killing’. Little use was made of the findings of either of these military experiments, whereas Haldane’s self-experiments led to the development of the gas mask. If a researcher risks his own life, he makes damn sure that the research has a clear purpose.
Although many pioneers led lives at the outer limits of probability, they coolly endured the dangers and discomforts that their experiments engendered because they believed their mission was worthwhile. The proof that it was often research of the highest standard is evidenced by the number of Nobel Laureates who were self-experimenters.
They took risks to make our lives safer. Jack Haldane said: ‘It is occasionally necessary to make experiments which one knows are dangerous, for example in determining how a disease is transmitted. A number of people have died in this way. It is to my mind the ideal way of dying.’ Haldane had no desire to die prematurely. His thoughts were noble, not suicidal.
Even some of the sternest critics of experimentation on humans applauded those who experimented on themselves. Sir William Osler, the greatest medical educator of his time, wrote: ‘The history of our profession is starred with the heroism of its members who have sacrificed health and sometimes life itself in endeavours to benefit their fellow creatures.’
In a selfish world, society needs such people. We should celebrate them.
Attempting any of the self-experiments described in this book is not recommended.