For the Relief of Unbearable Pain: The Discovery of Anesthesia - Breakthrough!: How the 10 Greatest Discoveries in Medicine Saved Millions and Changed Our View of the World - Jon Queijo

Breakthrough!: How the 10 Greatest Discoveries in Medicine Saved Millions and Changed Our View of the World - Jon Queijo (2010)

Chapter 4. For the Relief of Unbearable Pain: The Discovery of Anesthesia


Even in today’s world of high-tech medicine, where the traditional skills of medicine are increasingly lost to the convenience of digital sensors and gadgets, it’s surprising how few physicians rue—or even remember—the lost art of nut-cracking.

It’s a shame, because if you happened to have a knack for that sort of thing—the amount of force required to crack open various nuts based on the hardness of their shells—you might just have the skill needed to be an anesthesiologist in the dark ages of medicine. For, as one ancient prescription directed: Place a wooden bowl on the patient’s head, and knock him unconscious by striking the bowl with “sufficient strength to crack an almond, but leave the skull intact.”

Or perhaps your talents lie in the art of delicate strangulation. In this forgotten method of anesthesia, practitioners asphyxiated their patients to the point of unconsciousness without, hopefully, also killing them. This method was used by the Assyrians prior to circumcising their children—undoubtedly without prior written consent—and was used in Italy as late as the 1600s.

Of course, many less traumatic methods have been used throughout history in an attempt to spare patients the pain of a surgeon’s knife, including various opium preparations, the sleep-inducing seeds of henbane, the human-shaped mandrake root which, in addition to numbing pain, was said to emit a scream when pulled from the ground, and, the age-old favorite, alcohol.

Unfortunately, all early methods of anesthesia shared three key shortcomings: They didn’t work well, or they killed you, or—in some cases—both. In fact, true anesthesia—defined as the ability to reliably and safely produce partial or complete loss of sensation, with or without loss of consciousness—was not officially “discovered” until 1846. This itself is painful to consider, given how many patients until that time suffered the most excruciating operations, from wrenching tooth extractions to gruesome amputations, with little or no pain relief. In fact, until the mid-nineteenth century, perhaps the only meaningful choice a patient had when selecting a surgeon was to ask how fast he was. Which is why you would have wanted someone like William Cheselden or Dominique-Jean Larrey at your operating table: The former, an English surgeon, could remove a kidney stone in 54 seconds. The latter, chief surgeon in Napoleon’s army, could perform an amputation in as little as 15 seconds.

* * *

Sadly, for Fanny Burney, renowned nineteenth-century English novelist whose writings inspired Jane Austen, neither anesthesia nor surgical speed could save her from what must be among the most horrifying patient accounts of a major operation performed without anesthesia. On September 30, 1811, surgeons performed a full mastectomy to remove Burney’s cancerous right breast, a procedure that lasted nearly four hours. Burney somehow managed to survive the ordeal and described it nine months later in a letter to her sister. As she recounted, her only “anesthesia” was a wine cordial (liqueur) and the benefit of not knowing about the operation until just two hours before it began. But even the short notice was of little help. “This, indeed, was a dreadful interval,” she wrote. “Two hours thus spent seemed never-ending.”

It’s not hard to sympathize with Burney’s dread when she first entered the room in her house that had been prepared for the operation. “The sight of the immense quantity of bandages, compresses, and sponges made me a little sick. I walked backwards and forwards till I quieted all emotions and became, by degrees, nearly stupid, torpid, without sentiment or consciousness, and thus I remained till the clock struck three.”

Nor did her confidence improve when “seven men in black”—her physicians and their assistants—suddenly entered her home.

“I was now awakened from my stupor by a sort of indignation. Why so many and without my permission? But I could not utter a syllable... I began to tremble violently, more with distaste and horror of the preparations, even than of the pain.”

A short time later, as Burney was guided onto the operating “mattress,” she was given one final semblance of anesthesia: a linen handkerchief placed over her face to prevent her from seeing the proceedings. Unfortunately, even that failed its simple duty.

“It was transparent, and I saw through it that my bed was instantly surrounded by the seven men and my nurse. When, bright through the handkerchief, I saw the glitter of polished steel, I closed my eyes... A silence the most profound ensued, which lasted for some minutes, during which I imagine they took their orders by signs and made their examination. Oh, what a horrible suspension!”

The suspension was broken by more bad news when Burney, who had been expecting that only a limited amount of tissue would be removed, overheard their decision to remove her entire right breast. “I started up, threw off my veil, and cried out... I explained the nature of my sufferings, which all sprang from one point...”

But though the doctors listened “attentively,” they responded with “utter silence.” The veil was replaced, and Burney surrendered all resistance. The operation proceeded, as she recalled in vivid detail to her sister:

“When the dreadful steel was plunged into my breast—cutting through veins, arteries, flesh, nerves... I began a scream that lasted uninterruptedly during the whole time of the incision. I almost marvel that it doesn’t still ring in my ear, so excruciating was the agony. When the instrument was withdrawn, the pain seemed undiminished, for the air that suddenly rushed into those delicate parts felt like a mass of minute but sharp and forked daggers that were tearing the edge of the wound.”

And later, “When the instrument was withdrawn a second time, I concluded the operation was over—Oh no! Presently the terrible cutting was renewed and worse than ever... Oh Heaven! I then felt the knife rackling against the breast bone, scraping it!”

Burney recalled fainting twice during the operation, and finally, “When all was done, they lifted me up. My strength was so totally annihilated that I could not even sustain my hands and arms, which hung as if I had been lifeless, while my face, as the nurse has told me, was utterly colorless.” She added, “For months I could not speak of this terrible business without nearly again going through it. Even now, nine months after it is over, I have a headache from going on with the account.”

A painfully long wait: why it took 50 years for anesthesia to finally arrive

The good news is that Burney lived another 29 years after the operation. The bad news is that she need not have endured the horrors of surgery without anesthesia because in 1800—11 years before her operation—English scientist Humphry Davy had discovered something remarkable about a gas he had been experimenting with: “As nitrous oxide... appears capable of destroying physical pain,” Davy wrote, “it may probably be used with advantage during surgical operations...”

This is the kind of prophetic statement that can drive historians crazy. If Davy had observed the “pain-destroying” properties of nitrous oxide as early as 1800—with others soon realizing that ether and chloroform had similar properties—why did it take nearly another 50 years for doctors to “officially” discover anesthesia? While controversies and debates abound, most historians generally believe that a mixture of religious, social, medical, and technical factors created a world in which many people in the early nineteenth century didn’t want—or simply weren’t ready for—anesthesia.

One clue to this mystery is seen in the word “pain” itself. Derived from the Greek word poine, or penalty, it implies that pain is a form of punishment from God for some committed sin, whether or not the person was aware of it. Thus, for those who believed pain was a form of divine justice, attempts to alleviate it were fundamentally immoral and strongly resisted. The power of such thinking became dramatically clear when debates arose in the 1840s over the morality of giving anesthesia to women during childbirth. In addition, various social factors—including those perhaps best grouped under the heading “pointless bravado”—also played a role. Historians note that in almost all civilizations, the ability to endure pain has been viewed as a sign of nobility, virility, and character. And finally, some nineteenth-century physicians opposed pain prevention because they believed it served a necessary physiological function, the elimination of which might interfere with healing.

Yet as Burney’s letter powerfully attests, many nineteenth-century patients facing the gleam of an approaching scalpel would have gladly welcomed the option of anesthesia. And most doctors would have gladly offered that option, if only out of self-interest. After all, nothing is so disruptive to a surgeon’s fine motor skills as a squirming, struggling, screaming patient. This was understood as far back as the fifth century BC, when the world’s first physician explained his view on the matter. The role of the patient, Hippocrates wrote in a treatise on surgery, was “to accommodate the operator... and maintain the figure and position of the part operated on...” And, oh yes, as he comes at you with that scalpel, “Avoid sinking down, and shrinking from or turning away.”

* * *

But to understand the factors that paradoxically both delayed and led to the discovery of anesthesia, we must look more deeply into the nature of anesthesia itself and its effect on human consciousness. Beginning in 1800, the discovery of medical anesthesia followed a quixotic, four-decade journey marked by a mixture of nobility and folly, curiosity and exhibitionism, courage and foolishness, callousness and compassion. And to begin that journey, one need look no further than the man who first observed and ignored the pain-killing potential of nitrous oxide. For it was Humphry Davy who, in the course of his scientific investigation of nitrous oxide, named the new gas Laughing Gas, inhaled up to 20 quarts of it while seated in a sealed chamber—raising his pulse to 124 beats per minute—and who wrote of his experiences: “It made me dance about the laboratory as a madman and has kept my spirits in a glow ever since... The sensations were superior to any I ever experienced...inconceivably pleasurable...I seemed to be a sublime being, newly created and superior to other mortals...”

Milestone #1 From philanthropy to frivolity: the discovery (and dismissal) of nitrous oxide

Upon hearing that in 1798 an Englishman named Thomas Beddoes established a “Pneumatic Institution” in Bristol, England, many people today might imagine a group of scholars studying the design of jack-hammers and tubeless rubber tires. In reality, the Pneumatic Institution for Inhalation Gas Therapy, as it was formally known, was a venture that pushed the frontiers of late eighteenth-century medical science. At the time, scientists had recently discovered that air was not a single substance, but a combination of gases. What’s more, experiments by people like Joseph Priestley—who discovered nitrous oxide in 1772—revealed that different gases had different effects in the body. To enterprising people like Beddoes—well aware of the foul air now beginning to suffocate and sicken industrialized cities—the new science of gases created a market for health resorts and spas where people could be treated with various “therapeutic airs.” Equally important, the Pneumatic Institution funded the scientific study of gases, and one of its most precocious and talented researchers was 20-year-old Humphry Davy.

Davy was put to work in the laboratory to investigate the effects of nitrous oxide, a job that included not only inhaling the gas himself, but inviting visitors to inhale and report how it made them feel. During one of his experiments, Davy noticed something peculiar about the gas: It relieved the pain he was experiencing from an erupting wisdom tooth. But though this discovery led to his famous observation about the potential of nitrous oxide to relieve surgical pain, Davy became sidetracked by other intriguing properties of the gas.

In his 1800 report titled “Researches, Chemical and Philosophical, Chiefly Concerning Nitrous Oxide or Dephlogisticated Nitrous Air, and its Respiration,” Davy gave lengthy and vivid descriptions of these properties based on his own inhalations, including such entries as:

“My visible impressions were dazzling... By degrees as the pleasurable sensations increased, I lost all connection with external things. Trains of vivid visible images rapidly passed through my mind and were connected with words in such a manner as to produce perceptions perfectly novel. I existed in a world of newly connected and newly modified ideas...”

When Davy asked volunteers who had inhaled nitrous oxide in his laboratory to write accounts of their experience, most reported being as amazed—and pleased—as Davy: “It is not easy to describe my sensations,” a Mr. J. W. Tobin wrote. “They were superior to anything I ever before experienced. My senses were more alive to every surrounding impression. My mind was elevated to a most sublime height.” Mr. James Thomson described “a thrilling sensation about the chest, highly pleasurable, which increased to such a degree as to induce a fit of involuntary laughter, which I in vain endeavored to repress...” And although some, like Mr. M. M. Coates, harbored suspicions that such reports were more due to an overactive imagination than pharmacologic effects, they quickly became converts: “I had no expectations of its influence on myself,” Coates wrote, “but after a few seconds, I felt an immoderate flow of spirits and an irresistible propensity to violent laughter and dancing, which, being fully conscious of their irrational exhibition, I made great but ineffectual efforts to restrain...”

Attempting to better understand the effects of nitrous oxide on the body and mind, Davy even gave the gas to two paralyzed patients and asked how it made them feel. One reported, “I do not know how, but very queer,” while the other said, “I felt like the sound of a harp.” Davy thoughtfully wrote that the first patient probably had no analogous feeling with which to compare the sensations, while the second was able to compare it with a former experience with music.

As Davy continued to explore the visions and sensations produced by nitrous oxide, he contemplated their meaning with regard to philosophy and his interest in poetry. Thus, he formed a kind of club of artists—including the poets Robert Southey and Samuel Taylor Coleridge—with whom he could share the gas and discuss its effects on artistic sensibility. Southey, after receiving the gas from Davy, raved that it gave him “a feeling of strength and an impulse to exert every muscle. For the remainder of the day it left me with increased hilarity and with my hearing, taste, and smell certainly more acute. I conceive this gas to be the atmosphere of Mohammed’s Paradise.” While Coleridge’s response was more measured, he wrote to Davy that “My sensations were highly pleasurable... more unmingled pleasure than I had ever before experienced.”

While this all might sound like the birth of a 1960s drug cult, it’s important to understand that Davy’s boss, Thomas Beddoes, was a physician and well-intentioned philanthropist whose goal in forming the Pneumatic Institution was to produce a revolution in medicine. By experimenting with various gases, he hoped to treat “excruciating diseases” as well as conditions where “languor and depression are scarce less intolerable than the most intense pain.” One can’t help but admire the sincerity of his intentions—and thus the motivation behind Davy’s experiments—when Beddoes wrote that he hoped to “diminish the sum of our painful sensations.”

Yet despite such lofty ambitions, Davy’s investigations of the euphoric effects of nitrous oxide ultimately distracted him from studying its potential for anesthesia. What’s more, Davy eventually lost interest in nitrous oxide altogether: Within two years, he left the Institution to pursue other areas of scientific investigation. Although Davy later received acclaim for discovering the elements potassium, sodium, calcium, barium, magnesium, strontium, and chlorine, he never followed up on his observations of the “pain-destroying” effects of laughing gas. Indeed, within a few years, nitrous oxide was no longer being seriously studied at all. By 1812, one former enthusiast was warning in lectures that the gas “consumes, wastes, and destroys life in the same manner as oxygen wastes a taper, by making it burn too quick.” Some historians even assert that nitrous oxide was “ridiculed into obscurity” by those who mocked the silly behavior of people under its influence.

And thus the first forays into anesthesia ran head-long into an ignominious and giggling dead end. Yet, putting aside the image of a dancing Humphry Davy careening madly about his laboratory, laughing gas should not be blindly condemned for its distractingly euphoric effects: It was those very properties that led to the next milestone.

Milestone #2 25 years of “jags” and “frolics” culminate in public humiliation—and hope

While medicine missed its chance to discover anesthesia in the early 1800s, the powers of nitrous oxide were not so quickly dismissed by other members of society. By the 1830s, reports began to surface that the recreational pleasures of inhaling nitrous oxide were being widely enjoyed—in both England and America—by virtually all strata of society, including children, students, entertainers, showmen, and physicians. About the same time, a new pleasure appeared on the scene, only to be similarly ignored by medicine and adored by the public: ether.

Unlike nitrous oxide, ether was not a recent laboratory discovery. It had been prepared nearly three hundred years earlier, around 1540, by the Swiss alchemist and physician Paracelsus. What’s more, Paracelsus had observed that administering ether to chickens “quiets all suffering without harm, and relieves all pain.” Nevertheless, it received little scientific attention until 1818, when Michael Faraday—famous for his work in electromagnetism—observed that inhaling ether vapors could produce profound lethargy and insensibility to pain. Unfortunately, taking a page from Davy’s work with nitrous oxide, Faraday instead focused on the “exhilarating” properties of ether.

And so, by the 1830s, as physicians were condemning both nitrous oxide and ether as dangerous for medical practice, both gases were being embraced by the public for their exhilarating effects. According to one account published in 1835, “Some years ago... the lads of Philadelphia inhaled ether by way of sport... [causing] playfulness and sprightly movements...” Other accounts of the time refer to gatherings in which wandering lecturers and showmen invited people on-stage to inhale ether or nitrous oxide for the amusement of themselves and the audience. In fact, several pioneers of anesthesia claimed that it was their “ether frolics” during childhood that later inspired them to experiment with the gases for medical anesthesia.

Which brings us to what may be the first recorded “medical” use of ether for anesthesia. In 1839, William Clarke, like his fellow college students in Rochester, New York, had attended and participated in an ether frolic. Several years later, while Clarke was a medical student at Vermont Medical College, that experience triggered an idea. Under the supervision of his professor, he dripped some ether onto a towel and placed it over the face of a young woman who was about to have a tooth extracted. Unfortunately, whatever anesthetic benefit the woman gained from the ether was dismissed by Clarke’s professor as an attack of hysteria, and Clarke was warned to abandon the further use of ether for such purposes. Thus, Clarke’s milestone accomplishment received little attention, and he died unaware of his contribution to the discovery of anesthesia.

Around that same time, the recreational use of ether inspired another physician, who many believe should be credited as the true discoverer of anesthesia. Crawford Long had witnessed many nitrous oxide and ether “jags” while growing up in Philadelphia. Later, as a practicing physician in Georgia, he often inhaled ether with friends for its exhilarating effects. But apart from the euphoric effects, something else about ether caught Long’s attention. As he later wrote, “I would frequently... discover bruised or painful spots on my person which I had no recollection of causing... I noticed my friends, while etherized, received falls and blows which I believed were sufficient to produce pain and they uniformly assured me that they did not feel the least pain from these accidents...” These observations were apparently on Long’s mind in 1842 when he met with a Mr. James Venable, who had two small tumors on the back of his neck. Venable was reluctant to undergo surgery due to his dread of the pain, but Long knew that the man was an enthusiast of inhaling ether. Recalling the pain-blunting effects he’d seen in himself and his friends, Long suggested that he give ether to Venable during the operation. Venable agreed, and on March 30, 1842, the operation was successfully and painlessly performed. But although Long went on to administer ether to many other patients, he neglected to publish his work until 1849—three years after another individual would receive credit for the discovery.

Not long after Long’s first medical use of ether, another curious set of incidents led to a near miss in the discovery of anesthesia. In December, 1844, Horace Wells, a dentist living in Hartford, Connecticut, attended an exhibition in which a traveling showman, Gardner Colton, was demonstrating the effects of inhaling nitrous oxide. The next day, Colton put on a private demonstration for Wells and several others, during which a man who inhaled the gas proceeded to run wildly about the room, throwing himself against several couches and knocking them over, crashing to the floor, and severely bruising his knees and other parts of the body. Later, after the gas had worn off, the man marveled at the injuries he’d sustained and his lack of pain while under the influence of the gas, exclaiming, “Why, a person might get into a fight with several persons and not know when he was hurt!” Wells, suffering at the time from a painful wisdom tooth, was intrigued. He asked if Colton would give him the gas while another dentist removed the aching tooth. The following day, December 11, 1844, Colton administered nitrous oxide to Wells, the tooth was removed, and as the effects of the gas subsided, Wells exclaimed, “A new era in tooth pulling!”

But Wells’ luck ran out when he attempted to introduce his discovery to the medical world. In January, 1845, he traveled to Boston to introduce anesthesia to the surgeons at Massachusetts General Hospital. One surgeon, John Warren, gave Wells an opportunity to administer nitrous oxide to a patient scheduled for a tooth extraction. Unfortunately, before a large audience of students and physicians, the gas “was by mistake withdrawn much too soon,” and the patient groaned. Although the patient later testified the gas had lessened his pain, audience members called out “Humbug!” and Wells was laughed from the room.

And so, after decades of frolics and jags, dabbles and dismissals, along with the humiliations and unacknowledged successes of Clarke, Long, and Wells, a new milestone was finally imminent: The “official” discovery of anesthesia.

Milestone #3 Anesthesia at last: the discovery of Letheon (er, make that “ether”)

When Horace Wells suffered his humiliating setback with nitrous oxide in front of a crowded room at Massachusetts General Hospital, it’s not clear whether his former dental partner, William Morton, who was in the audience, was among those who yelled out “Humbug!” In fact, Morton was probably as disappointed with the failure as Wells. Two years earlier, the two were working together on a new technique for making dentures that involved the painful removal of all the patient’s teeth. Less than satisfied with their current anesthetic—a concoction of brandy, champagne, laudanum, and opium—both were on the lookout for better ways to relieve their patients’ pain and thus increase business. But though his former partner’s demonstration of nitrous oxide had failed, it was around that time that Morton learned from an acquaintance, a professor of chemistry at Harvard Medical School, that ether had some interesting properties Morton might be interested in.

According to some accounts, Professor Charles Jackson had personally discovered these properties in 1841 after a vessel of ether exploded in his laboratory and he’d found his assistant anesthetized. After Jackson told Morton about these effects and provided information about how to prepare ether, Morton began his own personal studies. In a whirlwind series of trials possible only in an FDA-free world, Morton experimented on his dog, a fish, himself, his friends, and then, on September 30, 1846, a patient undergoing a tooth extraction. When the patient awoke and reported experiencing no pain, Morton quickly arranged for a public demonstration.

Two weeks later, on October 16, 1846—in what is now considered the decisive moment in the “discovery” of anesthesia—Morton entered the surgical amphitheater of Massachusetts General Hospital. Although he arrived late after making some final adjustments to the apparatus designed to deliver the gas, Morton administered the ether to Gilbert Abbott while surgeon John Warren removed a tumor from Abbott’s neck. The demonstration was a success, and Dr. Warren, apparently familiar with a certain recent failure by Morton’s former partner, turned to the audience and announced, “Gentlemen, this is no humbug.” The impact of the moment and its place in history was recognized by everyone present, including eminent surgeon Henry Bigelow, who said, “I have seen something today which will go around the world.” Bigelow was right. The news was reported in the Boston Daily Journal the next day, and within months, the use of ether for anesthesia had spread to Europe.

Despite Morton’s dramatic success, however, ether was almost immediately banned at Massachusetts General Hospital. Why? Morton had refused to tell the doctors exactly what he was administering. Claiming it was a secret remedy and under patent, he had added coloring and a fragrance to disguise the gas and called it “Letheon.” But hospital officials were unimpressed and refused to use it further until Morton revealed its nature. Morton finally consented, and a few days later, Letheon—stripped of its coloring, scent, and name—made its reappearance at the hospital as plain old ether.

Although Morton spent the next two decades trying to claim credit and financial reward for discovering anesthesia, he ultimately failed, partly because Jackson and Wells were also fighting for the honor. Nevertheless, despite the contributions of various individuals over the preceding five decades—Davy, Clarke, Long, Wells, and Jackson—today Morton receives the widest recognition for being the first to demonstrate anesthesia in a way that profoundly changed the practice of medicine.

Milestone #4 Coming of age: a new anesthetic and a controversial new use

Despite the rapid and widespread use of ether, medical anesthesia had not yet truly arrived. One reason ether became popular so quickly after Morton’s demonstration was that, whether by coincidence or fate, ether was blessed with a collection of almost too-good-to-be-true properties: It was easily prepared, markedly more potent than nitrous oxide, could be administered by simply pouring a few drops onto a cloth, and its effects were quickly reversible. What’s more, ether was generally safe. Unlike nitrous oxide, it could be breathed in concentrations sufficient to produce anesthesia without risking asphyxiation. Finally, ether didn’t depress heart rate or respiration and was nontoxic to tissues. Given the inexperience of those who first administered ether to patients—not to mention the clinically non-rigorous conditions seen at your average frolic and jag—nineteenth-century medicine could not have asked for a more ideal anesthetic.

In truth, however, ether was not perfect. Its limitations included the fact that it was flammable, had an unpleasant odor, and caused nausea and vomiting in some patients. As luck would have it, within a year after Morton’s demonstration, a new anesthetic had been discovered—chloroform—and in a short time it almost completely replaced ether in the British Isles. The rapid acceptance of chloroform in England was probably due to several advantages it had over ether: It was nonexplosive, it had a less offensive odor and a speedier onset, and—perhaps most important—it was not discovered by that brash, young upstart, the United States.

Although chloroform had been first synthesized in 1831, it had not been tested in humans until someone suggested to Scottish obstetrician James Simpson that he try it as an alternative to ether. Intrigued, Simpson did what any good researcher at the time would do—he brought some home and, on September 4, 1847, shared the potent gas with a group of friends at a dinner party. When Simpson later awoke on the floor, surrounded by his other unconscious guests, he became an ardent believer in the anesthetic properties of chloroform.

But Simpson did more than discover chloroform’s anesthetic properties. Although the earlier discovery of ether had been rapidly accepted by medicine and society, its use remained highly controversial in one area: childbirth. This sensitivity was based on the religious view held by some that the pain of childbirth was God’s just punishment for the sins of Adam and Eve. The outrage awaiting those who dared evade this divine punishment had been made violently clear in Simpson’s own city of Edinburgh 250 years earlier: In 1591, Euphanie Macalyane had sought relief from her labor pains and, by order of the King of Scotland, was rewarded by being burned alive. Perhaps hoping to redeem the sins of his own ancestors, Simpson strongly advocated using anesthesia for painless childbirth, and on January 19, 1847, he became the first person to administer anesthesia—in this case, ether—to ease the delivery of a baby to a woman with a deformed pelvis. While Simpson faced angry opposition for his “satanic activities,” he countered his critics by cleverly citing passages from the Bible, including suggesting that God was the first anesthesiologist: “...and the Lord God caused a deep sleep to fall upon Adam... and he took one of his ribs, and closed up the flesh instead thereof...”

Several months later, Fanny Longfellow, wife of the famous poet Henry Wadsworth Longfellow, became the first person in the United States to receive anesthesia during labor. In a letter she wrote afterwards, one can hear her mixed feelings of guilt, pride, anger, and simple gratitude for her pioneering role:

“I am very sorry you all thought me so rash and naughty in trying the ether. Henry’s faith gave me courage and I had heard such a thing had succeeded abroad, where the surgeons extend this great blessing much more boldly and universally than our timid doctors... I feel proud to be the pioneer to less suffering for poor, weak womankind... I am glad to have lived at the time of its coming...but it is sad that one’s gratitude cannot be bestowed on worthier men than the joint discoverers, that is, men above quarreling over such a gift of God.”

Milestone #5 From lint and gloves to modern pharmacology: the birth of a science

Although the use of ether after Morton’s demonstration was rapid and widespread, anesthesia was not yet truly a science. To understand why, one need only read the words of a Professor Miller, who explained at the time that in the Royal Infirmary of Edinburgh, anesthesia was applied with “anything that will admit chloroform vapor to the mouth and nostrils.” The “anything” Miller referred to included the nearest handy object, such as “a handkerchief, towel, piece of lint, nightcap, or a sponge” with, of course, special allowance for seasonal variations: “In the winter, the glove of a clerk or onlooker has been not infrequently pressed into service...” Miller added that dosing was less than an exact science: “The object is to produce insensibility as completely and as soon as we can, and there is no saying whether this is to be accomplished by fifty drops or five hundred.”

One reason for this casual attitude was the perception that ether and chloroform were so safe. But, as one might expect, the increasing use of anesthesia was soon accompanied by more frequent deaths—sometimes suddenly and unexpectedly. In one 1847 medical report, a physician in Alabama wrote that he had been called to operate on a Negro slave suffering from tetanus and lockjaw. As the doctor heated his cautery to clean the wound, a dentist began administering ether to the Negro. But to the shock of everyone present, “In one minute, the patient was under its influence; in a quarter more he was dead—beyond all my efforts to produce artificial respiration or restore life. All present agreed that he died from inhaling the ether.”

While such reports did not seem to concern most doctors, one man who became passionate, if not obsessed, with the use and safety of anesthesia was English physician John Snow. In 1846—two years before he would begin his milestone investigations into the outbreaks of cholera in London—Snow heard about the successful use of ether for anesthesia. Fascinated, he gave up his family practice and dedicated himself to studying its chemical properties, preparation, administration, dosing, and effects.

Driven by his interest in safety, Snow investigated ether-related deaths, focusing on the role of overdosing and imprecise administration. At a time when pharmacology was in its infancy, Snow impressively calculated the solubility of ether in blood, the relationship between solubility and potency, and even the role of room temperature in how much anesthetic entered a patient’s body. Based on this work, Snow developed a device for vaporizing liquid anesthetics into gases, thereby creating a form of administration considerably more precise than, say, your average nightcap or winter glove. Snow’s improvement in the safety of anesthesia is clear from his detailed records where, among more than 800 cases in which he administered ether or chloroform to patients, he recorded only three deaths due to the use of anesthetics.

But perhaps the most influential and fascinating aspect of Snow’s work was his clinical observations of patients as they underwent anesthesia. Prior to that time, most clinicians viewed anesthesia as a kind of “on/off” switch: Ether was administered, and the patient lost consciousness; surgery was performed, and the patient re-awoke. While it was obvious that patients experienced different stages of consciousness and awareness of pain, Snow was the first to seriously examine these stages and their relevance to safe, pain-free surgery. In his monograph, “On the Inhalation of the Vapour of Ether in Surgical Operations”—published in 1847 and now considered a classic in medicine and anesthesia—he not only provided guidelines for preparing and administering anesthesia, but identified five stages that are similar to the major stages of anesthesia recognized today:

Stage 1—Patients begin to feel various changes but are still aware of where they are and can make voluntary movements.

Stage 2—Patients still have some mental functions and voluntary movements, but they are “disordered.”

Stage 3—Patients become unconscious, losing mental functions and voluntary motion, though some muscular contractions may still occur.

Stage 4—Patients are fully unconscious and immobile, with the only physical movements being the muscular motions of respiration.

Stage 5—A dangerous final stage in which respiration is “difficult, feeble, or irregular,” and “Death is imminent.”

Providing details about these stages in a way that no clinician had before, Snow noted that patients generally passed from one stage to the next at one-minute intervals and that if the inhalation was discontinued after Stage 4, the patient would remain in that stage for one or two minutes before gradually passing back through Stage 3 (3 to 4 minutes), Stage 2 (5 minutes), and Stage 1 (10 to 15 minutes). He also wrote that surgery can be performed in Stage 3 “without producing any other effect than a distortion of features... and perhaps a slight moaning.” In contrast, in Stage 4 patients “always remain perfectly passive under every kind of operation.”

In describing how different types of patients react to anesthesia, Snow wrote that in passing from Stage 1 to 2, “hysterical females sometimes sob, laugh, or scream.” He also found that a patient’s memory of the experience usually only occurred in Stage 1 and that any reported feelings during this stage “are usually agreeable—often highly so.” His guidelines included what patients should eat before anesthesia (“a sparing breakfast”), helping patients inhale ether (“The pungency of the vapor is often complained of at first... The patient must be encouraged to persevere”), and a warning that in Stage 2, some patients may become excited and suddenly want to “talk, sing, laugh, or cry.”

Snow’s paper on ether was published in 1847, but before it had been widely distributed, James Simpson had introduced chloroform, and Snow soon began investigating the effects of this new anesthetic. Within several years, Snow had become an expert and was the favorite anesthesiologist for many of London’s top surgeons. His fame peaked in 1853 and 1857, when he was asked to administer chloroform to Queen Victoria during her delivery of Prince Leopold and Princess Beatrice, respectively. “When the chloroform was commenced,” Snow wrote, “Her Majesty expressed great relief...” And after the birth, “The Queen appeared very cheerful and well, expressing herself much gratified with the effect of the chloroform.”

When Snow died in 1858, his research into the pharmacology and administration of anesthesia, along with his clinical experience and publications, had raised anesthesia to a science and made him the world’s first true anesthesiologist. While the medical profession would not fully appreciate his work for many years, he had helped put the final exclamation point on one of the greatest breakthroughs in the history of medicine.

An underlying mystery: the link between losing consciousness and raising it

It’s not hard to understand why some rank anesthesia as the top discovery in the history of medicine. After thousands of years of ineffective methods to prevent pain—from alcohol, to mandrake root, to a sharp knock on the head—the discovery of inhaled anesthetics was unlike anything previously seen or imagined. The ability to easily and completely remove patients from the awareness of pain despite the most drastic operations on the body, while allowing them to awaken minutes later with few or no aftereffects, transformed medicine and society. Patients were now willing to undergo more life-saving and life-improving procedures, and surgeons, freed from the hazards of a struggling patient, could perform more operations, while developing new techniques and life-saving treatments.

Yet as we saw in the early decades of frolics and jags, the discovery of anesthesia required a social transformation on several levels. Religious concerns had to be overcome, and physicians who believed pain was necessary for healing had to be enlightened. What’s more, a new mindset had to arise in both physicians and patients that consciousness could be safely altered in this new and unimagined way. Most interesting, the pain-killing effects of anesthesia cannot, and perhaps should not, be separated from its effects on the mind. Looking back at the stories of Humphry Davy, Crawford Long, and Horace Wells, one can’t help noticing the irony that it was the euphoric properties of these gases—and the physical injuries people sustained while enjoying them—that led to the discovery of their anesthetic properties.

Indeed, although medicine and society quickly focused on the anesthetic benefits of ether, thoughtful pioneers were immediately interested in the philosophical and metaphysical questions raised by its effects on the mind and body.

For example, John Snow, in the midst of his detailed scientific investigations, was intrigued by the comments from his patients as they awoke from anesthesia. “Some of the mental states... are highly interesting in a psychological view... The dreams often refer to early periods of life, and a great number of patients dream that they are traveling...” Snow added that even after the patient had recovered, “There is usually a degree of exhilaration, or some other altered state of feelings... The patient often expresses his gratitude to his surgeon in more ardent and glowing terms than he otherwise would do...”

Henry Bigelow, the surgeon who was present at Morton’s milestone demonstration, also seemed curious about these effects when he wrote about several dental patients he observed as they were given ether. One patient, a 16-year-old girl, had a molar extracted. Although she had “flinched and frowned” when the tooth was removed, Bigelow reported that when she awoke, “She said she had been dreaming a pleasant dream and knew nothing of the operation.” Another patient, “a stout boy of 12,” had “required a good deal of encouragement” to inhale the ether. However, the youth was successfully anesthetized, two teeth were removed, and when he awoke, “He declared ‘It was the best fun he ever saw,’ avowed his intention to come there again, and insisted upon having another tooth extracted on the spot.” A third patient had a back tooth removed, and when she awoke, “She exclaimed that ‘It was beautiful.’ She dreamed of being at home, and it seemed as if she had been gone a month.”

Not surprisingly, some of the most descriptive accounts of how anesthesia affects the mind came from artists, thinkers, and philosophers of the time. Just one day after his wife became the first woman in the United States to receive anesthesia for childbirth, Henry Wadsworth Longfellow received ether for the removal of two teeth. He later wrote that after inhaling ether, “I burst into fits of laughter. Then my brain whirled round and I seemed to soar like a lark spirally into the air. I was conscious when he took the tooth out and cried out, as if from infinitely deep caverns, ‘Stop,’ but I could not control my muscles or make any resistance and out came the tooth without pain.”

Even the earliest experimenters with nitrous oxide found that its effects raised fundamental questions about mental and sensory experiences and our limited ability to describe them. As one person who received the gas from Davy wrote, “We must either invent new terms to express these new and particular sensations, or attach new ideas to old ones, before we can communicate intelligibly with each other on the operation of this extraordinary gas.”

Perhaps Davy was wise to seek the help of artists to put such experiences to words, for one of the best descriptions of how anesthesia awakens unexplored areas of consciousness came from American writer, naturalist, and philosopher Henry David Thoreau. On May 12, 1851, Thoreau received ether prior to a tooth extraction and later wrote, “I was convinced how far asunder a man could be separated from his senses. You are told that it will make you unconscious, but no one can imagine what it is to be unconscious—how far removed from the state of consciousness and all that we call ‘this world’—until he has experienced it.... It gives you experience of an interval as between one life and another, a greater space than you ever traveled. You are a sane mind without organs... You expand like a seed in the ground. You exist in your roots, like a tree in winter. If you have an inclination to travel, take the ether; you go beyond the furthest star...”

An evolving science: from “knocking out” a patient to administering precise molecular cocktails

The evolution of anesthetic drugs has traveled a long road since the pioneering discoveries of the mid-nineteenth century. Although nitrous oxide fell out of favor after Wells’ embarrassing failure, it was revived in the 1860s for use in tooth extractions and later for some surgical procedures. Chloroform remained popular in Europe for a time, but was eventually found to have safety problems not seen with ether—including the potential to cause liver damage and cardiac arrhythmias—and its popularity soon declined. Of the three original inhaled anesthetic gases, only ether remained a standard general anesthetic until the early 1960s.

Throughout the early twentieth century, many new inhaled anesthetics were studied and introduced, including ethylene, divinyl ether, cyclopropane, and trichloroethylene, but all were limited by their flammability or toxic properties. By the 1950s, several inhaled anesthetics were made nonflammable by the addition of fluorine. While some were discontinued due to concerns about toxicity, inhaled anesthetics still in use include enflurane, isoflurane, sevoflurane, and desflurane.

Since the 1950s, anesthesia has advanced along many fronts, from the development of local, regional, and intravenous anesthesia, to technical advances in administering and monitoring anesthesia. But perhaps the most exciting advances are now coming from the frontiers of neuroscience. Although no one knows exactly how anesthetics work—any more than we understand the nature of consciousness—recent findings have provided clues into how anesthetics affect the nervous system, from their broad effects on consciousness and pain, to their microscopic and molecular actions on individual brain cells (neurons) in different areas of the brain and spinal cord.

At the broadest level, clinicians now understand that anesthesia is not simply a matter of “knocking out” a patient, but involves several key components, including: sedation (relaxation), hypnosis (unconsciousness), analgesia (lack of pain), amnesia, and immobility. In the 1990s, researchers discovered that anesthetics exert these multiple effects by acting on different parts of the nervous system. For example, the same anesthetic may cause hypnosis and amnesia by acting on neurons of the brain, while causing muscular immobility through its effects on neurons of the spinal cord. However, because no one anesthetic is ideal for producing all components of anesthesia, today’s anesthesiologists usually select a combination of anesthetics to produce the desired effects while minimizing side effects.

* * *

Since the 1990s, researchers have uncovered even more surprising insights into how anesthetics work, opening new doors to better anesthetics for the future. For example, for years it was thought that all anesthetics acted on the same general target in the brain, broadly altering the neuron’s outer membrane. However, researchers now know that there is no universal pathway that explains how all anesthetics work—or even how any one agent works. Rather, general anesthetics change the way that neurons “fire” (that is, transmit signals to each other) by altering microscopic openings on the surface of neurons, called ion channels. Because there are dozens of different types of ion channels, anesthetics can cause a variety of effects depending on which channels they act on. What’s more, since the brain has literally billions of neurons and countless interconnections, the location of the affected neuron within the brain also plays a role. Researchers now know that major areas of the brain affected by anesthetics include the thalamus (relays signals to higher areas of the brain), hypothalamus (regulates many functions, including sleep), cortex (outer layer of the brain involved in thinking and conscious behavior), and hippocampus (involved in forming memories).

Even more exciting, neuroscientists have discovered in recent years that anesthetics produce their different effects by acting on highly specific “receptors.” Receptors are tiny “gatekeeper” molecules on the surface of neurons that determine whether or not ion channels open (and hence whether the neuron will fire). Thus, when anesthetics attach to various receptors, they can affect whether or not the neuron will fire. This is a key finding because there are many types of receptors, and researchers have learned that anesthetics may exert their unique effects—unconsciousness, sleepiness, analgesia, or amnesia—by binding to different receptors in different parts of the brain.

One particular receptor thought to play a key role in how anesthetics work is called GABAA. Studies have shown that different anesthetics may cause different effects based on which region (subunit) of the GABAA receptor they attach to, where the GABAA receptor is located on the neuron, and where that neuron is located in the brain. With so many variables, it’s clear that researchers have their work cut out for them trying to sort out the many pathways by which current and future anesthetics cause their effects.

Yet that’s exactly what is most interesting and exciting about the future of anesthesia. As we gain a more precise understanding of how and where anesthetics act in the nervous system, it might be possible to develop drugs that target specific receptors and their subunits, while ignoring others, resulting in highly specific effects. In this way, customized anesthetics could be combined to create safer and more effective anesthesia. As anesthesiologist Beverley A. Orser noted in a recent article in Scientific American, the broad effects seen with current anesthetics are “unnecessary and undesirable.” However, “With a cocktail of compounds, each of which produces only one desirable end point, the future version of anesthesia care could leave a patient conversant but pain-free while having a broken limb repaired...or a hip replaced.”

* * *

And so today, 150 years after William Morton changed the practice of medicine by introducing ether to the practice of surgery, anesthesia continues to evolve and transform medicine. Equally intriguing, some believe that the insights gained from studying anesthetics might help uncover other secrets of the mind. As researchers noted in a recent article in Nature Reviews, “Anesthetic agents have been used to identify [neurons and pathways] involved in conscious perception [and the] mechanisms of sleep and arousal...” Results from ongoing studies “will probably provide further insights... which will be of great importance both for medicine and basic neuroscience.”

Such insights could include not only how new anesthetics work, but the mysteries of human consciousness, from the nature of thoughts and dreams, to the sublime sensations and perceptions described by Humphry Davy more than 200 years ago. But wherever such studies lead us—whether deep within or “beyond the furthest star”—we should never lose sight of the life-altering benefits of anesthesia. As John Snow marveled in his 1847 paper, “The constant success with which ether is capable of being employed is one of its greatest advantages... and that the patient should not only be spared the pain, but also the anticipation of it... In most cases, the patients [can now] look forward to the operation merely as a time when they would get rid of a painful joint or some other troublesome disease...”

Snow’s statement speaks to the birth of a new science and a new awareness unknown before the nineteenth century, but deeply appreciated throughout the world ever since.