Ether was phased out of routine use as an anesthetic in the 1960s, when newer, non-flammable agents replaced it in operating rooms across the United States and Europe. It had been the standard general anesthetic for over a hundred years before that, stretching back to its famous public debut in 1846. The transition wasn’t sudden. It happened over roughly a decade as hospitals adopted a new class of chemicals that were safer, faster, and far less unpleasant for patients.
How Ether Became the First Real Anesthetic
On October 16, 1846, a dentist named William Morton demonstrated ether anesthesia at Massachusetts General Hospital in Boston. His device was simple: a glass globe containing a sponge soaked in ether, fitted with two openings. One was used to pour in the liquid ether, and the other connected via a wooden mouthpiece to the patient's mouth. The patient inhaled the vapor, lost consciousness, and underwent surgery without pain. It was a turning point in medicine.
For decades afterward, ether (and its rival, chloroform) was typically administered by dripping the liquid onto a cloth, compress, or wire-frame mask held over the patient's face. These "open drop" methods were crude but effective. Doctors quickly recognized the need for better control over dosing, and by the early 1900s, more sophisticated gas machines emerged with flow meters, carbon dioxide absorption systems, and calibrated vaporizers that let physicians regulate exactly how much anesthetic a patient received.
Why Ether Lasted So Long
Despite its drawbacks, ether had real advantages that kept it in operating rooms for more than a century. By the measure that mattered most, keeping patients alive, it was remarkably safe. Unlike chloroform, which could cause fatal heart rhythm problems if the dose was slightly too high, ether had a wide margin between an effective dose and a dangerous one. It was also cheap, widely available, and didn't require complex equipment. A doctor in a rural hospital or on a battlefield could administer it with little more than a cloth and a bottle.
The Problems That Eventually Ended Its Use
Ether's biggest liability was its flammability: the vapor is highly flammable and, mixed with air or oxygen, can be explosive. Operating rooms increasingly relied on electrical equipment, particularly electrocautery tools that use electrical currents to cut tissue and seal blood vessels. Combining an explosive vapor with open sparks created obvious dangers. In 1960, when ether was still the primary inhaled anesthetic in the United States, surgical fires occurred at a rate of roughly one per 100,000 anesthesia cases. That may sound rare, but even a small number of operating room explosions was unacceptable once alternatives existed.
Beyond the fire risk, ether was genuinely miserable for patients. It has a strong, unpleasant smell and irritates the airways, so induction was slow and uncomfortable compared with newer drugs. Recovery was often brutal: up to 85% of patients experienced nausea and vomiting after waking, along with headaches. By comparison, modern anesthetic agents cause post-operative nausea in a small fraction of patients and allow people to wake up far more quickly and comfortably.
What Replaced Ether in the 1960s
The agents that pushed ether out of practice belong to a family of halogenated, fluorine-containing compounds. Halothane came first and gained widespread adoption through the late 1950s and 1960s. Enflurane, isoflurane, and eventually sevoflurane followed in subsequent decades. These newer agents share a key property: they are not flammable. That single change eliminated the risk of operating room explosions and allowed surgeons to freely use electrocautery and laser instruments without worrying about igniting the anesthetic gas.
The newer agents also worked faster, wore off more quickly, and caused far less nausea. They produced less irritation to the lungs and airways, and they caused less organ damage over the course of a procedure. Delivering them required more sophisticated vaporizers and monitoring systems than the old open-drop ether method, but by the mid-20th century, anesthesia machines had already evolved into complex devices with calibrated flowmeters, ventilators, and alarm systems. The infrastructure was ready for the switch.
The Role of Anesthesiology’s Growth as a Specialty
The decline of ether also coincided with anesthesiology maturing into a full medical specialty. Before and during World War II, anesthesia was often administered by nurses or general practitioners with minimal specialized training. After the war, dedicated anesthesiology residencies expanded, subspecialties like pediatric and obstetric anesthesia developed, and the field’s focus broadened from simply managing pain during surgery to optimizing the entire surgical experience, including recovery, infection control, and patient comfort.
This shift in priorities made ether's drawbacks harder to justify. A field increasingly concerned with post-operative outcomes and patient satisfaction wasn't going to tolerate an agent that left up to 85% of patients vomiting. The professionalization of anesthesiology created demand for better drugs, and the pharmaceutical industry delivered them.
Ether’s Lingering Use in the Developing World
While ether disappeared from hospitals in the United States and Europe by the 1970s, it persisted longer in parts of the developing world. The same qualities that made it popular in the 1800s (low cost, stability without refrigeration, and the ability to administer it without expensive equipment) made it practical in resource-limited settings where modern vaporizers and monitoring systems weren't available. Some researchers have even argued for reconsidering ether in settings where the alternative is no anesthesia at all, though this remains a niche position. In well-equipped hospitals worldwide, ether has been replaced completely by modern inhaled agents and intravenous drugs that can induce unconsciousness in under 30 seconds.

