When Did Doctors Start Wearing Gloves: The History

Doctors started wearing rubber gloves during surgery in 1890, at Johns Hopkins Hospital in Baltimore. The story behind it is surprisingly personal: the gloves weren’t invented to protect patients from infection. They were created to protect a nurse’s hands from harsh chemical burns.

Surgery Before Gloves

For most of medical history, surgeons operated bare-handed. Before the 1840s, they didn’t even wash their hands, instruments went unsterilized, and, with no anesthesia yet available, patients were physically held down on the table. The concept that invisible germs caused infection simply didn’t exist yet.

That began to change in the 1860s, when British surgeon Joseph Lister started treating wounds with dressings soaked in carbolic acid, a compound already used to disinfect sewage. Lister introduced hand washing, instrument sterilization, and the spraying of a carbolic acid mist into the air of the operating room during surgery. These antiseptic methods dramatically reduced post-surgical infections, but they came at a cost: the harsh disinfecting solutions, particularly mercuric chloride, which hospitals later adopted for scrubbing hands and instruments, destroyed the skin of anyone who had prolonged contact with them.

A Nurse’s Skin Problem Changed Everything

In the winter of 1889–1890, Caroline Hampton, the head operating room nurse at Johns Hopkins, told the hospital’s surgeon-in-chief, William Stewart Halsted, that the mercuric chloride solution used to scrub hands and instruments was causing severe dermatitis on her arms and hands. Halsted later wrote that because she was “an unusually efficient woman,” he wanted to find a solution rather than lose her.

On a trip to New York, Halsted asked the Goodyear Rubber Company to manufacture two pairs of thin rubber gloves with gauntlets as an experiment. The gloves worked so well that he ordered more, and in 1890 Johns Hopkins became the first hospital to institute the use of rubber surgical gloves. Halsted and Hampton, incidentally, married later that same year.

The adoption was gradual, though. At first, only the nurses and assistants who handled instruments wore gloves. Surgeons themselves wore them only for specific procedures, such as exploratory incisions into joints. It took years before gloves became standard for everyone in the operating room.

From Skin Protection to Infection Control

The real turning point came when doctors noticed something unexpected: gloves weren’t just protecting hands from chemicals. They were preventing infections. Joseph Bloodgood, a protégé of Halsted who arrived at Hopkins in 1892, began wearing gloves during all surgeries by 1896. In 1899, he published a landmark report on more than 450 hernia operations showing that infection rates fell nearly to zero once surgeons wore gloves.

That finding reframed surgical gloves entirely. What started as a comfort measure became a cornerstone of sterile technique. Through the early 1900s, rubber gloves spread to operating rooms worldwide, though the gloves of that era were thick, reusable, and had to be sterilized between patients.

Disposable Gloves and Mass Adoption

For decades, surgical gloves remained expensive, reusable items that were washed and re-sterilized. That changed in 1964, when the Ansell Rubber Company manufactured the first disposable latex gloves. The breakthrough that made mass production affordable was a new sterilization method: gamma radiation. Compared to previous sterilization techniques, gamma radiation was far cheaper at scale, making single-use gloves economically viable for the first time.

Disposable gloves transformed medical practice beyond the operating room. Suddenly it was practical for doctors, nurses, and lab technicians to wear gloves during routine patient care, blood draws, and examinations, not just surgery.

The AIDS Crisis Made Gloves Mandatory

Even with disposable gloves widely available, many doctors still didn’t wear them for everyday patient contact through the 1970s and into the 1980s. Gloves were seen as a surgical tool, not a routine precaution. The HIV/AIDS epidemic changed that almost overnight.

As awareness grew that bloodborne viruses like HIV and hepatitis B could spread through contact with infected blood and body fluids, the US Centers for Disease Control introduced “universal precautions” in the mid-1980s: the principle that all blood and certain body fluids should be treated as potentially infectious. In December 1991, OSHA finalized its Bloodborne Pathogens Standard, which took effect in March 1992. The rule required employers to provide personal protective equipment, including gloves, for any worker with occupational exposure to blood or other infectious materials. Glove use was no longer optional or left to a doctor’s judgment. It was a legal requirement.

From Latex to Nitrile

Latex dominated the medical glove market for decades after disposables became available, but a new problem emerged: allergies. Both healthcare workers and patients developed sensitivities to latex proteins, sometimes severe enough to cause anaphylaxis. In the mid-1990s, nitrile gloves entered the market. Originally developed for better chemical resistance rather than as a latex replacement, nitrile quickly became the preferred material in healthcare because it eliminated the latex-allergy risk while offering equal or better durability. Today, nitrile gloves are the standard in most hospitals and clinics.

The full timeline spans just over a century. From Caroline Hampton’s irritated hands in 1890 to OSHA’s federal mandate in 1992, gloves went from an improvised solution for one nurse to a non-negotiable part of every medical encounter.