Before the modern syringe existed, people delivered medicines into the body using animal bladders, quills, hollow reeds, lancets, and a surprising variety of improvised tools. The path from these crude devices to the precision syringes we know today stretched across centuries, with each era finding creative ways to get substances past the skin or into body cavities.
Animal Bladders and Clysters
The oldest injection-like devices were clysters, tools designed to flush liquids into the rectum for medicinal purposes. For roughly 2,000 years, from ancient Egypt through the 18th century, practitioners performed bowel irrigations using a bag-like syringe that used an animal bladder as its reservoir. A tube or nozzle was attached to the bladder, and squeezing the pouch forced liquid into the body. These weren’t elegant instruments, but they worked well enough to remain the standard approach for delivering medicines internally across dozens of civilizations.
Clysters were so common in Renaissance and early modern Europe that they became a cultural fixture. Wealthy patients received elaborate enemas as routine health maintenance, and the devices themselves were sometimes crafted from ornate metals. The basic principle, a squeezable reservoir pushing fluid through a tube, is essentially the same concept that modern syringes still use.
Lancets and Scarification
For getting substances through the skin, the most widespread pre-syringe method was scarification: scratching or cutting the skin open and pressing medicine directly into the wound. This technique played a central role in one of medicine’s greatest achievements, the eradication of smallpox.
During variolation, the predecessor to vaccination, a practitioner would wet a small blade called a lancet with fresh material taken from a ripe smallpox pustule. The material was then introduced just beneath the skin on the arms or legs of a healthy person, producing a mild infection that conferred immunity. When Edward Jenner developed his cowpox-based vaccine in 1796, he used the same basic tool, transferring material from cowpox lesions to a small incision on the arm of an 8-year-old boy named James Phipps. No syringe, no needle. Just a lancet and a scratch.
Lancets remained the primary vaccination tool well into the 19th century. Even as hypodermic needles became available, many practitioners stuck with the familiar scarification technique for decades.
Quills, Trocars, and Early Experiments
The idea of injecting something directly into a vein appeared in the 1650s, long before anyone had a proper syringe to do it with. In 1656, the architect and scientist Christopher Wren demonstrated intravenous injection of opium mixed with alcohol into a dog at Oxford, producing a brief period of anesthesia followed by full recovery. The tools he used were improvised: a goose quill attached to an animal bladder. The quill served as a crude needle, thin enough to enter a vein, while the bladder acted as the plunger mechanism.
By the 1830s, French physicians were trying to get morphine under the skin to treat nerve pain, but their methods were still remarkably rough. Some forced morphine paste down grooved metal rods called trocars. Others used darning needles to push small drug pellets beneath the skin, where the body would slowly absorb them. These approaches delivered medication, but with little control over dosage and significant discomfort for the patient.
The First Hollow Needle
The critical breakthrough came in 1844, when Dublin physician Francis Rynd created a device specifically for subcutaneous injection. His instrument was a slender trocar and cannula (essentially a sharp rod inside a hollow tube) that could be inserted beneath the skin. A spring mechanism retracted the inner rod, and narcotic liquid descended from a hollow handle into the puncture site as the instrument was withdrawn. Gravity, not a plunger, moved the drug into the body.
Rynd designed this tool to treat neuralgia, a condition involving severe nerve pain, and it represented a genuine leap forward. For the first time, a physician could place a measured amount of liquid medication at a specific depth beneath the skin using a purpose-built instrument. But without a plunger to control flow, dosing remained imprecise.
The First True Syringes
The modern hypodermic syringe emerged in 1853, when two physicians independently created devices with both a hollow needle and a mechanism to push fluid through it. Charles Pravaz, a French surgeon, built a metal syringe that used a screw piston to advance fluid through the needle. The screw mechanism gave doctors a way to estimate how much they were injecting, a major improvement over gravity-fed designs. Alexander Wood of Edinburgh became the first physician to use a hypodermic syringe to administer drugs; his version featured a glass barrel and a steel needle with a hard rubber hub connected by a slip joint.
European physicians initially favored Pravaz’s metal design. Professor Louis-Jules Béhier of the Paris Académie de Médecine began using the Pravaz syringe for subcutaneous medication in 1859 and popularized it across Europe. He specifically rejected Wood’s glass syringe in favor of the metal version because the screw piston provided better dose control.
From Metal to Glass to Plastic
Early metal syringes had a significant limitation: you couldn’t see the liquid inside them. Glass barrels solved that problem, but it took decades for glass to become the standard material. Béhier presented the glass Arsonval syringe in 1874, and in 1894 Parisian instrument maker Hermann Wülfing-Lüer manufactured the first graduated all-glass hypodermic syringe. Its conical tip allowed leak-free connections, a design that survives in the Luer fittings still used on syringes today, and the glass could withstand the heat of autoclaving at 120 degrees Celsius without breaking. The Lüer syringe became the template that dominated medicine for the next half-century.
Glass syringes were reusable, which meant they had to be sterilized between patients. Hospitals boiled them, steamed them, or autoclaved them with pressurized steam. A common approach in settings without an autoclave was to suspend instruments on a rack above boiling water inside a closed sterilizer, exposing them to steam for at least 10 to 20 minutes. Adding 2 percent soda to the boiling water improved effectiveness. These methods worked reasonably well, though strictly speaking they disinfected rather than fully sterilized, and the constant cycle of cleaning, reassembling, and sharpening needles was labor-intensive.
The reusable era ended in 1955, when Roehr Products introduced the Monoject, the world’s first single-use plastic syringe. Disposable syringes eliminated the sterilization problem entirely. They arrived sharp, sterile, and ready to use, then went straight into the waste bin. Within a few decades, reusable glass syringes had largely disappeared from clinical practice, surviving mainly in a few specialized applications.
Why Each Step Mattered
The progression from animal bladders to disposable plastic syringes wasn’t just about convenience. Each innovation solved a specific problem that had been injuring or killing patients. Lancets and scarification couldn’t deliver precise doses. Gravity-fed hollow needles couldn’t control flow rate. Metal syringes hid the liquid from view. Glass syringes required perfect sterilization between uses. And reusable needles dulled with each insertion, causing more tissue damage over time.
What’s remarkable is how recently all of this happened. The hypodermic syringe is barely 170 years old. For most of human medical history, getting a drug into the bloodstream meant either swallowing it, absorbing it through a wound, or pushing it into a body cavity with a squeeze of an animal bladder. The precision injection we now take for granted, a quick poke with a thin disposable needle, is one of the newest tools in the medical kit.