How Did They Test for Pregnancy in the 1800s?

In the 1800s, there was no reliable test for pregnancy. Doctors and women alike relied on a combination of reported symptoms, physical exams, and a few crude urine observations, none of which could confirm pregnancy with certainty until the baby itself could be felt moving. The first true laboratory pregnancy test didn’t arrive until the late 1920s, meaning the entire 19th century operated on educated guesswork.

Waiting and Watching: Self-Diagnosis at Home

For most women in the 1800s, the first and most practical method of detecting pregnancy was simply paying attention to their own bodies. A missed menstrual period was the earliest clue, followed by nausea, breast tenderness, and fatigue. As the NIH notes in its history of pregnancy testing, “the best method for diagnosing pregnancy remained careful observation of their own physical signs and symptoms (such as morning sickness).” There was no drugstore test to settle the question. Women tracked their cycles, noticed changes, and waited.

This self-observation carried real uncertainty. A missed period could mean pregnancy, but it could also result from illness, stress, malnutrition, or dozens of other causes. Many women simply couldn’t know for sure until their belly began to grow or the baby started to move.

What Doctors Looked For

When a woman did see a physician, the exam was largely observational. A Scottish physician named Lyall cataloged the signs doctors used to diagnose pregnancy and estimate its timing: unusual sensations a woman reported near the time of conception, the cessation of menstruation, the date of a single act of intercourse (if known), and the moment of “quickening,” or first felt fetal movement. Even with all of these data points, as one historical account put it, “pregnancy was frustratingly ambiguous, difficult to diagnose and date.” Doctors frequently disagreed with one another about whether a woman was pregnant at all.

Vaginal exams were performed, sometimes repeatedly, but they often produced inconclusive results. In one documented case, a woman was “frequently examined per vaginam, and at various periods,” yet her doctors remained divided on whether she was pregnant. Without imaging or blood tests, the physical exam could only reveal so much, especially in early pregnancy.

Chadwick’s Sign

One physical clue that gained recognition during the century was a bluish or purplish discoloration of the vulva, vagina, and cervix. This color change results from increased blood flow to the pelvic area during pregnancy. A French doctor named Étienne Joseph Jacquemin first linked this sign to pregnancy in the early 1800s while treating imprisoned sex workers. Decades later, in 1886, an American doctor named James Read Chadwick presented the finding to the American Gynecological Society, and the observation became known as Chadwick’s sign. It was considered a useful early indicator, but it wasn’t visible in every woman and required a pelvic exam to detect.

The Kyestein Urine Test

Urine-based pregnancy detection has ancient roots, with medieval practitioners claiming they could diagnose pregnancy by the color and clarity of a woman’s urine. By the 19th century, French doctors had developed a slightly more systematic version called the “kyestein pellicle” test. The idea was simple: collect a sample of the woman’s urine, leave it in a vessel for several days, and watch for a sticky film to form on the surface. If the film appeared, the woman was considered pregnant.

The test had a certain logic to it. Pregnancy does change urine composition, and various theories circulated about “identifiable crystals or bacteria” that might be present. But the kyestein pellicle was unreliable. The film could form for other reasons, and it often failed to appear in women who were genuinely pregnant. It represented the best attempt at a chemical-style test the century had to offer, and it wasn’t very good.

Quickening: The Only “Proof”

For most of the 1800s, the single most trusted confirmation of pregnancy was quickening, the moment a woman first felt the baby move inside her. This typically happens between 16 and 25 weeks of pregnancy, meaning women often couldn’t confirm what they suspected for four to six months.

Quickening carried weight far beyond the doctor’s office. In Western legal tradition stretching back centuries, it was the point at which a fetus was considered a living human being with legal protection. English legal authorities from Henry de Bracton in the 13th century through William Blackstone in the 18th century all treated quickening as the dividing line. An abortion performed before quickening was treated as a lesser offense, or no offense at all, while one performed after quickening could be punished as homicide. This legal framework persisted well into the 19th century in both Britain and the United States, making quickening not just a medical milestone but a legal one.

The concept also had deep theological roots. St. Augustine and St. Thomas Aquinas both pointed to quickening as the moment of “ensoulment,” when the life in the womb became fully human. For everyday women, though, the significance was more personal and practical: quickening was the moment the pregnancy became real and undeniable.

Listening for the Heartbeat

One genuine technological advance arrived early in the century. In 1818, shortly after the invention of the stethoscope, doctors discovered they could detect the fetal heartbeat by placing the instrument on a pregnant woman’s abdomen and listening. This was a meaningful breakthrough. For the first time, a doctor could confirm the presence of a living fetus without relying on the mother’s report of movement.

The technique spread through obstetric practice in Dublin and Edinburgh during the 1820s and 1830s. It was especially valuable in later pregnancy, but it couldn’t detect a heartbeat in the early months, and it required skill and a quiet room. Still, it gave doctors one objective piece of evidence they’d never had before, and it began to chip away at the dominance of quickening as the gold standard for confirming pregnancy.

When Pregnancy Looked Like Something Else

One of the most serious problems with 19th-century pregnancy diagnosis was that a growing abdomen didn’t always mean a baby. Dropsy, a condition involving fluid buildup in the body (now called edema), could cause dramatic abdominal swelling that looked nearly identical to pregnancy. Ovarian tumors and cysts produced similar swelling. Without imaging, doctors had to rely on touch, patient history, and guesswork to tell the difference.

Misdiagnosis went in both directions. Women with dropsy or tumors were sometimes treated as pregnant for months, only to discover the swelling had a very different cause. And women who were genuinely pregnant could be told they had a tumor. One famous case involved Jane Todd Crawford, whose large abdominal mass was initially mistaken for pregnancy before a surgeon determined it was an ovarian tumor and performed a pioneering operation to remove it in 1809. Doctors sometimes tried to rule out pregnancy by confirming a patient was a virgin, but even this approach depended entirely on the woman’s own testimony.

Why It Took So Long to Improve

The core problem throughout the 1800s was biological: pregnancy produces hormonal changes that are invisible to the naked eye, and the tools to detect those changes didn’t exist yet. The hormone responsible for a positive result on modern pregnancy tests, human chorionic gonadotropin (hCG), wasn’t identified until the early 20th century. The first true laboratory pregnancy test, which involved injecting a woman’s urine into animals and observing their reproductive response, didn’t emerge until 1927.

Until that point, every method available was either subjective (how does the woman feel?), unreliable (did a film form on the urine?), or only useful well into the pregnancy (can we hear a heartbeat or feel the baby move?). For the millions of women who lived through the 19th century, confirming a pregnancy in its early weeks was simply not possible. The answer, more often than not, was to wait and see.