How Did They Test for Pregnancy in the 1700s?

In the 1700s, there was no reliable way to confirm a pregnancy in its early months. Doctors and midwives openly admitted this. The most trusted confirmation was “quickening,” the moment a woman first felt the baby move, which typically happened around 14 to 18 weeks. Everything before that point relied on a mix of urine inspection, folk tests, and educated guesswork that even the leading medical minds of the era acknowledged was unreliable.

Doctors Admitted They Had No Good Answer

What’s striking about 18th-century medical literature is how candidly practitioners confessed their ignorance. John Aitken, who wrote a widely circulated treatise on pregnancy, stated plainly that “the early state of pregnancy, or its existence for the first three or four months, is not always easily detected.” Hendrik van Deventer, a hugely influential figure in obstetrics, declined even to discuss early pregnancy detection in his book, writing that “the Signs of Impregnation are uncertain, and fallible in the first Months, wherefore we shall not give them a Place in this Book.”

William Smellie, one of the most celebrated obstetricians in Scotland, complained that the process of conception was “altogether uncertain” because opportunities to study pregnant women through dissection almost never arose. Another professor of midwifery, Alexander Hamilton, conceded it was “exceedingly difficult to ascertain the proportional growth or progress of the foetus in the womb.” In short, the experts of the day knew they were guessing.

Urine Inspection Was the Oldest Tool

The most common diagnostic method carried over from centuries of tradition was uroscopy: visually examining a woman’s urine. Practitioners looked at color, clarity, and whether the surface was foamy. Under medieval guidelines that persisted well into the 1700s, a pregnant woman’s urine was expected to appear clear, light lemon in color, tending toward whitish, with a layer of foam on top.

These “piss prophets,” as skeptics called them, had no understanding of the hormonal changes that actually alter urine during pregnancy. They were simply pattern-matching based on centuries of accumulated (and often contradictory) observations. The method was popular enough to be widespread, but not accurate enough to be trusted on its own.

Folk Tests Using Urine and Household Items

Beyond simple visual inspection, a whole catalog of folk tests promised to reveal pregnancy through what looked like chemical reactions. One involved mixing urine with wine and watching what happened. This may have held a sliver of validity: the acids in wine can curdle certain proteins that appear at higher concentrations in pregnant women’s urine. But nobody at the time understood why it sometimes seemed to work.

Other tests were more creative. A needle placed in a woman’s urine was said to turn red or black if she was pregnant. By the 16th century, “needle” had been misread as “nettle” in copied manuscripts, so a parallel tradition developed where a nettle leaf left in urine overnight was checked for red spots the next morning. Another method involved boiling urine and looking for white streaks, which supposedly confirmed pregnancy. One 1656 midwifery manual, the Compleat Midwives Practice, suggested that if a woman’s urine was sealed in a container for several days, “certain live things” would eventually become visible in it.

A test depicted in a 17th-century painting involved dipping a ribbon in urine and then burning it, with the smell or appearance of the burned ribbon supposedly providing the answer. And a method borrowed from ancient Egypt, still referenced in later centuries, involved urinating on barley and wheat seeds. If the barley sprouted first, it predicted a boy. Wheat sprouting first meant a girl. If nothing sprouted, the woman wasn’t pregnant.

None of these tests were standardized or consistently applied. They circulated through midwifery manuals, oral tradition, and folk knowledge, with different regions and practitioners favoring different methods.

Quickening Was the Real Confirmation

The moment that actually mattered, both medically and legally, was quickening. This is the first time a pregnant woman feels the baby move inside her, and it typically occurs between 14 and 18 weeks of gestation. English common law between the 16th and 18th centuries designated a pregnancy as “official” only at quickening. Before that point, a woman might suspect she was pregnant based on missed periods, nausea, or breast changes, but none of these were considered proof.

Quickening’s significance went beyond the practical. It was widely understood as the moment the fetus gained life or a soul. This made it the dividing line for legal questions about inheritance, criminal liability, and the status of the pregnancy itself. A woman’s own testimony that she had felt movement was the primary evidence, making early pregnancy diagnosis fundamentally a matter of self-reporting rather than external testing.

Who Did the Diagnosing

For most women in the 1700s, pregnancy was managed entirely by midwives. These were experienced women who had typically given birth themselves and learned their skills through one-on-one apprenticeships with senior midwives. Their diagnostic approach drew heavily on embodied knowledge: recognizing the physical signs of pregnancy through years of hands-on experience rather than formal medical theory.

This was also the century when “man-midwives,” the forerunners of modern obstetricians, began gaining influence, particularly among wealthy families. These male practitioners based their approach on anatomy and emerging science rather than traditional intuition. But their advantage was mostly in managing complicated deliveries with instruments, not in diagnosing early pregnancy. Both groups were equally limited when it came to confirming pregnancy before quickening.

The tension between these two groups was real. One experienced midwife, Sarah Stephen, noted in 1795 that in over 30 years of practice she had encountered only eight labors requiring a surgeon’s help. She expressed bewilderment at the growing insistence that medical intervention was necessary, wondering why women should “require so often what they scarcely ever require… until man had found out many inventions.”

Why Early Pregnancy Was So Hard to Detect

The fundamental problem was a gap in biological knowledge. Nobody in the 1700s knew that pregnancy produces a specific hormone (what we now call hCG) that enters the bloodstream and urine within days of a fertilized egg implanting. Modern pregnancy tests detect this single molecule with near-perfect accuracy. Without that knowledge, 18th-century practitioners were left interpreting indirect clues: the color of urine, whether seeds sprouted, how a ribbon smelled when burned.

It would take until 1927 for scientists to discover that injecting a pregnant woman’s urine into mice caused visible changes in their ovaries, the first test based on actual pregnancy hormones. The home pregnancy test didn’t arrive until 1976. For the women of the 1700s, certainty about pregnancy came only when the baby made itself known by moving, and everything before that was an educated guess at best.