How to Measure Water Resistivity: Step by Step

Water resistivity is measured using a conductivity meter or resistivity meter that passes a small electrical current between two electrodes submerged in the water sample. The instrument reads conductivity (how easily current flows), and since resistivity is simply the inverse of conductivity, most meters display both values automatically. The theoretical maximum for perfectly pure water is 18.18 MΩ·cm at 25°C (commonly rounded to 18.2 MΩ·cm), so any reading below that tells you ions or contaminants are present.

What Resistivity Actually Tells You

Pure water contains very few ions, so it resists electrical current strongly. When salts, minerals, or dissolved gases are present, they break into charged particles that carry current, dropping the resistivity. This makes resistivity a fast, reliable proxy for water purity: the higher the number, the cleaner the water.

Resistivity and conductivity are reciprocals of each other. If you know one, you know the other. The formula is straightforward: resistivity (in Ω·cm) equals 1 divided by conductivity (in S/cm). A conductivity reading of 0.055 µS/cm, for example, translates to about 18.2 MΩ·cm, the benchmark for ultrapure water at 25°C. Conductivity is typically reported in microsiemens per centimeter (µS/cm), while resistivity uses megohm-centimeters (MΩ·cm) for high-purity water or ohm-meters (Ω·m) for natural water sources.
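
Because the conversion is a pure reciprocal, it is easy to script. Here is a minimal sketch in Python; the function name is ours, not from any instrument library. Note that the 10⁻⁶ factors in µS/cm and MΩ·cm cancel, so the two units are direct reciprocals of each other:

```python
def us_cm_to_mohm_cm(conductivity_us_cm: float) -> float:
    """Convert conductivity in µS/cm to resistivity in MΩ·cm.

    Resistivity (Ω·cm) = 1 / conductivity (S/cm). Since 1 µS/cm = 1e-6 S/cm
    and 1 MΩ·cm = 1e6 Ω·cm, the scale factors cancel and
    MΩ·cm = 1 / (µS/cm) directly.
    """
    return 1.0 / conductivity_us_cm

print(us_cm_to_mohm_cm(0.055))  # ≈ 18.18 MΩ·cm, the ultrapure benchmark
```

The same function run in reverse (resistivity in, conductivity out) is identical, which is why meters can display both values from a single measurement.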

Equipment You Need

The core instrument is a conductivity/resistivity meter paired with a probe (also called a conductivity cell). Benchtop meters offer higher accuracy for lab work, while handheld meters work well for field sampling. Many modern meters display resistivity directly, but even if yours only shows conductivity, the conversion is a single division.

The most important specification on your probe is its cell constant, labeled K. This value determines the range of conductivity the probe can accurately measure. Choosing the wrong cell constant is one of the most common sources of error:

  • K = 0.01: Best for ultrapure water below 1 µS/cm (above 1 MΩ·cm)
  • K = 0.1: Suited for high-purity water in the 0.5 to 200 µS/cm range
  • K = 1.0: General-purpose, covering 10 to 2,000 µS/cm
  • K = 10: Designed for high-conductivity samples like brackish or saline water, from 1 to 200 mS/cm

If you’re measuring ultrapure or deionized water, you need a low cell constant (0.01 or 0.1). Using a general-purpose K=1.0 probe on ultrapure water will give unreliable readings because the signal is too small for the probe geometry to detect accurately. Your meter also needs to support selectable cell constants if you’re using anything other than K=1.0.
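
The cell-constant guidelines above can be captured as a simple lookup. This is an illustrative sketch only; the boundaries overlap between probes in practice, and manufacturers publish exact ranges for their cells, so treat the thresholds below as starting points rather than hard rules:

```python
def suggest_cell_constant(expected_us_cm: float) -> float:
    """Suggest a probe cell constant (K) for an expected conductivity (µS/cm).

    Thresholds mirror the ranges listed above; real probes have overlapping
    ranges, so check the manufacturer's specification before buying.
    """
    if expected_us_cm < 1:
        return 0.01   # ultrapure water, below 1 µS/cm
    if expected_us_cm <= 200:
        return 0.1    # high-purity water, 0.5 to 200 µS/cm
    if expected_us_cm <= 2000:
        return 1.0    # general-purpose, 10 to 2,000 µS/cm
    return 10.0       # brackish or saline, up to ~200 mS/cm

print(suggest_cell_constant(0.055))  # 0.01, for ultrapure water
```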

Step-by-Step Measurement Process

Start by calibrating your meter with a conductivity standard solution that falls within your expected measurement range. For high-purity water work, a 10 µS/cm or 100 µS/cm standard is typical. Follow your meter’s calibration procedure, which usually involves immersing the probe in the standard and pressing a calibration button.

Rinse the probe with deionized water before inserting it into your sample. For a static (non-flowing) measurement, submerge the probe so the electrodes are fully covered, with no air bubbles trapped between them. Let the reading stabilize, which usually takes 15 to 30 seconds. For in-line measurements on a flowing system like a water purification loop, the probe installs directly into the pipe or a flow cell, and the meter reads continuously.
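
If you are logging readings programmatically rather than watching the display, "let the reading stabilize" can be expressed as a simple windowed check. The sketch below is a hypothetical stand-in for a meter's built-in stability indicator, with made-up sample values; the window size and tolerance are assumptions you would tune to your instrument:

```python
def is_stable(readings, window=5, tolerance=0.01):
    """Return True when the last `window` readings vary by less than
    `tolerance` (as a fraction of their mean) -- a rough software
    equivalent of waiting 15-30 seconds for the display to settle."""
    if len(readings) < window:
        return False
    recent = readings[-window:]
    mean = sum(recent) / window
    return (max(recent) - min(recent)) <= tolerance * mean

# Simulated resistivity trace (MΩ·cm) settling toward 18.2 over ~30 s
samples = [17.1, 17.8, 18.1, 18.15, 18.18, 18.19, 18.20, 18.20]
print(is_stable(samples))  # True once the trace flattens
```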

These two approaches, static sampling and continuous in-line monitoring, are formally described in ASTM D1125, the standard test methods for electrical conductivity and resistivity of water. Pharmaceutical plants and semiconductor fabrication facilities typically use the in-line method for real-time purity monitoring, while environmental testing and spot checks use the static method.

Why Temperature Correction Matters

Temperature has a dramatic effect on water resistivity. At 25°C, ultrapure water reads 18.2 MΩ·cm. At 10°C, that same water can read above 40 MΩ·cm, more than double, not because it got purer but because cold water conducts less. If you don’t correct for temperature, your readings will be misleading.

The standard reference temperature is 25°C. Most quality meters include an automatic temperature compensation (ATC) feature: a built-in temperature sensor adjusts the displayed value to what it would be at 25°C. If your meter lacks ATC, or if you need to convert readings manually, Arps' formula provides a good approximation. In Celsius, multiply your measured resistivity by (T₁ + 21.5) divided by (T₂ + 21.5), where T₁ is your measurement temperature and T₂ is 25°C. For Fahrenheit, replace 21.5 with 6.77 and use 77°F as the reference.
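
The Celsius form of the correction is a one-line calculation. A minimal sketch, using an illustrative groundwater reading (the function name and sample values are ours):

```python
def correct_to_25c(resistivity: float, temp_c: float) -> float:
    """Correct a resistivity reading taken at temp_c (°C) to the 25°C
    reference using Arps' approximation:
    R_25 = R_measured * (T_measured + 21.5) / (25 + 21.5)."""
    return resistivity * (temp_c + 21.5) / (25.0 + 21.5)

# Groundwater measured at 25 Ω·m in a cool 15°C sample: the corrected
# value at 25°C is lower, because warmer water conducts more readily.
print(correct_to_25c(25.0, 15.0))  # ≈ 19.62 Ω·m
```

Note the direction of the correction: a reading taken below 25°C is adjusted downward, and one taken above 25°C is adjusted upward.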

Always record both the raw reading and the temperature at the time of measurement, even when using automatic compensation. This gives you a way to verify the correction later.

Common Sources of Error

The biggest enemy of accurate high-purity water measurement is atmospheric carbon dioxide. When ultrapure water is exposed to air, CO₂ dissolves into it and forms carbonic acid, which dissociates into ions. Those extra charge carriers can increase conductivity by a factor of three or more in low-salinity water, which means your resistivity reading drops significantly within minutes of exposure. A sample of 18.2 MΩ·cm water left in an open beaker can fall to 1 MΩ·cm or lower surprisingly fast.

To minimize CO₂ contamination, measure samples quickly after collection, keep containers sealed, and use in-line measurement whenever possible. Flow cells that isolate the water from the atmosphere give the most accurate results for high-purity applications.

Other common pitfalls include dirty or fouled electrodes (clean them regularly with mild acid or the manufacturer’s recommended solution), air bubbles trapped on the probe surface, and using an uncalibrated meter. Probe cables that are damaged or excessively long can also introduce electrical noise, particularly at the very low conductivity levels found in ultrapure water.

Interpreting Your Results

Where your reading falls tells you a lot about your water quality. Here are some reference points at 25°C:

  • 18.2 MΩ·cm: Theoretical maximum for pure water. Ultrapure (Type I) lab water reaches 18.0 to 18.2 MΩ·cm.
  • 1 to 15 MΩ·cm: High-purity water suitable for many laboratory and industrial processes (Type II and III grades).
  • 0.01 to 1 MΩ·cm: Deionized or distilled water that may have picked up some contamination.
  • Below 0.01 MΩ·cm (below 10,000 Ω·cm): Tap water, natural freshwater, or process water with significant dissolved solids.
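
The reference points above translate directly into a rough classifier. This is an illustrative sketch, not a standards-compliant grading tool; exact grade boundaries vary between standards (ASTM, ISO, CLSI), so use it only for quick triage:

```python
def classify_water(resistivity_mohm_cm: float) -> str:
    """Rough purity classification at 25°C, mirroring the reference
    points listed above. Grade cutoffs vary by standard; treat these
    as illustrative, not authoritative."""
    if resistivity_mohm_cm >= 18.0:
        return "ultrapure (Type I)"
    if resistivity_mohm_cm >= 1.0:
        return "high-purity (Type II/III)"
    if resistivity_mohm_cm >= 0.01:
        return "deionized/distilled, possibly contaminated"
    return "tap, natural, or process water"

print(classify_water(18.2))   # ultrapure (Type I)
print(classify_water(0.005))  # tap, natural, or process water
```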

For natural water and groundwater, resistivity depends heavily on total dissolved solids and the specific ions present. Seawater has a resistivity around 0.2 Ω·m. Fresh groundwater typically falls between 10 and 100 Ω·m. These values are useful in environmental monitoring, geophysical surveys, and assessing water treatment performance.

If your readings are lower than expected, the most likely culprits are dissolved CO₂ from air exposure, contaminated sample containers, a probe that needs cleaning, or the wrong cell constant for your range. Recheck each of these before assuming the water itself is impure.