Conductivity is measured by passing a small electrical current between two electrodes submerged in a liquid and recording how easily that current flows. The result is expressed in siemens per meter (S/m), though most portable meters display readings in microsiemens per centimeter (µS/cm) or millisiemens per centimeter (mS/cm). The entire process takes seconds with a handheld meter, but getting an accurate reading depends on calibration, temperature, and choosing the right sensor for your liquid.
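If you need to move between these units, the conversion is a fixed power of ten: 1 S/m equals 10 mS/cm, or 10,000 µS/cm. A minimal Python sketch of the arithmetic (the function names are just illustrative):

    # 1 S/m = 10 mS/cm = 10,000 uS/cm

    def us_cm_to_s_m(value_us_cm):
        """Convert microsiemens per centimeter to siemens per meter."""
        return value_us_cm / 10_000

    def s_m_to_us_cm(value_s_m):
        """Convert siemens per meter to microsiemens per centimeter."""
        return value_s_m * 10_000

    print(us_cm_to_s_m(1412))   # 0.1412 S/m
    print(s_m_to_us_cm(5.0))    # 50000.0 uS/cm, roughly seawater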
What Conductivity Actually Tells You
Conductivity measures how well a solution carries an electric current, which depends almost entirely on the concentration of dissolved ions. Pure water conducts very little electricity on its own. Add salt, minerals, or other dissolved solids and the conductivity climbs. This makes conductivity a fast, indirect way to gauge what’s dissolved in water without sending a sample to a lab.
The practical range is enormous. Good drinking water typically falls below 800 µS/cm. Freshwater sources generally stay below 1,500 µS/cm. Seawater sits around 50,000 µS/cm. Ultrapure water used in laboratories and semiconductor manufacturing has conductivity so low it’s measured in fractions of a µS/cm. Knowing where your expected reading falls on this scale helps you pick the right meter and calibration standard.
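To make that scale concrete, here is a small illustrative Python helper that maps a reading onto the ranges above. The labels and cutoffs simply restate the figures in this section, so treat it as a rough guide rather than a formal classification:

    def classify_water(ec_us_cm):
        """Rough category for a conductivity reading, per the ranges above."""
        if ec_us_cm < 1:
            return "ultrapure water range"
        if ec_us_cm <= 800:
            return "typical drinking water range"
        if ec_us_cm <= 1_500:
            return "typical freshwater range"
        if ec_us_cm < 40_000:
            return "between freshwater and seawater ranges"
        return "seawater range (~50,000 uS/cm)"

    print(classify_water(450))     # typical drinking water range
    print(classify_water(52_000))  # seawater range (~50,000 uS/cm)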
Equipment You Need
The most common tool is a conductivity meter with a probe that holds two or four metal electrodes. Two-electrode probes are simpler and work well for low to moderate conductivity ranges. Four-electrode probes reduce polarization, a phenomenon where ions build up on the electrode surfaces and skew the reading, which makes them better suited to high-conductivity solutions.
For viscous fluids or liquids carrying suspended solids, a toroidal (inductive) sensor is the better choice. Instead of exposed electrodes, toroidal sensors use two parallel ring-shaped coils. One coil generates a magnetic field that induces a current in the liquid, and the second coil measures that current. Because no metal electrode contacts the liquid directly, these sensors resist fouling and need less maintenance in harsh or dirty environments. Some versions are designed as flow-through sensors installed directly in a pipe, measuring conductivity without interrupting the process stream.
Step-by-Step Measurement Process
Start by calibrating your meter. Most conductivity meters are calibrated using a potassium chloride (KCl) solution with a known conductivity value. A common standard is 0.01 molar KCl, which produces a conductivity of 1,412 µS/cm at 25°C. You can also buy standards at other values (84 µS/cm, 12,880 µS/cm) to match the range you expect to measure. Rinse the probe with distilled or deionized water, immerse it in the standard solution, and let the meter adjust its reading to match the known value.
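Under the hood, a single-point calibration boils down to computing a scale factor that maps the probe’s raw reading onto the standard’s known value, then applying that factor to later samples. Your meter does this internally, but the arithmetic looks roughly like this sketch (variable names are illustrative):

    KCL_STANDARD_US_CM = 1412.0   # 0.01 molar KCl at 25 C

    def calibration_factor(raw_reading_us_cm, standard_us_cm=KCL_STANDARD_US_CM):
        """Scale factor that maps a raw probe reading onto the known standard."""
        return standard_us_cm / raw_reading_us_cm

    def corrected(raw_reading_us_cm, factor):
        """Apply the calibration factor to a raw reading."""
        return raw_reading_us_cm * factor

    # Suppose the probe reads 1,380 uS/cm in the 1,412 uS/cm standard:
    f = calibration_factor(1380.0)
    print(corrected(1380.0, f))   # 1412.0 -- the standard, by construction
    print(corrected(950.0, f))    # a later sample reading, rescaled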
Once calibrated, rinse the probe again and place it in your sample. Submerge the electrodes completely and hold the probe still, avoiding contact with the walls or bottom of the container. Wait for the reading to stabilize, which usually takes a few seconds. Record the value along with the sample temperature, since temperature has a significant effect on conductivity.
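If you log readings yourself, say from a probe with a digital output, a simple way to automate the “wait until stable” step is to poll until two successive readings agree within a tolerance. A sketch of that idea, with read_probe standing in for whatever your hardware interface actually provides:

    import time

    def wait_for_stable(read_probe, tolerance_us_cm=1.0, interval_s=1.0, max_tries=30):
        """Poll until two successive readings differ by less than the tolerance."""
        last = read_probe()
        for _ in range(max_tries):
            time.sleep(interval_s)
            current = read_probe()
            if abs(current - last) < tolerance_us_cm:
                return current
            last = current
        raise TimeoutError("Reading did not stabilize")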
Why Temperature Matters So Much
Conductivity readings change with temperature because warmer water allows ions to move more freely. The standard reference temperature is 25°C. Most meters apply automatic temperature compensation (ATC), using a built-in temperature sensor to adjust the raw reading back to what it would be at 25°C. The typical compensation factor assumes conductivity changes by about 1.9% for every degree Celsius, which works reasonably well for many natural waters.
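In practice, that linear compensation amounts to dividing the raw reading by a factor built from the coefficient: EC25 = EC(T) / (1 + α·(T − 25)). A minimal Python sketch, assuming the typical 1.9% default:

    def compensate_to_25c(ec_raw_us_cm, temp_c, alpha=0.019):
        """Linear temperature compensation to the 25 C reference.

        alpha is the fractional change per degree C (1.9% by default).
        """
        return ec_raw_us_cm / (1 + alpha * (temp_c - 25.0))

    # A raw reading of 1,100 uS/cm at 30 C reports as about 1,005 uS/cm at 25 C:
    print(compensate_to_25c(1100.0, 30.0))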
That default factor isn’t perfect for every sample, though. Research published in Environmental Science & Technology found that using a fixed 1.9% compensation factor can produce errors ranging from roughly −42% to +25% compared to actual measurements across a range of natural water types. For most field work this is acceptable, but if you need high precision, you can determine a sample-specific compensation factor by measuring the same sample at multiple temperatures and calculating the correction yourself. Some advanced meters let you enter a custom compensation coefficient.
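With just two measurements of the same sample, you can solve the linear model EC(T) = EC25 · (1 + α·(T − 25)) for α directly. A sketch of that calculation (careful work would measure at several temperatures and fit a line, rather than rely on two points):

    def fit_alpha(ec1, t1, ec2, t2):
        """Sample-specific linear coefficient from two readings of one sample.

        Assumes EC(T) = EC25 * (1 + alpha * (T - 25)) and solves for alpha.
        """
        r = ec1 / ec2
        return (1 - r) / (r * (t2 - 25.0) - (t1 - 25.0))

    # Same sample measured at 20 C and 30 C:
    alpha = fit_alpha(900.0, 20.0, 1100.0, 30.0)
    print(alpha)  # 0.02, i.e. 2% per degree C for this sample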
Converting Conductivity to Total Dissolved Solids
Many people measure conductivity because they actually want to know the total dissolved solids (TDS) in their water. Meters often display both values, but the TDS number is always an estimate. The meter multiplies the conductivity reading by a conversion factor, and that factor depends on what’s actually dissolved in the water.
For natural freshwater sources, a conversion factor around 0.76 is commonly used. So a conductivity reading of 1,000 µS/cm would give an estimated TDS of about 760 mg/L. Different types of dissolved minerals conduct electricity at different rates, so a water sample dominated by calcium will need a different factor than one dominated by sodium chloride. If you’re doing serious water quality work, it’s worth checking whether your meter’s default factor matches your water chemistry, or running a separate lab analysis to determine the right factor for your specific source.
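The arithmetic itself is a single multiplication. A short illustrative sketch, using the 0.76 freshwater factor from above as the default:

    def tds_from_ec(ec_us_cm, factor=0.76):
        """Estimate TDS in mg/L from conductivity in uS/cm.

        The factor depends on the water's ion mix; 0.76 is a common
        default for natural freshwater, per the text above.
        """
        return ec_us_cm * factor

    print(tds_from_ec(1000.0))       # 760.0 mg/L with the freshwater default
    print(tds_from_ec(1000.0, 0.5))  # 500.0 mg/L with a NaCl-based factor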
Common Sources of Error
Air bubbles trapped on or near the electrode surfaces are one of the most frequent causes of inaccurate readings. Even a small bubble can partially block the electrical path between electrodes, producing a lower reading than the true value. Gently tapping the probe or swirling it slowly in the sample before taking a reading helps dislodge bubbles.
Electrode fouling is another persistent issue. Oils, biological films, mineral deposits, and other coatings gradually build up on electrode surfaces, creating a resistive layer that reduces the measured conductivity over time. Regular cleaning with an appropriate solution (mild acid for mineral deposits, detergent for organic films) keeps readings accurate. Between samples, always rinse the probe with deionized water to prevent cross-contamination.
A subtler problem is polarization, which occurs when ions accumulate in a thin layer right at the electrode surface during measurement. This effectively creates a small opposing voltage that makes the solution appear less conductive than it really is. Polarization is more pronounced at high conductivity levels and with two-electrode probes. Four-electrode designs and alternating current excitation both help minimize it. If your readings in concentrated solutions seem suspiciously low or drift downward over time, polarization is a likely cause.
Choosing the Right Meter for Your Application
For basic water quality checks in a home, aquarium, or garden, an inexpensive handheld meter with a two-electrode probe and automatic temperature compensation is sufficient. These cost anywhere from $20 to $150 and cover a range of 0 to around 20,000 µS/cm.
For environmental monitoring, lab work, or industrial process control, look for a meter with selectable cell constants, multiple calibration points, and the ability to adjust the temperature compensation coefficient. A cell constant describes the geometry of the probe’s electrodes and is expressed in reciprocal centimeters (cm⁻¹). Low cell constants (around 0.1 cm⁻¹) are suited to low-conductivity samples like purified water, while higher cell constants (1.0 or 10.0 cm⁻¹) handle tap water, wastewater, or brine. Using the wrong cell constant pushes your measurement outside the probe’s accurate range.
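As a rough rule of thumb based on the ranges just described, the choice can be expressed as a simple lookup. The cutoffs below are illustrative, not manufacturer specifications, so check your probe’s documented range:

    def suggest_cell_constant(expected_us_cm):
        """Rough cell-constant suggestion (in 1/cm) for an expected range.

        Cutoffs are illustrative rules of thumb, not manufacturer specs.
        """
        if expected_us_cm < 100:
            return 0.1   # purified / low-conductivity water
        if expected_us_cm < 20_000:
            return 1.0   # tap water, most natural waters
        return 10.0      # wastewater, brine, other high-conductivity fluids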
In industrial settings where the probe stays immersed continuously, toroidal sensors are often the most practical option. Their lack of exposed electrodes means they can run for months in chemically aggressive or particle-laden fluids with minimal drift, whereas a conventional electrode probe in the same environment might need weekly cleaning to stay accurate.

