What Is Leak Testing? Methods, Units & Accuracy

Leak testing is the process of checking whether a sealed part, container, or system allows gas or liquid to escape (or enter) through unintended openings. It’s used across nearly every industry that makes or maintains sealed products, from fuel tanks and medical device packaging to HVAC systems and beverage containers. The methods range from something as simple as watching for bubbles in water to using helium-sensitive instruments that can detect leaks smaller than a millionth of a cubic centimeter per second.

How Leak Testing Works

Every leak test relies on the same basic physics: if a sealed object has a hole or crack, a pressure difference between the inside and outside will push gas or liquid through that opening. The test either measures that flow directly, detects a pressure change caused by the flow, or picks up the presence of a substance that has migrated through the defect.

Some methods are internal, placing sensors inside a container to detect physical changes associated with a leak. Others introduce a tracer material and then monitor the outside of the part for signs of that tracer escaping. In either case, the goal is a clear answer: does this part leak beyond an acceptable limit, or doesn’t it?

What counts as “acceptable” depends entirely on the application. Underground fuel storage tanks, for example, must be tested to detect leaks as small as 0.10 gallons per hour with at least 95% reliability and a false alarm rate of 5% or less, per EPA standards. A semiconductor vacuum chamber demands sensitivity millions of times finer than that. The method you choose depends on how small a leak matters.
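To get a sense of the range involved, it helps to express the EPA threshold in the cubic-centimeter units used for the other methods in this article. The short sketch below does the arithmetic; the gallon-to-cc factor is exact, but the vacuum-chamber figure is an illustrative assumption rather than a value from any particular standard.

```python
# Rough unit conversion: EPA underground-tank threshold vs. a high-vacuum spec.
CC_PER_GALLON = 3785.41          # US gallon in cubic centimeters
SECONDS_PER_HOUR = 3600

epa_gph = 0.10                   # EPA detection threshold, gallons per hour
epa_cc_per_s = epa_gph * CC_PER_GALLON / SECONDS_PER_HOUR

# Illustrative (assumed) vacuum-chamber spec, not from any cited standard:
chamber_spec_cc_per_s = 1e-9

print(f"EPA threshold : {epa_cc_per_s:.3f} cc/s")          # ~0.105 cc/s
print(f"Chamber spec  : {chamber_spec_cc_per_s:.0e} cc/s")
print(f"Ratio         : {epa_cc_per_s / chamber_spec_cc_per_s:.1e}x finer")
```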

Pressure Decay Testing

Pressure decay is the most common industrial leak test. The concept is straightforward: pressurize a sealed part, isolate it from the air source, and watch whether the pressure drops over time. A drop beyond a set threshold means the part is leaking.

The test runs in three stages. During the fill step, air is pushed into the part until it reaches the target pressure. Then the air supply valve closes, trapping the pressure inside. A stabilize step follows, giving temperature and pressure fluctuations time to settle. Finally, during the test step, a pressure sensor records how far the pressure falls over a fixed window of time. If the drop exceeds the allowed amount, the part fails.
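The same three-stage sequence translates naturally into control logic. The sketch below is a minimal illustration, assuming a hypothetical instrument object with fill(), isolate(), and read_pressure() methods; the target pressure, timings, and failure threshold are placeholders you would set from your own specification.

```python
import time

def pressure_decay_test(instrument, target_kpa=300.0,
                        stabilize_s=10.0, test_s=5.0,
                        max_drop_kpa=0.5):
    """Three-stage pressure decay test: fill, stabilize, test.

    `instrument` is a hypothetical interface with fill(), isolate(),
    and read_pressure() methods; all numeric defaults are placeholders.
    """
    # Fill: pressurize the part to the target pressure.
    instrument.fill(target_kpa)

    # Isolate: close the supply valve, trapping the charge in the part.
    instrument.isolate()

    # Stabilize: let adiabatic heating and flow transients settle.
    time.sleep(stabilize_s)

    # Test: compare pressure at the start and end of a fixed window.
    p_start = instrument.read_pressure()
    time.sleep(test_s)
    p_end = instrument.read_pressure()

    drop = p_start - p_end
    return drop <= max_drop_kpa, drop   # (pass/fail, measured drop)
```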

Pressure decay testing is accurate, easy to automate, and sensitive to small leaks. It also tends to be simple and inexpensive, making it a flexible solution that works with virtually any type of part. The main limitation is that the test must be calibrated against a known leak standard, and that calibration is affected by the volume of the part being tested. If you’re running different-sized parts on the same line, you need separate test parameters for each size.
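The volume dependence follows directly from the ideal gas law: the same leak removes the same amount of gas per second, but that loss produces a smaller pressure drop in a larger part. A minimal sketch of the usual conversion, valid when the drop is small compared with absolute pressure:

```python
ATM_KPA = 101.325   # standard atmospheric pressure, kPa

def leak_rate_sccs(part_volume_cc, pressure_drop_kpa, test_time_s):
    """Approximate leak rate in standard cc/s from a pressure decay result.

    Valid when the drop is small compared with absolute pressure; the
    part volume must include any test tooling and connecting lines.
    """
    return part_volume_cc * pressure_drop_kpa / (ATM_KPA * test_time_s)

# The same 0.2 kPa drop over 10 s means very different leaks
# in a 50 cc part and a 500 cc part.
print(leak_rate_sccs(50.0, 0.2, 10.0))    # ~0.0099 scc/s
print(leak_rate_sccs(500.0, 0.2, 10.0))   # ~0.099 scc/s
```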

Mass Flow Testing

Mass flow testing takes a different approach. Instead of sealing off the air supply and watching pressure drop, it keeps the part at constant pressure and measures how much air must be fed in to maintain that pressure. If air is escaping through a leak, the instrument detects the makeup air flowing in to compensate. The result is essentially instantaneous once pressure stabilizes.

The key advantage over pressure decay is that mass flow testing doesn’t care about part volume. Because it measures flow rate directly through a calibrated flow sensor, it doesn’t need a separate leak standard for calibration. This makes it well suited for production lines where parts vary in size or internal volume. On the other hand, it requires steady pressure throughout the test, which can be harder to maintain in some setups.
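Since the sensor output is already a flow, the pass/fail logic reduces to a threshold check once the pressure has settled. A brief sketch, again assuming a hypothetical flow_sensor interface and placeholder limits:

```python
import time

def mass_flow_test(flow_sensor, settle_s=5.0, limit_sccm=2.0, samples=10):
    """Read makeup flow at constant pressure; the reading is the leak rate.

    `flow_sensor.read_sccm()` is a hypothetical interface; the settle
    time, limit, and sample count are placeholders.
    """
    time.sleep(settle_s)                       # wait for pressure to hold steady
    readings = [flow_sensor.read_sccm() for _ in range(samples)]
    leak_rate = sum(readings) / len(readings)  # average out sensor noise
    return leak_rate <= limit_sccm, leak_rate
```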

Bubble Emission Testing

Bubble testing is the most intuitive leak detection method. You pressurize a sealed part, submerge it in liquid (or apply a soap solution to the surface), and watch for bubbles forming at the leak site. It’s a visual, low-cost technique that excels at pinpointing exactly where a leak is located.

The ASTM standard for bubble emission techniques sets a normal sensitivity limit of about 1 × 10⁻⁵ standard cubic centimeters per second. For leaks at or above 1 × 10⁻⁴ standard cubic centimeters per second, trained operators can reproduce go/no-go results within about 10% of each other. The method isn’t designed to measure a precise leak rate, though. It tells you where a leak is and roughly how big it is based on bubble size, but it won’t give you a number you can plot on a chart.

For medical packaging, the ASTM F2096 standard uses a variation of this approach: internal pressurization bubble testing. It can detect gross leaks down to channel defects of 250 micrometers (about the width of two human hairs) with 81% probability. This method is destructive, since it requires puncturing the package to supply internal air pressure, so it’s used for sample-based quality checks rather than 100% inspection.

Tracer Gas Detection

When you need the highest possible sensitivity, tracer gas testing with helium and a mass spectrometer is the standard. Helium is used because its atoms are among the smallest of any element, it’s inert, and it’s present in the atmosphere at only about 5 parts per million, so even trace amounts are easy to distinguish from background noise.

In a typical setup, helium is introduced inside the test part, and a mass spectrometer tuned to helium’s atomic mass monitors the outside for escaping gas. The practical sensitivity limit of a standard helium mass spectrometer leak detector is around 10⁻¹⁰ atmospheric cubic centimeters per second. Specialized techniques can extend that sensitivity by a factor of 100,000 beyond that limit. This level of detection is essential for components like spacecraft fuel systems, semiconductor fabrication chambers, and particle accelerators where even molecular-scale leaks are unacceptable.

Leak Rate Units

Leak rates can be confusing because different regions and industries use different units. In the U.S., leak rates are usually quoted in standard cubic centimeters per minute (sccm) or per second (scc/s). European industry uses Pa·m³/s, the SI unit used in nondestructive testing standards, or mbar·l/s. “Standard” conditions are defined as 101.325 kPa (normal atmospheric pressure) and 20°C.
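Converting between these units is a matter of fixed factors against a common base. The sketch below treats one standard cubic centimeter as one cubic centimeter of gas at 101.325 kPa and ignores the small differences between temperature reference conventions, which is usually adequate for comparing figures from different reports:

```python
# Conversion factors to a common base of Pa·m³/s.
TO_PA_M3_S = {
    "scc/s":    101325e-6,          # 101325 Pa × 1e-6 m³ per cc
    "sccm":     101325e-6 / 60.0,
    "mbar·l/s": 0.1,                # 100 Pa × 0.001 m³
    "Pa·m³/s":  1.0,
}

def convert_leak_rate(value, from_unit, to_unit):
    """Convert a leak rate between common units via Pa·m³/s."""
    return value * TO_PA_M3_S[from_unit] / TO_PA_M3_S[to_unit]

print(convert_leak_rate(1.0, "mbar·l/s", "scc/s"))   # ~0.987
print(convert_leak_rate(1.0, "Pa·m³/s", "sccm"))     # ~592
```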

The distinction between “standard” and “actual” cubic centimeters matters more than most people realize. At higher test pressures, the gas inside the part is compressed, so a given volume of escaping gas represents more molecules than the same volume at atmospheric pressure. At a test pressure of 150 psig, for instance, the standard leak rate is 11.2 times larger than the actual (measured at test pressure) leak rate. At 20 psig, the multiplier drops to 2.36. If you’re comparing leak rates from different tests, make sure both numbers are expressed in the same unit at the same reference conditions, or the comparison is meaningless.
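Those multipliers fall straight out of the pressure ratio: the standard leak rate exceeds the actual leak rate by the ratio of absolute test pressure to atmospheric pressure. A quick check of the quoted figures:

```python
ATM_PSI = 14.696   # atmospheric pressure in psi

def standard_over_actual(test_psig):
    """Ratio of standard leak rate to actual leak rate at a given gauge pressure."""
    return (test_psig + ATM_PSI) / ATM_PSI

print(standard_over_actual(150))  # ~11.2
print(standard_over_actual(20))   # ~2.36
```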

What Affects Test Accuracy

Temperature is the single biggest source of error in leak testing. For pressure-based methods, pressurizing a part causes adiabatic heating: the air warms up simply from being compressed. As it cools back down, the pressure drops even in a perfectly sealed part, mimicking a leak. This is why every pressure decay test includes a stabilization step, but if the wait isn’t long enough, the residual cooling shows up as a false failure.
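The size of the effect follows from the ideal gas law: at constant volume, pressure scales with absolute temperature, so a fraction of a degree of residual cooling can look like a failing pressure drop. A worked sketch with assumed numbers:

```python
def pressure_change_kpa(pressure_kpa, temp_c, delta_t_c):
    """Pressure change from a temperature shift at constant volume (ideal gas)."""
    return pressure_kpa * delta_t_c / (temp_c + 273.15)

# Assumed example: a part charged to 300 kPa absolute at 20 °C that is
# still cooling by 0.5 °C during the test window.
print(pressure_change_kpa(300.0, 20.0, -0.5))   # ~-0.51 kPa
# That alone is comparable to the 0.5 kPa placeholder failure threshold
# used in the earlier pressure decay sketch.
```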

For tracer gas methods, temperature variations affect the calibrated leak standard by approximately 3 to 4% per degree Celsius. A shift of just a few degrees can introduce 5 to 10% error in your results. Most modern helium leak detectors compensate for the temperature of their internal reference standard, but they typically don’t compensate for the temperature of the part being tested or any external calibration leak. Testing in an environment with fluctuating temperatures (near loading docks or HVAC vents, for example) can quietly degrade accuracy.

Other interference factors include moisture inside the test part (water vapor condenses and evaporates unpredictably, causing pressure noise), mechanical deformation of flexible parts under pressure (which changes internal volume and looks like a pressure drop), and residual contamination that outgasses volatile compounds during the test. Controlling these variables is often the difference between a leak test that catches real defects and one that generates a stream of false rejects.

Choosing the Right Method

  • Bubble testing is best for locating leaks visually when you need to find exactly where a defect is. It’s low cost and requires minimal equipment, but it’s slow, operator-dependent, and not suited for automated production lines.
  • Pressure decay is the workhorse for high-volume production testing of consistent parts. It’s fast, automatable, and cost-effective. Choose it when your parts are uniform in size and your required sensitivity is moderate.
  • Mass flow is the better choice when parts vary in volume or size, since it doesn’t require recalibration for different geometries. It provides a direct flow measurement rather than an inferred one.
  • Helium mass spectrometry is reserved for applications demanding the highest sensitivity, often in aerospace, semiconductor, and vacuum system manufacturing. It’s the most expensive option but detects leaks that no other method can.

The right test depends on three questions: how small a leak do you need to find, how fast does the test need to run, and what is the part’s size and geometry? Starting from those answers and working backward to a method is more reliable than picking a technique first and hoping it fits the application.
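To summarize that working-backward logic, the sketch below folds the selection questions and the bullet-point distinctions above into a simple lookup. The sensitivity threshold and the decision order are illustrative assumptions, not values from any standard:

```python
def suggest_method(min_leak_sccs, need_location, parts_vary_in_volume):
    """Map the selection questions to a method.

    The 1e-6 scc/s cutoff and the priority order are illustrative
    assumptions, not requirements from any standard.
    """
    if min_leak_sccs < 1e-6:
        return "helium mass spectrometry"
    if need_location:
        return "bubble emission"
    if parts_vary_in_volume:
        return "mass flow"
    return "pressure decay"

print(suggest_method(1e-9, False, False))   # helium mass spectrometry
print(suggest_method(1e-3, False, True))    # mass flow
```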