Why Is Chlorine Added to the Water Purification Process?

Chlorine is added to drinking water because it kills the bacteria, viruses, and parasites that cause waterborne diseases like typhoid, cholera, and dysentery. It remains the most widely used disinfectant in public water systems for a simple reason: it keeps working long after it’s added, protecting water as it travels through miles of pipes to reach your tap.

How Chlorine Kills Pathogens

Chlorine is a powerful oxidizer. When dissolved in water, it attacks microorganisms by disrupting their internal chemistry. Rather than simply punching holes in a cell’s outer wall, chlorine interferes with the enzymes and chemical processes that keep a microorganism alive. It can break apart DNA strands, shut down energy production inside cells, and disable the proteins organisms need to function. This makes it effective against a broad range of threats, from common bacteria like E. coli to viruses like poliovirus.

Some organisms are harder to kill than others. Bacteria are the most vulnerable, requiring relatively low chlorine concentrations and short contact times. Viruses take somewhat more effort. Parasitic cysts, like those from Giardia, are roughly 10 to 1,000 times more resistant than bacteria and viruses, which is why water treatment plants often combine chlorine with filtration or other methods to handle the full spectrum of contaminants.
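Engineers often express this resistance ranking with the "CT" concept: the residual chlorine concentration (mg/L) multiplied by the contact time (minutes) needed to inactivate an organism. The sketch below illustrates the idea with placeholder CT targets chosen only to show the relative spread described above; they are not official EPA table values, which vary with temperature, pH, and target log-inactivation.

```python
# Illustrative sketch of the CT (concentration x time) concept.
# The CT targets below are placeholders showing relative resistance,
# not official EPA table entries.

def required_contact_minutes(ct_target, residual_mg_per_l):
    """Minutes of contact needed to reach a CT target at a given residual."""
    return ct_target / residual_mg_per_l

# Hypothetical CT targets (mg*min/L):
ct_targets = {
    "bacteria (E. coli)": 0.05,  # placeholder: very low CT needed
    "viruses": 6.0,              # placeholder: moderate CT
    "Giardia cysts": 100.0,      # placeholder: far more resistant
}

residual = 1.0  # mg/L free chlorine
for organism, ct in ct_targets.items():
    minutes = required_contact_minutes(ct, residual)
    print(f"{organism}: {minutes:.1f} min at {residual} mg/L")
```

The division makes the tradeoff visible: at a fixed residual, a more resistant organism simply demands proportionally more contact time, which is why plants size their contact basins around the hardest target they must inactivate with chlorine.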

The Residual Effect: Protection Beyond the Plant

A lasting residual is what sets chlorine apart from other disinfection methods like ultraviolet light or ozone. Those technologies work well at the treatment plant, but their disinfecting power doesn’t travel. Chlorine, by contrast, leaves a small, measurable residual in the water that persists through the distribution system.

That residual serves three purposes. First, it continues to neutralize any microorganisms that enter the water through small cracks, aging joints, or pressure changes in the pipes between the treatment plant and your faucet. Second, it limits the growth of biofilm, the slimy colonies of bacteria (including Legionella, the organism behind Legionnaires’ disease) that can develop on the interior surfaces of water mains. Third, it acts as a built-in alarm system for water utilities: if the chlorine residual drops unexpectedly at a monitoring point, it signals that something in the system may have gone wrong.

The EPA requires public water systems that draw from surface water to maintain a disinfectant residual throughout their distribution networks. Most tap water contains between 0.2 and 1 milligram per liter of chlorine, well below the EPA’s maximum allowable level of 4 milligrams per liter.

When Chlorine Is Added During Treatment

Chlorine can be introduced at different stages of the purification process, and many plants add it more than once. Pre-chlorination happens early, before coagulation and filtration. At this stage, chlorine helps oxidize dissolved organic compounds and metals like iron and manganese, making them easier to remove in later steps. It also controls algae growth in settling basins and intake structures.

Post-chlorination happens near the end of the process, after sediment and particles have already been removed. This is the primary disinfection step, where chlorine concentration and contact time are carefully controlled to ensure pathogens are inactivated before the water enters the distribution system. The final dose is calibrated to leave just enough residual chlorine to protect the water on its journey through the pipes.
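The calibration described above follows simple arithmetic: the applied dose must cover the water's chlorine demand (the amount consumed reacting with organics and metals) plus the residual the utility wants to leave behind. A minimal sketch, using hypothetical numbers:

```python
# Chlorine dose arithmetic: dose = demand + target residual.
# All numbers here are hypothetical, for illustration only.

def required_dose(demand_mg_per_l, target_residual_mg_per_l):
    """Chlorine dose (mg/L) needed to satisfy demand and leave a residual."""
    return demand_mg_per_l + target_residual_mg_per_l

demand = 1.3           # mg/L consumed by organics, iron, manganese, etc.
target_residual = 0.5  # mg/L left to protect water in the pipes

dose = required_dose(demand, target_residual)
print(f"Apply {dose:.1f} mg/L to leave a {target_residual} mg/L residual")
```

In practice the demand term is measured, not assumed: operators run bench-scale tests on the day's raw water, since demand shifts with season, temperature, and organic load.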

Forms of Chlorine Used in Treatment

Water utilities use chlorine in several forms. Chlorine gas was the original standard and remains effective, but it poses serious safety risks during storage and handling. A gas leak at a treatment facility could endanger workers and nearby residents, which is why the U.S. Department of Homeland Security has flagged it as a security concern.

Many utilities have switched to sodium hypochlorite, essentially a concentrated form of liquid bleach. It produces the same disinfecting chemistry in water but is far safer to store and handle. Some facilities generate sodium hypochlorite on-site from salt and electricity, eliminating the need to transport hazardous chemicals altogether. The finished water quality is comparable regardless of which form a plant uses.

A third option, chloramine (chlorine combined with ammonia), is sometimes used because it produces fewer byproducts and lasts longer in the distribution system. However, it’s a weaker disinfectant, so it’s typically used as a secondary treatment rather than the primary one.

The Historical Impact on Public Health

Before chlorination, waterborne diseases were a leading cause of death in American cities. The first continuous use of chlorine in a U.S. municipal water supply began in Jersey City, New Jersey, in late 1908. The results were dramatic and almost immediate.

One of the clearest examples comes from Wheeling, West Virginia. Between 1917 and 1918, the city’s typhoid fever rate ran between 155 and 200 cases per 100,000 people. Chlorination was introduced in late 1918, and by the first three months of 1919, only seven cases were recorded. Then, for three weeks in April 1919, chlorination was temporarily stopped. Cases tripled within that short window. Once chlorination resumed, the numbers dropped again, with only 11 cases in the remaining six months of the year. That kind of natural experiment left little doubt about chlorine’s effectiveness.

Disinfection Byproducts: The Tradeoff

Chlorine isn’t without drawbacks. When it reacts with naturally occurring organic matter in the water (decaying leaves, soil particles, algae), it creates chemical byproducts. Nearly 600 of these compounds have been identified so far, though most are present in extremely small amounts.

The two most closely monitored groups are trihalomethanes (THMs) and haloacetic acids (HAAs). THMs were first detected in finished drinking water in the 1970s and have been regulated since 1979. HAAs were added to the regulatory list in 1998. The EPA caps THMs at 80 micrograms per liter and HAAs at 60 micrograms per liter. Long-term exposure to elevated levels of these compounds has been associated with increased cancer risk in some studies, which is why water utilities carefully balance their chlorine doses: enough to kill pathogens, but not so much that byproduct levels climb above safe thresholds.
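Compliance with those caps is judged against averages over time rather than any single sample. The sketch below checks hypothetical quarterly measurements against the 80 and 60 microgram-per-liter caps using a plain annual average; the EPA's actual Stage 2 rule uses locational running annual averages, so treat this as a simplified illustration.

```python
# Simplified byproduct compliance check against the EPA caps mentioned
# above (80 ug/L for THMs, 60 ug/L for HAAs). The real rule uses
# locational running annual averages; this sketch just averages four
# hypothetical quarterly samples at one monitoring site.

EPA_CAPS_UG_PER_L = {"THMs": 80.0, "HAAs": 60.0}

def annual_average(quarterly_samples):
    """Average of a year's quarterly samples (ug/L)."""
    return sum(quarterly_samples) / len(quarterly_samples)

# Hypothetical quarterly results (ug/L):
site_results = {
    "THMs": [62.0, 71.0, 85.0, 58.0],  # one high summer quarter
    "HAAs": [40.0, 44.0, 52.0, 38.0],
}

for group, samples in site_results.items():
    avg = annual_average(samples)
    status = "OK" if avg <= EPA_CAPS_UG_PER_L[group] else "EXCEEDS"
    print(f"{group}: annual average {avg:.1f} ug/L -> {status}")
```

Note that averaging lets a single hot-weather spike (byproduct formation rises with temperature) pass without a violation, as long as the rest of the year's samples pull the average back under the cap.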

Treatment plants reduce byproduct formation by removing as much organic matter as possible before adding chlorine. This is one reason pre-filtration and coagulation steps matter so much. The cleaner the water is before chlorine is introduced, the fewer byproducts form.

Why Chlorine Remains the Standard

Alternatives to chlorine exist. Ozone is a stronger disinfectant. Ultraviolet light effectively inactivates parasites that resist chlorine. But neither provides lasting protection in distribution pipes, and both cost significantly more to install and operate. Most modern treatment plants use a layered approach, combining UV or ozone at the plant with chlorine for residual protection in the pipes. Chlorine’s combination of low cost, proven effectiveness, ease of use, and lasting residual makes it difficult to replace entirely. More than a century after its introduction, it remains the backbone of safe drinking water delivery in the United States and around the world.