Line loss is the electrical energy that disappears as heat between where power is generated and where it’s actually used. Every wire, transformer, and connection in the electrical grid converts a small portion of the electricity flowing through it into waste heat. In the United States, transmission and distribution losses average about 5% of all electricity generated, according to the U.S. Energy Information Administration. That may sound small, but applied to an entire national grid, it represents an enormous amount of wasted energy.
Why Wires Lose Energy
The primary cause of line loss is a straightforward physics principle: when electric current flows through any conductor, the conductor’s resistance converts some of that electrical energy into heat. This is the same process that makes a toaster glow or a space heater warm a room, just happening unintentionally across miles of power lines. The amount of power lost depends on two things: how much current is flowing and how much resistance the wire has; in formula terms, power lost = current² × resistance (P = I²R). Crucially, the loss scales with the square of the current. Double the current, and you quadruple the heat loss.
Every conductor has some resistance. Copper and aluminum, the metals most commonly used in power lines, have relatively low resistance but are not perfect conductors. That resistance increases with the length of the wire and decreases with larger wire diameters. It also rises with temperature, which means power lines lose more energy on hot summer days, exactly when electricity demand tends to peak.
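The two relationships above can be sketched in a few lines. The resistivity value and the line dimensions below are illustrative textbook-style assumptions, not figures from this article:

```python
# Sketch of the resistive-loss relationships described above.
RHO_ALUMINUM = 2.82e-8  # approximate resistivity of aluminum at 20 °C, ohm-meters

def conductor_resistance(length_m: float, area_m2: float,
                         rho: float = RHO_ALUMINUM) -> float:
    """R = rho * L / A: resistance grows with length, shrinks with cross-section."""
    return rho * length_m / area_m2

def resistive_loss_watts(current_a: float, resistance_ohm: float) -> float:
    """P_loss = I^2 * R: doubling the current quadruples the heat."""
    return current_a ** 2 * resistance_ohm

# A hypothetical 10 km aluminum conductor with a 500 mm^2 cross-section:
r = conductor_resistance(10_000, 500e-6)   # ~0.56 ohms
print(resistive_loss_watts(500, r))        # loss at 500 A
print(resistive_loss_watts(1000, r))       # double the current -> 4x the loss
```

Running this shows the squared relationship directly: the 1,000 A case loses exactly four times as much power as the 500 A case over the same wire.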
The Two Main Types of Loss
Most line loss falls into two categories: resistive losses and corona losses.
Resistive losses (sometimes called “copper losses”) are by far the larger contributor. They happen everywhere current flows, from high-voltage transmission lines spanning hundreds of miles to the wiring inside your house. Any time electricity encounters resistance, energy is shed as heat.
Corona losses are specific to high-voltage transmission lines. When the electric field around a conductor gets strong enough to break down the surrounding air, it creates a visible glow and an audible hum. This ionization process pulls energy from the line. In fair weather, corona loss is negligible. Stanford researchers reviewing measurements of a 765-kilovolt transmission line found losses of just 1.87 kilowatts per kilometer in good conditions, amounting to roughly 0.08% over 1,000 kilometers. In bad weather, the picture changes dramatically. Rain, frost, and humidity amplify the effect. The same line lost 84.3 kilowatts per kilometer in poor weather, pushing losses to about 3.7%. Bundling multiple conductors together and using larger-radius wires significantly reduces corona discharge.
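The percentages quoted above can be sanity-checked with simple arithmetic. The article does not state the line’s power flow, so the ~2.3 GW figure below is an assumption chosen to make the per-kilometer numbers and the percentages agree:

```python
def corona_loss_fraction(kw_per_km: float, line_km: float,
                         transmitted_w: float) -> float:
    """Fraction of transmitted power lost to corona discharge."""
    return (kw_per_km * 1e3 * line_km) / transmitted_w

LINE_KM = 1000
TRANSMITTED_W = 2.3e9  # assumed power flow on the 765 kV line; not in the article

print(f"fair weather: {corona_loss_fraction(1.87, LINE_KM, TRANSMITTED_W):.2%}")
print(f"bad weather:  {corona_loss_fraction(84.3, LINE_KM, TRANSMITTED_W):.2%}")
```

With that assumed power flow, the fair-weather figure comes out near 0.08% and the bad-weather figure near 3.7%, matching the percentages above.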
How High Voltage Keeps Losses Down
This is the single most important engineering trick behind the modern power grid. Since power loss is proportional to the square of the current, reducing the current is the most effective way to cut losses. You can transmit the same amount of power at a much lower current by raising the voltage. That’s why power plants send electricity out at extremely high voltages, often 345,000 volts or more, then step it down through transformers as it gets closer to homes and businesses.
Without high-voltage transmission, long-distance electricity delivery would be impractical. The losses over hundreds of miles at lower voltages would waste far more energy than the 5% average we see today. Transformers at substations progressively reduce the voltage from transmission levels down to the 120 or 240 volts your outlets provide.
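The tradeoff works out like this: for a fixed power transfer P = V × I, raising the voltage lowers the current, and resistive loss falls with the square of the current. The power level and line resistance below are illustrative assumptions:

```python
def line_loss(power_w: float, voltage_v: float, resistance_ohm: float) -> float:
    """Resistive loss when transferring a given power at a given voltage."""
    current = power_w / voltage_v   # same power, higher voltage -> less current
    return current ** 2 * resistance_ohm

P = 100e6   # 100 MW delivered (hypothetical)
R = 10.0    # assumed total line resistance, ohms

print(line_loss(P, 345_000, R))  # transmission voltage: modest loss
print(line_loss(P, 34_500, R))   # 1/10 the voltage: 10x the current, 100x the loss
```

At one-tenth the voltage the loss is a hundred times larger, which is why long-distance delivery at low voltage is impractical.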
What Causes Losses to Vary
Several factors determine how much energy a particular stretch of the grid loses:
- Wire material and size. Aluminum is lighter and cheaper than copper but has higher resistance. Larger-diameter conductors carry more current with less resistance, which is why major transmission lines use thick, bundled cables.
- Distance. Longer lines mean more resistance. A power plant 500 miles away will lose more energy getting electricity to you than one 50 miles away.
- Temperature. Hot weather increases the resistance of metal conductors. Lines running at capacity on a 100-degree day lose more energy than the same lines on a cool evening.
- Load. The more current a line carries, the greater the losses. Peak demand periods create disproportionately higher losses because of that squared relationship between current and heat.
- Equipment age. Older transformers are less efficient than modern designs. A heavily loaded, aging transformer can be a significant source of wasted energy.
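The temperature factor from the list above can be estimated with the standard linear approximation R(T) = R₀ × (1 + α × (T − T₀)). The coefficient for aluminum (~0.0039 per °C) is a textbook value, not from this article:

```python
ALPHA_ALUMINUM = 0.0039  # temperature coefficient of resistance, per degree Celsius

def resistance_at(temp_c: float, r0_ohm: float, t0_c: float = 20.0) -> float:
    """Linear approximation of conductor resistance at a given temperature."""
    return r0_ohm * (1 + ALPHA_ALUMINUM * (temp_c - t0_c))

r_mild = resistance_at(20, 1.0)  # 1.00 ohm on a mild day
r_hot = resistance_at(38, 1.0)   # 38 °C is roughly a 100 °F day
print(r_hot / r_mild)            # ~7% more resistance at the same current
```

That extra ~7% resistance compounds with the squared-current effect of peak demand, which is why hot afternoons are the worst case for losses.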
Line Loss in Your Home
Line loss isn’t just a grid-scale issue. Inside residential wiring, the same physics apply on a smaller scale. Electricians call it “voltage drop,” and the National Electrical Code recommends keeping it under 3% for individual circuits and under 5% for the combined path from your electrical panel to any outlet or appliance.
When wiring is undersized or runs are excessively long, voltage drop climbs beyond those limits. The practical effects are noticeable: lights dim, motors overheat and struggle to start, and sensitive electronics can malfunction. Over time, excessive voltage drop also means higher energy bills, because motors and electronic power supplies draw more current to deliver the same power at the reduced voltage, and some of that extra current is simply wasted as heat in the wiring itself. In severe cases, overheating wires become a safety hazard.
If you’re running a long circuit to a detached garage or workshop, for example, using a larger wire gauge than the minimum required helps keep voltage drop within safe, efficient limits.
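The sizing check electricians do for a run like that can be sketched as follows. The resistance figures are typical values for solid copper conductors in ohms per 1,000 feet (confirm against an actual wire table before sizing a real circuit), and the garage scenario is hypothetical:

```python
# Approximate DC resistance of solid copper, ohms per 1,000 ft, by gauge
R_PER_1000FT = {"14 AWG": 3.07, "12 AWG": 1.93, "10 AWG": 1.21, "8 AWG": 0.778}

def voltage_drop_pct(gauge: str, one_way_ft: float, current_a: float,
                     supply_v: float = 120.0) -> float:
    """Percent voltage drop over a circuit, counting the round-trip wire length."""
    r = R_PER_1000FT[gauge] * (2 * one_way_ft) / 1000  # out and back
    return 100 * current_a * r / supply_v

# A hypothetical 10 A load at the end of a 100 ft run to a detached garage:
for gauge in ("14 AWG", "12 AWG", "10 AWG"):
    print(gauge, round(voltage_drop_pct(gauge, 100, 10), 1), "%")
```

In this scenario, 14 AWG lands around 5%, well past the 3% guideline, while upsizing to 10 AWG brings the drop to roughly 2%.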
How Utilities Reduce Grid Losses
Grid operators use a combination of strategies to minimize the energy that never reaches customers. The most common approaches include replacing aging conductors with larger or lower-resistance materials, upgrading old transformers to higher-efficiency models, and restructuring grid topology so power takes shorter or less congested paths.
Reactive power compensation is another key tool. Devices like shunt capacitors, installed at substations and along distribution lines, improve what engineers call the “power factor.” In simple terms, they reduce the portion of current that sloshes back and forth without doing useful work, which lowers the total current on the line and cuts resistive losses. A typical goal is raising the power factor to 0.9 or above at each distribution transformer.
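The loss benefit of power-factor correction follows from the same squared-current relationship: for the same real power, line current scales as 1/pf, so resistive loss scales as 1/pf². The load and voltage below are illustrative assumptions, not utility data:

```python
def line_current(real_power_w: float, voltage_v: float, pf: float) -> float:
    """Line current needed to deliver a given real power at a given power factor."""
    return real_power_w / (voltage_v * pf)

P, V = 100e3, 7_200  # hypothetical 100 kW load on a 7.2 kV distribution line
i_before = line_current(P, V, 0.75)  # poor power factor
i_after = line_current(P, V, 0.95)   # after capacitor compensation

print(i_after / i_before)        # current falls to ~79% of its former value
print((i_after / i_before) ** 2) # resistive loss falls to ~62%
```

Raising the power factor from 0.75 to 0.95 cuts resistive losses by nearly 40% on that segment without moving any less useful power.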
More recently, utilities have started using automated voltage optimization systems that continuously adjust voltage levels across the grid in real time, responding to changing loads rather than relying on fixed settings. These systems use sensors and software to find the sweet spot where voltage is high enough for equipment to function properly but not so high that excess energy is wasted.
Why Line Loss Shows Up on Your Bill
Utilities must generate more electricity than customers actually consume to account for line losses. That extra generation costs money, and those costs are built into electricity rates. In deregulated energy markets, line loss factors are sometimes listed as a separate charge or adjustment on commercial electricity contracts. Even where it’s not broken out explicitly, every ratepayer is covering the cost of energy that heated up wires instead of powering anything useful.
Efforts to reduce line loss have a direct financial and environmental payoff. Every percentage point of loss eliminated means less fuel burned, fewer emissions produced, and lower costs passed along to customers. For a grid the size of the United States, even fractional improvements in efficiency translate to billions of kilowatt-hours saved annually.
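The scale of that claim is easy to check with rough arithmetic. Annual U.S. net generation of about 4 trillion kWh is an approximate figure assumed here, not a number from this article:

```python
def kwh_saved(pct_point: float, generation_kwh: float = 4e12) -> float:
    """kWh saved per year by cutting losses by a given number of percentage points."""
    return generation_kwh * pct_point / 100

for pct in (0.1, 0.5, 1.0):
    print(f"{pct}% less loss ≈ {kwh_saved(pct) / 1e9:.0f} billion kWh/year")
```

Even a tenth of a percentage point works out to several billion kilowatt-hours a year at national scale.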