In the United States, about 5% of all electricity generated is lost before it reaches homes and businesses. That figure, averaged over 2018 through 2022 by the U.S. Energy Information Administration, represents the combined losses from both high-voltage transmission lines and lower-voltage local distribution networks. In other countries, the number can be significantly higher.
Where the Losses Actually Happen
Electricity travels in two stages. First, high-voltage transmission lines carry power over long distances from power plants to regional substations. Then, lower-voltage distribution lines deliver it through neighborhoods to individual buildings. Losses occur at both stages, but the physical causes differ.
The single biggest source of loss is resistive heating. When electric current flows through a wire, some energy converts to heat due to the wire’s resistance. This effect scales with the square of the current: doubling the current quadruples the heat lost. That relationship is why power companies use extremely high voltages for long-distance transmission: higher voltage allows the same amount of power to be carried with less current, which dramatically cuts heat losses.
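The square-law relationship can be sketched in a few lines of Python. The resistance value here is an arbitrary illustrative assumption, not a parameter of any real line.

```python
def heat_loss_watts(current_a, resistance_ohm):
    # P_loss = I^2 * R: power dissipated as heat in the conductor
    return current_a ** 2 * resistance_ohm

R = 0.2  # ohms -- an assumed, illustrative line resistance
print(heat_loss_watts(100, R))  # 2000.0 W
print(heat_loss_watts(200, R))  # 8000.0 W: double the current, four times the heat
```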
Transformers, the devices that step voltage up and down at every stage of the journey, are another major source. Transformer core losses account for roughly 25 to 30% of total distribution losses. These losses happen even when no electricity is being drawn by customers, simply because the transformer is energized and its steel core absorbs energy through magnetic effects.
Across a typical mixed network, about 25 to 33% of technical losses are “fixed,” meaning they happen regardless of how much electricity is flowing. The remaining 67 to 75% are “variable” losses that rise and fall with demand. During peak hours on a hot summer afternoon, variable losses climb substantially because more current is flowing through every component in the system.
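One common way to model this split is a fixed no-load term plus a variable term that grows with the square of the load. The numbers below are invented for illustration, not drawn from any real network.

```python
def feeder_loss_mw(load_mw, fixed_mw=1.5, k=0.0004):
    # fixed_mw: no-load (core/magnetizing) losses, present around the clock
    # k * load^2: variable losses that rise with the square of demand
    return fixed_mw + k * load_mw ** 2

for load in (50, 100, 200):  # off-peak, average, and summer-peak load in MW
    print(f"{load:>3} MW load -> {feeder_loss_mw(load):.1f} MW lost")
```

Doubling the load from 100 MW to 200 MW quadruples the variable term, which is why peak-hour losses climb so sharply.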
How the U.S. Compares Globally
The 5% average in the United States is relatively low by global standards. Countries with newer or well-maintained infrastructure, like Germany, South Korea, and Japan, achieve similar or slightly better numbers. But in many developing nations, total losses can reach 15 to 25% or higher. India, for example, has historically reported losses well above 15%, though the figure has been declining with grid modernization.
The gap between countries comes down to three factors: the age and quality of equipment, the distances electricity must travel, and how much electricity is stolen or goes unmetered.
Technical vs. Non-Technical Losses
Not all lost electricity disappears as heat. A meaningful share, particularly in developing economies, is “non-technical loss.” This category includes electricity theft through illegal connections, broken or tampered meters, billing errors, and unregistered customers. Globally, non-technical losses cost electricity distributors an estimated $89.3 billion per year, with the top 50 emerging economies accounting for $58.7 billion of that total.
In the U.S. and Western Europe, non-technical losses are a small fraction of the total because metering infrastructure is relatively modern and enforcement is consistent. In regions where illegal connections are widespread, non-technical losses can actually exceed the physical losses in the wires themselves.
Why Higher Voltage Means Lower Loss
The physics here is straightforward. A power line carrying 100 megawatts at 500,000 volts needs far less current than the same line carrying 100 megawatts at 10,000 volts. Since heat loss depends on the square of the current, the high-voltage line wastes dramatically less energy. This is why cross-country transmission lines operate at hundreds of thousands of volts, and why the voltage is only stepped down closer to where you actually use it.
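Plugging in the numbers from the paragraph above, and treating the line as a simple resistive conductor:

```python
POWER_W = 100e6  # 100 MW delivered

def current_amps(power_w, voltage_v):
    # I = P / V for a simple resistive picture of the line
    return power_w / voltage_v

i_low = current_amps(POWER_W, 10_000)    # 10,000 A at 10 kV
i_high = current_amps(POWER_W, 500_000)  #    200 A at 500 kV

# Loss goes as I^2, so the loss ratio is the square of the current ratio.
print(f"current ratio: {i_low / i_high:.0f}x")         # 50x
print(f"loss ratio:    {(i_low / i_high) ** 2:.0f}x")  # 2500x
```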
Long-distance lines using high-voltage direct current (HVDC) instead of alternating current (AC) can reduce losses even further, particularly for routes over 400 miles. Several major HVDC transmission lines already operate in the U.S. and China for exactly this reason.
What 5% Loss Means in Practice
Five percent might sound small, but at scale it represents an enormous amount of energy. The U.S. generates roughly 4 trillion kilowatt-hours of electricity per year. A 5% loss means about 200 billion kilowatt-hours are dissipated as heat in wires and transformers annually. That’s roughly equivalent to the entire electricity consumption of a mid-sized country.
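The arithmetic behind that estimate is a single multiplication:

```python
generation_kwh = 4e12  # ~4 trillion kWh generated per year in the U.S.
loss_fraction = 0.05   # ~5% average transmission and distribution loss

lost_kwh = generation_kwh * loss_fraction
print(f"{lost_kwh / 1e9:.0f} billion kWh lost per year")  # 200 billion kWh
```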
For individual consumers, this loss is already baked into your electricity rate. You don’t see a separate line item for transmission losses, but utilities factor it into the cost per kilowatt-hour they charge. When grid losses decrease through infrastructure upgrades, it puts mild downward pressure on rates. When losses increase, due to aging equipment or longer transmission distances, costs drift upward.
Factors That Push Losses Higher
Several conditions make losses worse. Older conductors with higher resistance waste more energy. Overloaded lines during peak demand periods see sharply higher variable losses. Longer distances between generation and consumption mean more wire for current to travel through. And poor power quality, such as imbalanced loads across the three phases of an AC system, creates additional inefficiency.
Climate plays a role too. Wires have higher resistance when they’re hotter, so transmission losses tend to increase on hot days, exactly when air conditioning drives peak demand. This unfortunate coincidence means the grid is least efficient precisely when it’s under the most stress.
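The temperature effect follows the standard linear approximation for conductor resistance. The temperature coefficient below is a textbook value for copper and aluminum, used here as an assumption.

```python
def resistance_at(temp_c, r_20c, alpha=0.0039):
    # Linear approximation: R(T) = R_20 * (1 + alpha * (T - 20))
    # alpha (per degree C) is a typical textbook value for Cu/Al conductors
    return r_20c * (1 + alpha * (temp_c - 20))

r_hot = resistance_at(40, 1.0)  # same conductor on a 40 °C afternoon
print(f"{(r_hot - 1.0) * 100:.1f}% higher resistance")  # 7.8% higher
```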
Utilities manage these losses through a combination of infrastructure investment (replacing old transformers, upgrading conductors), grid optimization software that routes power more efficiently, and strategic placement of generation closer to demand centers. Rooftop solar and local battery storage also reduce distribution losses by generating electricity right where it’s consumed, bypassing much of the grid entirely.