What Is Electricity Consumption and How Is It Measured?

Electricity consumption is the total amount of electrical energy used over a period of time, measured in kilowatt-hours (kWh). One kWh equals the energy drawn by a 1,000-watt device running for one hour. It’s the number on your electric bill, the figure utilities track for every home and business, and the metric governments use to gauge how much power a country needs.

How Electricity Consumption Is Measured

Power, the rate at which electricity flows, is measured in watts. A single watt is the power delivered by one ampere of current under one volt of electrical pressure (watts = volts × amperes). Small devices like phone chargers draw just a few watts, while larger appliances like ovens or dryers pull thousands of watts (kilowatts). But watts only tell you how much power something demands at a given moment. To capture actual consumption, you need to factor in time.
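To make that relationship concrete, here’s a minimal sketch in Python (the voltage and current figures are illustrative assumptions, not measurements of any particular device):

```python
def power_watts(volts: float, amps: float) -> float:
    """Power in watts = electrical pressure (volts) x current (amperes)."""
    return volts * amps

# A standard US outlet supplies about 120 volts; current draw varies by device.
print(power_watts(120, 0.05))   # ~6 W: a phone charger drawing 0.05 amps
print(power_watts(240, 12.5))   # 3,000 W: an electric dryer on a 240-volt circuit
```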

That’s where the kilowatt-hour comes in. If you run a 100-watt light bulb for 10 hours, you’ve consumed 1 kWh (100 watts × 10 hours = 1,000 watt-hours = 1 kWh). Your electric meter records this continuously, tallying every kWh your household pulls from the grid. At the end of the billing cycle, that total is multiplied by your rate per kWh to produce your bill.

The formula works the same way for any device: multiply its wattage by the number of hours it runs, then divide by 1,000 to convert to kWh. A 2,000-watt space heater running for 3 hours uses 6 kWh. A 10-watt LED bulb left on overnight for 8 hours uses 0.08 kWh. Knowing this math lets you pinpoint which appliances are driving your costs.
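The formula is simple enough to sketch in a few lines of Python, reproducing the examples above:

```python
def kwh(watts: float, hours: float) -> float:
    """Energy in kWh = power (watts) x time (hours), divided by 1,000."""
    return watts * hours / 1000

print(kwh(100, 10))   # 1.0  -- the 100-watt bulb over 10 hours
print(kwh(2000, 3))   # 6.0  -- the space heater over 3 hours
print(kwh(10, 8))     # 0.08 -- the LED bulb left on overnight
```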

What the Average Household Uses

The average U.S. household consumes about 10,500 kWh of electricity per year, which works out to roughly 875 kWh per month. At the current average residential rate of about 17.3 cents per kWh, that translates to roughly $150 per month, though rates vary widely by state and season.
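As a back-of-the-envelope check on those averages (using only the figures above; your own rate and usage will differ):

```python
annual_kwh = 10_500      # average US household consumption per year
rate_per_kwh = 0.173     # average residential rate, in dollars per kWh

monthly_kwh = annual_kwh / 12
monthly_bill = monthly_kwh * rate_per_kwh

print(f"{monthly_kwh:.0f} kWh per month")   # 875 kWh per month
print(f"${monthly_bill:.2f} per month")     # roughly $151 per month
```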

Where all that electricity goes breaks down roughly like this:

  • Air conditioning: about 16% of total home electricity use
  • Refrigerators: about 14%
  • Space heating (electric): about 10%
  • Water heating: about 9%
  • Lighting: about 9%
  • Clothes dryers: about 6%

Heating and cooling together account for more than a quarter of the typical home’s electricity. That share climbs even higher in regions with extreme summers or winters, or in all-electric homes that don’t use natural gas for heating.
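Summing the shares from the list above bears this out (a quick check on the same figures, not new data):

```python
shares = {
    "air conditioning": 0.16,
    "refrigerators": 0.14,
    "space heating": 0.10,
    "water heating": 0.09,
    "lighting": 0.09,
    "clothes dryers": 0.06,
}

heating_and_cooling = shares["air conditioning"] + shares["space heating"]
print(f"{heating_and_cooling:.0%} of home electricity use")  # 26%, more than a quarter
```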

Why Consumption Fluctuates

Electricity use varies with the weather more than any other single factor. Changes in temperature and humidity directly drive demand for heating and cooling. The residential sector shows the largest seasonal swings, with noticeable spikes every summer (air conditioning) and winter (electric heating). Even homes that heat with natural gas or oil still use electricity to power furnace fans, circulation pumps, and other components.

Beyond weather, household size, building insulation, the age and efficiency of your appliances, and simple habits like leaving lights on all play a role. A poorly insulated home in Texas will consume far more electricity in July than a well-sealed home in a mild coastal climate.

Industrial electricity demand, by contrast, stays relatively flat throughout the year. Factories don’t need much heating and cooling compared to their total energy use, so economic conditions like production volume matter more than the season.

How National Consumption Breaks Down

In the United States, electricity consumption divides among three major sectors of roughly comparable size. Residential customers (single-family homes and apartment buildings) account for more than a third of national electricity use, commercial customers (offices, retail stores, hospitals, schools) also for more than a third, and industrial users (manufacturing, mining, agriculture) for somewhat less than a third.

Globally, electricity consumption surged in 2024, growing by nearly 1,100 terawatt-hours, a 4.3% increase over the prior year. That growth rate was almost double the recent average, driven largely by economic expansion and electrification of transportation and heating in many countries.

Peak Demand vs. Base Load

Electricity demand isn’t constant throughout the day. It follows a predictable cycle: lowest in the late night and early morning hours, rising through the workday, and typically peaking in the late afternoon or early evening. The floor that demand never drops below is called the base load, usually around 30 to 40% of the daily maximum. Power plants designed for base load run continuously to meet that minimum.
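Here’s a small illustration of reading a daily demand curve; the hourly megawatt figures are invented, chosen only to show the shape:

```python
# Hypothetical hourly demand for one day, in megawatts (invented for illustration).
hourly_demand_mw = [
    560, 545, 540, 550, 570, 640,        # overnight trough
    760, 900, 1020, 1100, 1160, 1210,    # rising through the workday
    1260, 1320, 1390, 1450, 1500, 1480,  # late-afternoon peak
    1400, 1280, 1100, 940, 780, 650,     # evening decline
]

base_load = min(hourly_demand_mw)
peak = max(hourly_demand_mw)
print(f"base load: {base_load} MW, peak: {peak} MW")
print(f"base load is {base_load / peak:.0%} of the daily maximum")  # 36%
```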

The extra demand above that floor, especially during the hottest summer afternoons or coldest winter mornings, is handled by “peaking” plants that can start up and shut down quickly. This is why electricity rates in some areas are higher during peak hours. If your utility offers time-of-use pricing, shifting energy-intensive tasks like laundry or dishwashing to off-peak hours can lower your bill.
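For instance, under a hypothetical time-of-use tariff (all the numbers below are assumptions for illustration, not actual utility rates), shifting one dryer load per day off peak adds up:

```python
dryer_kwh_per_load = 3.0    # assumed energy per dryer cycle
peak_rate = 0.28            # hypothetical on-peak price per kWh
off_peak_rate = 0.12        # hypothetical off-peak price per kWh
loads_per_month = 30

savings = dryer_kwh_per_load * (peak_rate - off_peak_rate) * loads_per_month
print(f"monthly savings: ${savings:.2f}")   # $14.40
```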

The Growing Role of Data Centers

One of the fastest-growing sources of electricity demand is data centers, fueled by the expansion of cloud computing and artificial intelligence. U.S. data centers consumed about 76 terawatt-hours in 2018, roughly 1.9% of the nation’s total electricity use. By 2023, that figure had more than doubled to 176 TWh, or 4.4% of total consumption.
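Those two data points imply a steep compound growth rate, which a quick calculation (using only the figures above) makes explicit:

```python
twh_2018, twh_2023 = 76, 176
years = 2023 - 2018

annual_growth = (twh_2023 / twh_2018) ** (1 / years) - 1
print(f"{annual_growth:.1%} compound annual growth")   # ~18.3% per year
```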

Projections from the Lawrence Berkeley National Laboratory estimate data center demand could reach 325 to 580 TWh by 2028, representing 6.7 to 12% of projected national electricity use. Some estimates looking further out to 2030 range from 200 TWh to over 1,000 TWh, depending on how quickly AI adoption scales. In parts of the country, this demand is already outpacing available grid capacity, pushing companies to delay projects or contract power directly from private producers.

For consumers, this matters because rising total demand can put upward pressure on electricity prices and strain the grid during peak periods, particularly in regions where new generation capacity hasn’t kept pace with growth.