Battery capacity is measured in ampere-hours (Ah) or milliampere-hours (mAh), which tell you how much current a battery can deliver over a set period of time. A battery rated at 5,000 mAh can theoretically supply 5,000 milliamps for one hour, or 1,000 milliamps for five hours, before it’s fully drained. But that simple rating hides a lot of nuance: the rate at which you drain the battery, the temperature, and even the age of the cell all change the number you actually get.
Ampere-Hours vs. Watt-Hours
The two main units you’ll see on battery labels are ampere-hours (Ah) and watt-hours (Wh). Ampere-hours measure charge, meaning how much current flows over time. Watt-hours measure energy, meaning how much total work the battery can do. The relationship between them is straightforward: Wh = V × Ah. A 3.7V lithium cell rated at 3 Ah stores 11.1 Wh of energy.
This distinction matters when you’re comparing batteries that run at different voltages. Two batteries can have the same Ah rating but store very different amounts of energy if their voltages differ. Watt-hours give you a true apples-to-apples comparison of total energy. That’s why electric vehicles and home solar systems are typically rated in kWh (kilowatt-hours) rather than Ah.
For small consumer electronics like phones and earbuds, you’ll almost always see milliampere-hours (mAh), which is simply one-thousandth of an Ah. A 4,500 mAh phone battery is the same as 4.5 Ah.
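If you want to sanity-check a label yourself, both conversions are one-liners. Here’s a quick Python sketch (the 3.85V phone-cell voltage is an assumed typical value, not from any particular spec sheet):

```python
def watt_hours(voltage_v: float, capacity_ah: float) -> float:
    """Convert charge capacity (Ah) to stored energy (Wh): Wh = V x Ah."""
    return voltage_v * capacity_ah

# The example from above: a 3.7V lithium cell rated at 3 Ah.
print(watt_hours(3.7, 3.0))            # 11.1 Wh

# mAh is one-thousandth of an Ah, so a 4,500 mAh phone battery is 4.5 Ah.
# Assuming a typical 3.85V nominal cell voltage:
print(watt_hours(3.85, 4500 / 1000))   # ~17.3 Wh
```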
How Capacity Is Tested
The standard way to measure a battery’s capacity is a controlled discharge test. The battery starts fully charged, then a constant current is drawn from it while instruments record voltage, current, and time. The discharge continues until the battery hits a specific minimum voltage, called the cutoff voltage. The discharge current multiplied by the elapsed time gives you the capacity in ampere-hours.
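Conceptually, capacity is just current integrated over time. Here’s a minimal Python sketch of that bookkeeping, using invented log data from a hypothetical constant-current rig:

```python
# Hypothetical log from a discharge test: (elapsed_seconds, current_amps, voltage_volts).
# The test stops when the battery hits its cutoff voltage (3.0V for most lithium-ion).
CUTOFF_V = 3.0

samples = [
    (0,    2.0, 4.20),
    (1800, 2.0, 3.85),
    (3600, 2.0, 3.62),
    (5400, 2.0, 3.31),
    (7200, 2.0, 3.00),  # cutoff reached
]

charge_as = 0.0  # accumulated charge, in ampere-seconds
for (t0, i0, _), (t1, i1, v1) in zip(samples, samples[1:]):
    # Trapezoidal integration of current over each interval.
    charge_as += 0.5 * (i0 + i1) * (t1 - t0)
    if v1 <= CUTOFF_V:
        break

print(f"Measured capacity: {charge_as / 3600:.2f} Ah")  # 2 A for 2 hours -> 4.00 Ah
```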
For most lithium-ion chemistries (the type in phones, laptops, and EVs), the cutoff voltage is 3.0V per cell. Lithium iron phosphate cells use a lower cutoff of 2.5V, while lithium titanate cells bottom out at 1.8V. Draining below these thresholds risks permanent damage, so the cutoff defines the usable capacity.
Throughout the test, the discharge current must stay constant. As a battery drains, its voltage drops, which naturally changes how current flows through a fixed load. Industrial test setups continuously adjust the load resistance to keep the current steady. This is why you can’t just hook a battery to a lightbulb and call it a capacity test.
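The adjustment itself is just Ohm’s law applied in a feedback loop: given the battery’s sagging voltage and the target current, compute the load resistance that holds the current steady. A simplified sketch:

```python
def required_load_ohms(battery_voltage_v: float, target_current_a: float) -> float:
    """Ohm's law: the load resistance needed to draw the target current
    at the battery's present (sagging) voltage."""
    return battery_voltage_v / target_current_a

# As a lithium cell sags from 4.2V toward its 3.0V cutoff, the load must
# shrink to hold a constant 2 A draw:
for v in (4.2, 3.7, 3.3, 3.0):
    print(f"{v:.1f} V -> {required_load_ohms(v, 2.0):.2f} ohms")
```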
Why Discharge Speed Changes the Result
A battery’s rated capacity assumes a specific discharge rate, expressed as a “C-rate.” A 1C rate means drawing the full rated capacity in one hour. A 0.5C rate takes two hours. A 2C rate tries to drain it in 30 minutes. The faster you pull energy out, the less total capacity you actually get.
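Converting a C-rate to an actual current draw is a single multiplication. A short sketch, using a hypothetical 5 Ah pack:

```python
def discharge_current_a(capacity_ah: float, c_rate: float) -> float:
    """Current implied by a given C-rate: 1C drains the rated capacity in one hour."""
    return capacity_ah * c_rate

capacity = 5.0  # Ah, hypothetical pack
for c in (0.5, 1.0, 2.0):
    current = discharge_current_a(capacity, c)
    hours = 1.0 / c  # nominal time to empty at this rate
    print(f"{c}C -> {current:.1f} A, nominally empty in {hours * 60:.0f} minutes")
```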
This happens because high current creates more heat and greater stress on the battery’s internal chemistry. At a 3C discharge rate, one study found internal resistance increased by as much as 27.7% compared to slower rates. That extra resistance wastes energy as heat rather than delivering it to your device. Over many cycles, high-rate discharges also accelerate permanent capacity loss through mechanical damage to the battery’s active materials.
For lead-acid batteries (the type in cars and backup power systems), this effect is especially pronounced and is described by Peukert’s Law. The equation uses a constant unique to each battery to predict how much capacity you’ll actually get at a given discharge rate. A lead-acid battery rated at 100 Ah at a slow 20-hour discharge rate might deliver only 60-70 Ah if you try to drain it in one hour. Lithium-ion batteries handle high discharge rates better, but the effect still exists.
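Peukert’s Law is usually written as t = H × (C ÷ (I × H))^k, where C is the rated capacity at the rated discharge time H, I is the actual current, and k is the Peukert exponent (typically around 1.1 to 1.3 for lead-acid). Here’s a sketch reproducing the example above, with k = 1.15 chosen as an illustrative value:

```python
def peukert_runtime_h(rated_ah: float, rated_hours: float,
                      current_a: float, k: float) -> float:
    """Peukert's Law: actual runtime t = H * (C / (I * H)) ** k."""
    return rated_hours * (rated_ah / (current_a * rated_hours)) ** k

# 100 Ah lead-acid battery rated at the 20-hour rate (a 5 A draw).
# k = 1.15 is an assumed, illustrative Peukert exponent.
rated_ah, rated_hours, k = 100.0, 20.0, 1.15

for current in (5.0, 100.0):  # the rated 20-hour draw vs. a one-hour-rate draw
    t = peukert_runtime_h(rated_ah, rated_hours, current, k)
    print(f"{current:.0f} A -> {t:.2f} h, delivering {current * t:.0f} Ah")
# At 100 A the battery delivers only ~64 Ah, in line with the 60-70 Ah above.
```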
Temperature Makes a Big Difference
Cold weather is the enemy of battery capacity. Testing shows that a lithium-ion battery loses about 15% of its measured capacity at roughly -10°C (14°F) compared to room temperature. At -20°C (-4°F), that loss jumps to around 35%. The chemical reactions inside the cell simply slow down in the cold, reducing how much energy you can extract before hitting the cutoff voltage.
This is why your phone might die at 30% on a freezing day, or why an EV’s range drops noticeably in winter. The energy is technically still in the battery, but the chemistry can’t release it fast enough. Warming the battery back up restores most of that “lost” capacity. For accurate capacity testing, manufacturers run discharge tests at a standardized temperature, typically 25°C (77°F), so that results are comparable.
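You can turn the figures above into a rough derating curve with linear interpolation. This sketch uses only the data points quoted in this section; a real derating curve would be measured for the specific cell and chemistry:

```python
# Capacity retention vs. temperature, from the figures above:
# ~35% loss at -20 C, ~15% loss at -10 C, full capacity at the 25 C test standard.
POINTS = [(-20.0, 0.65), (-10.0, 0.85), (25.0, 1.00)]

def retention(temp_c: float) -> float:
    """Linearly interpolate capacity retention between the known points,
    clamping outside the measured range."""
    if temp_c <= POINTS[0][0]:
        return POINTS[0][1]
    if temp_c >= POINTS[-1][0]:
        return POINTS[-1][1]
    for (t0, r0), (t1, r1) in zip(POINTS, POINTS[1:]):
        if t0 <= temp_c <= t1:
            return r0 + (r1 - r0) * (temp_c - t0) / (t1 - t0)

print(f"Usable fraction at 0 C: {retention(0.0):.0%}")  # ~89%
```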
How Your Devices Estimate Capacity in Real Time
The battery percentage on your phone or laptop comes from a battery management system (BMS) that estimates remaining capacity without doing a full discharge test every time. The most common method is coulomb counting: the BMS measures the current flowing in and out of the battery continuously and integrates it over time. Think of it like a water meter tracking how much has flowed out of a tank.
The formula is simple in concept: start from a known charge level, then subtract the accumulated charge drawn (current integrated over time) as a fraction of the battery’s rated capacity. The result is the state of charge (SOC), shown as your battery percentage. In practice, small measurement errors accumulate over many cycles, which is why your phone occasionally needs to recalibrate. You might notice the last 10% draining faster or slower than expected, which is the BMS correcting its estimate against actual voltage readings.
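A stripped-down coulomb counter fits in a few lines. This sketch assumes periodic current samples from a sense resistor; real BMS firmware layers on temperature compensation and voltage-based recalibration:

```python
class CoulombCounter:
    """Minimal state-of-charge estimator: integrate current over time,
    expressed as a fraction of rated capacity."""

    def __init__(self, rated_ah: float, initial_soc: float = 1.0):
        self.rated_as = rated_ah * 3600.0   # rated capacity in ampere-seconds
        self.soc = initial_soc              # 1.0 = 100%

    def update(self, current_a: float, dt_s: float) -> float:
        """current_a > 0 means discharging; < 0 means charging."""
        self.soc -= (current_a * dt_s) / self.rated_as
        self.soc = min(max(self.soc, 0.0), 1.0)  # clamp accumulated error
        return self.soc

bms = CoulombCounter(rated_ah=4.5)   # a 4,500 mAh phone battery
for _ in range(60):                  # one hour of a 0.9 A draw, sampled every minute
    bms.update(current_a=0.9, dt_s=60.0)
print(f"Battery: {bms.soc:.0%}")     # 80%
```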
Other estimation methods include measuring the battery’s open circuit voltage (which correlates with charge level when the battery is at rest) and tracking internal resistance, which rises as a battery ages.
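The open-circuit-voltage method is essentially a lookup table built from the cell’s measured discharge curve. Here’s a sketch with invented (but plausibly shaped) points for a resting lithium-ion cell:

```python
# Hypothetical OCV-to-SOC table for a resting lithium-ion cell (volts, fraction).
# Real tables come from characterizing the specific cell chemistry.
OCV_TABLE = [(3.0, 0.0), (3.5, 0.10), (3.7, 0.40), (3.9, 0.70), (4.2, 1.0)]

def soc_from_ocv(volts: float) -> float:
    """Piecewise-linear lookup of state of charge from open-circuit voltage."""
    if volts <= OCV_TABLE[0][0]:
        return 0.0
    if volts >= OCV_TABLE[-1][0]:
        return 1.0
    for (v0, s0), (v1, s1) in zip(OCV_TABLE, OCV_TABLE[1:]):
        if v0 <= volts <= v1:
            return s0 + (s1 - s0) * (volts - v0) / (v1 - v0)

print(f"{soc_from_ocv(3.8):.0%}")  # ~55%
```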
Capacity Fades Over Time
Every rechargeable battery loses capacity with use. The industry tracks this with a metric called state of health (SoH), expressed as a percentage. A brand-new battery starts at 100% SoH, meaning its actual capacity matches the factory specification. Over hundreds of charge cycles, that number gradually drops. A battery is generally considered at end of life when SoH falls to 70-80%, depending on the application.
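In its simplest capacity-based form, SoH is just measured capacity divided by rated capacity. A quick sketch (the 80% end-of-life threshold here is one common choice, not a universal standard):

```python
def state_of_health_pct(measured_ah: float, rated_ah: float) -> float:
    """Capacity-based SoH: actual capacity as a percentage of the factory rating."""
    return 100.0 * measured_ah / rated_ah

soh = state_of_health_pct(measured_ah=3.4, rated_ah=4.5)  # an aged 4.5 Ah pack
print(f"SoH: {soh:.0f}%")                                 # 76%
print("End of life" if soh < 80.0 else "Still serviceable")
```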
There’s no single way to measure SoH. Battery management systems may factor in internal resistance, voltage behavior, self-discharge rate, charge acceptance, cycle count, age, and temperature history. Some combine several of these into a single health score. This is why two identical batteries with the same cycle count can show different health levels: one may have been stored in a hot car, while the other lived in a climate-controlled office.
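One plausible pattern for combining metrics is a weighted blend of normalized indicators. The sketch below is purely illustrative: the weights and inputs are invented, not drawn from any real BMS:

```python
def health_score(capacity_soh: float, resistance_growth: float,
                 cycle_fraction: float) -> float:
    """Illustrative composite health score in [0, 1]. Inputs are normalized:
    capacity_soh      - measured/rated capacity (1.0 = new)
    resistance_growth - internal resistance relative to new (1.0 = new, higher = worse)
    cycle_fraction    - cycles used / rated cycle life
    The weights are invented for illustration."""
    resistance_term = 1.0 / resistance_growth       # shrinks as resistance rises
    cycle_term = 1.0 - min(cycle_fraction, 1.0)     # shrinks as cycles accumulate
    return 0.6 * capacity_soh + 0.25 * resistance_term + 0.15 * cycle_term

# Same cycle count, different lives: one pack stored hot, one kept cool.
print(f"{health_score(0.82, 1.40, 0.5):.2f}")  # hot-stored pack: 0.75
print(f"{health_score(0.93, 1.10, 0.5):.2f}")  # cool-stored pack: 0.86
```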
Tools for Measuring Capacity
A standard multimeter can tell you a battery’s voltage, which gives a rough indication of charge level, but it cannot measure capacity. Voltage alone doesn’t reveal how long the battery will sustain that output under load.
A dedicated battery tester goes further. These devices put a controlled load on the battery, measure how it responds, and calculate remaining capacity. Higher-end testers let you select the battery chemistry and rated capacity, then run a diagnostic that reports both voltage and usable capacity. For hobbyists working with rechargeable cells, USB battery analyzers and hobby-grade chargers with discharge modes can run a full drain-and-measure cycle, giving you an actual Ah reading rather than an estimate.
Professional battery analyzers used in labs and manufacturing can cycle batteries hundreds of times at precisely controlled temperatures and C-rates, building a complete profile of how capacity changes under different conditions. These are the instruments behind the specs printed on the label.