To measure the amp-hour (Ah) capacity of a battery, you discharge it at a known, constant current and multiply that current by how long the battery lasts before hitting its cutoff voltage. A 100 Ah battery discharged at 5 amps should run for 20 hours; the actual runtime tells you how much capacity the battery really has left. The process is straightforward in concept, though getting an accurate number requires attention to discharge rate, temperature, and cutoff voltage.
The Core Formula
Amp-hours are simply amps multiplied by hours:
A (amps) × H (hours) = Ah (amp-hours)
If you draw 3 amps from a battery and it dies after 8 hours, the measured capacity is 24 Ah. That’s the entire calculation. The challenge isn’t the math; it’s controlling the variables so the number you get is meaningful and repeatable.
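As a quick sanity check, the arithmetic above can be scripted in a few lines (a minimal sketch; the function name is my own):

```python
def measured_capacity_ah(current_a: float, hours: float) -> float:
    """Measured capacity is simply discharge current times run time."""
    return current_a * hours

# 3 A draw lasting 8 hours -> 24 Ah, as in the example above
print(measured_capacity_ah(3.0, 8.0))   # 24.0
# 5 A draw lasting 20 hours -> 100 Ah
print(measured_capacity_ah(5.0, 20.0))  # 100.0
```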
Why Discharge Rate Matters
Batteries deliver less total energy when drained quickly. This is called the rate-capacity effect, and it’s one of the most important things to understand before you start testing. A battery rated at 100 Ah won’t actually deliver 100 amps for a full hour. At high currents, internal resistance causes more energy to be lost as heat, and the voltage drops to the cutoff point sooner than expected.
Manufacturers account for this by rating batteries at a specific C-rate. The C-rate describes how fast a battery is discharged relative to its capacity. A rate of C/20 (also written as 0.05C) means dividing the rated capacity by 20 to get the test current. For a 100 Ah battery, that’s 5 amps over 20 hours. Lead-acid and alkaline batteries are commonly rated at C/20 because it produces a realistic, repeatable capacity number. If you test at a higher current, you’ll measure fewer amp-hours than the label claims, and neither you nor the battery is wrong.
To compare your result to the manufacturer’s rating, match their discharge rate. Check the battery’s datasheet for the rated C-rate and use that same current for your test.
Choosing the Right Cutoff Voltage
You can’t discharge a battery to zero volts without damaging it. Every battery chemistry has a cutoff voltage, the point where the test should stop. Draining below this threshold can permanently reduce the battery’s capacity or destroy it entirely.
For a 12V lead-acid battery, the typical 0% state-of-charge voltage is around 10.5V. A 12V lithium battery reaches empty at roughly 10.0V. These values vary slightly by manufacturer and specific chemistry, so check your battery’s datasheet for the recommended discharge floor. Setting the wrong cutoff will either undercount your capacity (stopping too early) or harm the battery (stopping too late).
Temperature Changes Your Results
Battery capacity drops in cold weather. The standard rating temperature is 25°C (77°F), and capacity falls by roughly 0.9% for each degree Celsius below that. Commonly published correction factors for lead-acid batteries are about 1.30 at 4°C (40°F) and 1.19 at 10°C (50°F): multiply your measured capacity by the factor to compare it to the rated value.
If you’re testing to see whether a battery still meets its spec, do it indoors at room temperature. If you’re testing to see how a battery performs in real-world cold conditions, test it in those conditions, but understand the number will be lower than the label and that’s expected. Capacity is generally not corrected for temperatures above 25°C.
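Applying a correction factor is a single multiplication. The sketch below uses the lead-acid factors quoted above; treat them as ballpark values and the example capacity as invented, not as datasheet numbers:

```python
# Correction factors quoted in the text (approximate, lead-acid):
# multiply the measured Ah by the factor to compare with the 25 C rating.
CORRECTION_FACTORS = {4: 1.30, 10: 1.19, 25: 1.00}

measured_ah = 80.0  # capacity measured at 4 C (invented example value)
ambient_c = 4
corrected = measured_ah * CORRECTION_FACTORS[ambient_c]
print(round(corrected, 1))  # 104.0 -> the 25 C equivalent capacity
```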
Methods for Measuring Capacity
Manual Method With a Multimeter and Load
The simplest approach uses a known resistive load (like a power resistor or a 12V light bulb), a multimeter, and a clock. Connect the load to the battery, note the current draw with the multimeter, and monitor the voltage as it drops. When the voltage reaches the cutoff threshold, stop the timer and multiply amps by hours.
A multimeter tells you voltage and current but won’t track capacity on its own. You’re the data logger in this setup, which means checking in periodically and recording readings. It works for a rough estimate, but the current through a simple resistor drifts as voltage drops, making the calculation less precise. For a 20-hour test, this gets tedious.
Dedicated Battery Capacity Testers
A battery tester applies a controlled load and does the measurement automatically. It puts a load on the battery, monitors voltage throughout the discharge, and reports remaining capacity as a percentage of the battery’s rated spec. Unlike a multimeter, which only gives you a voltage snapshot, a tester tells you whether the battery can actually deliver the power it’s supposed to.
Commercial capacity testers range from inexpensive USB testers for small lithium cells (under $20) to professional bench instruments for large battery banks. Most handle the cutoff voltage, timing, and calculation for you.
Smart Battery Shunts
For batteries in an installed system (RVs, solar setups, boats), a smart shunt provides ongoing capacity monitoring without running a dedicated discharge test. The shunt installs in-line with the main negative battery cable and measures all current flowing in and out of the battery bank. It tracks cumulative amp-hours consumed and calculates state of charge in real time, typically accurate to within a few percentage points when properly calibrated.
Installation is straightforward for anyone comfortable with basic 12V wiring. The shunt connects between the battery’s negative terminal and the rest of the system, with a small wire harness for the display or Bluetooth module. This approach doesn’t give you a single lab-quality Ah number, but it gives you continuous, practical capacity data while the system is in use.
DIY Arduino-Based Testers
If you want precise, automated measurements on a budget, you can build a capacity tester with an Arduino microcontroller, an LCD screen, a power resistor mounted on a heatsink, and a transistor to control the load. The Arduino reads voltage across the resistor, calculates current, tracks elapsed time, and stops the test when voltage hits the cutoff. You input the battery’s rated capacity, desired test duration (typically 20 hours), and end-voltage threshold. The result is stored in memory so it survives a power cycle. Expect a parts cost under $30 and a few hours of soldering.
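The control loop such a tester runs is easy to model. The sketch below is a Python simulation of that logic, not Arduino code (the real device would be programmed in C++); the toy battery model, names, and numbers are all my own:

```python
class SimBattery:
    """Toy battery model: voltage sags linearly as charge is drawn."""
    def __init__(self, full_v=12.6, empty_v=10.5, true_ah=20.0):
        self.v_full, self.v_empty, self.true_ah = full_v, empty_v, true_ah
        self.drawn_ah = 0.0

    def voltage(self):
        frac = min(self.drawn_ah / self.true_ah, 1.0)
        return self.v_full - (self.v_full - self.v_empty) * frac

    def draw(self, amps, dt_s):
        self.drawn_ah += amps * dt_s / 3600

def run_capacity_test(batt, load_a, cutoff_v, dt_s=60.0, max_s=48 * 3600):
    """Core tester loop: hold a constant load, accumulate amp-hours,
    and stop the moment voltage reaches the cutoff threshold."""
    ah, t = 0.0, 0.0
    while t < max_s and batt.voltage() > cutoff_v:
        batt.draw(load_a, dt_s)
        ah += load_a * dt_s / 3600
        t += dt_s
    return ah

# A 20 Ah (simulated) battery at a 2 A load measures ~20 Ah
print(round(run_capacity_test(SimBattery(), load_a=2.0, cutoff_v=10.5), 1))  # 20.0
```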
Getting an Accurate Reading
Start with a fully charged battery. If the battery isn’t at 100% state of charge, your measured Ah will be lower than the true capacity, and you’ll have no way to tell whether the battery is degraded or just wasn’t full when you started. For lead-acid batteries, a resting voltage of roughly 12.6V or higher indicates a full charge. Let the battery sit for at least a few hours after charging before testing, since surface charge can inflate the voltage reading.
Use a constant-current load if possible. Simple resistors let the current decrease as the battery voltage drops during discharge, which complicates the calculation. An electronic load or a transistor-controlled circuit (like the Arduino setup) maintains steady current throughout the test, giving you a clean amps-times-hours result. If you’re stuck with a fixed resistor, log voltage at regular intervals and calculate the average current for each interval, then sum the amp-hours across all intervals.
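The interval-summing workaround for a fixed resistor can be sketched as follows; the logged readings here are invented for illustration:

```python
# (time_h, measured_current_a) log entries from a fixed-resistor test;
# the current drifts down as the battery voltage sags. Values are invented.
log = [(0.0, 3.00), (2.0, 2.90), (4.0, 2.78), (6.0, 2.65), (8.0, 2.50)]

total_ah = 0.0
for (t0, i0), (t1, i1) in zip(log, log[1:]):
    avg_current = (i0 + i1) / 2          # average current over the interval
    total_ah += avg_current * (t1 - t0)  # amp-hours for this interval
print(round(total_ah, 2))  # 22.16
```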
Record the ambient temperature. Even a few degrees can shift your result by several percent. Note it alongside your Ah measurement so you can apply the correction factor if needed.
For batteries you’re evaluating for replacement, a measured capacity below 80% of the rated value is a common threshold for “end of useful life” in most applications. A 100 Ah battery measuring 75 Ah has lost a quarter of its capacity and will continue degrading with each charge cycle.
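The end-of-life comparison above is a simple ratio check (function name is my own):

```python
def state_of_health(measured_ah: float, rated_ah: float) -> float:
    """Measured capacity as a fraction of the rated capacity."""
    return measured_ah / rated_ah

# 100 Ah battery measuring 75 Ah is below the common 80% threshold
soh = state_of_health(75, 100)
print(soh, soh < 0.80)  # 0.75 True
```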

