Battery state of charge (SoC) tells you how much energy remains in a battery as a percentage, from 0% (empty) to 100% (full). There’s no single formula that works for every situation. Instead, there are several methods, each with different tradeoffs between simplicity and accuracy. The right approach depends on your battery chemistry, how precise you need to be, and whether you’re building a battery management system or just monitoring a single cell.
The Voltage Method (OCV)
The simplest way to estimate SoC is by measuring voltage. Every battery has a predictable relationship between its open circuit voltage (OCV) and how much charge it holds. “Open circuit” means no load is connected, so no current is flowing in or out. You measure the resting voltage, then look it up on a chart specific to your battery chemistry to find the corresponding percentage.
For example, a lithium iron phosphate (LFP) cell reads about 3.6V at 100% SoC and drops to roughly 2.0V at 0%. A 12V lead-acid battery sits around 12.7V when fully charged and 11.9V when nearly empty. These voltage-to-percentage curves are unique to each chemistry, so you need the right lookup table for your specific battery type.
The catch is that this method only works well when the battery has been resting. Under load, voltage sags due to internal resistance, giving a falsely low reading while discharging (and, conversely, an inflated reading while charging). After the load is removed, it can take anywhere from a few minutes to over 45 minutes for the voltage to stabilize at its true open circuit value. You can improve accuracy by compensating the voltage reading with a correction term based on the current flowing and the battery’s temperature, but for most people the practical rule is simple: let the battery rest before trusting a voltage reading.
There’s another limitation that trips people up. Some chemistries, especially LFP, have a very flat voltage curve through the middle of their charge range. The voltage barely changes between roughly 20% and 80% SoC, which means small measurement errors translate into large SoC errors in that range. For chemistries with flatter curves, voltage alone is a poor estimator during normal use.
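To make the lookup concrete, here is a minimal Python sketch using linear interpolation between table points. The voltage values below are illustrative placeholders for a hypothetical LFP cell, not calibrated data; use the measured curve for your own battery. Note how the flat middle of the table compresses a 60-point SoC span into a few hundredths of a volt.

```python
# (open-circuit voltage, SoC fraction) pairs for a hypothetical LFP cell.
# Illustrative values only -- substitute the curve for your specific cell.
OCV_TABLE = [
    (2.50, 0.00),
    (3.20, 0.10),
    (3.25, 0.20),
    (3.30, 0.50),  # the flat middle of the LFP curve
    (3.33, 0.80),
    (3.40, 0.95),
    (3.60, 1.00),
]

def soc_from_ocv(voltage):
    """Linearly interpolate SoC from a resting (open-circuit) voltage."""
    if voltage <= OCV_TABLE[0][0]:
        return 0.0
    if voltage >= OCV_TABLE[-1][0]:
        return 1.0
    # Find the table segment the voltage falls in and interpolate within it.
    for (v_lo, s_lo), (v_hi, s_hi) in zip(OCV_TABLE, OCV_TABLE[1:]):
        if v_lo <= voltage <= v_hi:
            frac = (voltage - v_lo) / (v_hi - v_lo)
            return s_lo + frac * (s_hi - s_lo)
```

Between 3.25V and 3.33V this table spans 20% to 80% SoC, so a measurement error of a few millivolts moves the estimate by several points, which is exactly the flat-curve problem described above.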
Coulomb Counting
Coulomb counting tracks SoC by measuring how much current flows in and out of the battery over time. Think of it like a fuel gauge that watches how much fuel you pump in and how much you burn, rather than checking the tank level directly. The core formula is:
SoC(t) = SoC(t₀) + (η / Q) × ∫ I(t) dt

where η is the charge efficiency, Q is the battery capacity in amp-hours, and I(t) is the current at time t.
In more precise terms: you start with a known SoC value, then continuously measure current (in amps) and integrate it over time. Dividing by the battery’s rated capacity in amp-hours gives you the change in SoC as a fraction. A positive current (charging) increases SoC, and a negative current (discharging) decreases it. The efficiency factor accounts for the fact that batteries don’t return 100% of the energy you put in. Charging efficiency for lithium-ion cells is typically around 95-99%, while lead-acid batteries can lose more.
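In code, the running integral reduces to a few lines. This is a minimal sketch, not a full gauge: the sign convention and the fixed charging efficiency are assumptions, and a real BMS would also correct for sensor offset and temperature.

```python
class CoulombCounter:
    """Minimal coulomb-counting SoC tracker (a sketch, not a full BMS).

    Sign convention: positive current = charging, negative = discharging.
    """

    def __init__(self, capacity_ah, soc_initial, charge_efficiency=0.98):
        self.capacity_ah = capacity_ah   # rated capacity in amp-hours
        self.soc = soc_initial           # fraction, 0.0 .. 1.0
        self.eta = charge_efficiency     # applied only while charging

    def update(self, current_a, dt_s):
        """Integrate one current sample over a timestep of dt_s seconds."""
        eff = self.eta if current_a > 0 else 1.0
        delta_ah = eff * current_a * dt_s / 3600.0    # A*s -> Ah
        self.soc += delta_ah / self.capacity_ah
        self.soc = min(max(self.soc, 0.0), 1.0)       # clamp to valid range
        return self.soc
```

For example, discharging a 2.5 Ah cell at 1 A for 15 minutes removes 0.25 Ah, dropping the SoC estimate by 10 points.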
This method works in real time, even under heavy load, which is its biggest advantage over the voltage method. Battery management systems (BMS) in electric vehicles and power tools use coulomb counting as their primary tracking method for exactly this reason.
The downside is error accumulation. Every small inaccuracy in your current sensor adds up over time. If your sensor is off by even 1%, that error compounds with every charge and discharge cycle, and the estimated SoC gradually drifts away from reality. This is why coulomb counting always needs periodic recalibration, usually by cross-referencing with a voltage reading when the battery is at rest or when it reaches a known full or empty state.
Why You Need Both Methods Together
In practice, the most reliable SoC estimation combines voltage and coulomb counting. A well-designed system uses coulomb counting during active use to track changes in real time, then recalibrates using the OCV-to-SoC relationship whenever the battery rests long enough for its voltage to stabilize. Research on battery algorithms has found that if the battery has been at rest for more than about 45 minutes and the voltage is within operational limits, the OCV method can reset the SoC estimate and clear out any drift that accumulated during coulomb counting.
This hybrid approach is what most commercial battery management systems use. Your phone, laptop, and electric car all do some version of this behind the scenes. It’s also why fully charging or fully discharging a device occasionally can “recalibrate” the battery gauge: it gives the system a known reference point to correct its running estimate.
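A minimal sketch of that hybrid logic might look like the following. The 45-minute rest window comes from the research finding above; the rest-current threshold and the `ocv_to_soc` lookup function are assumptions you would replace with values calibrated for your own cell, and charge efficiency is omitted for brevity.

```python
REST_THRESHOLD_A = 0.05    # below this, treat the battery as "at rest" (assumed)
REST_TIME_S = 45 * 60      # rest period before trusting OCV

class HybridSocEstimator:
    """Coulomb counting with OCV recalibration after long rests (a sketch).

    `ocv_to_soc` is a chemistry-specific function mapping a resting
    voltage to an SoC fraction, supplied by the caller.
    """

    def __init__(self, capacity_ah, soc_initial, ocv_to_soc):
        self.capacity_ah = capacity_ah
        self.soc = soc_initial
        self.ocv_to_soc = ocv_to_soc
        self.rest_time_s = 0.0

    def update(self, current_a, voltage_v, dt_s):
        if abs(current_a) < REST_THRESHOLD_A:
            self.rest_time_s += dt_s
        else:
            self.rest_time_s = 0.0
            # Active use: track the change by coulomb counting.
            self.soc += current_a * dt_s / 3600.0 / self.capacity_ah
            self.soc = min(max(self.soc, 0.0), 1.0)
        if self.rest_time_s >= REST_TIME_S:
            # Long rest: terminal voltage approximates OCV, so reset
            # the estimate and clear any accumulated drift.
            self.soc = self.ocv_to_soc(voltage_v)
        return self.soc
```

The key design point is that the OCV reset only fires after a sustained rest, so a brief pause at a traffic light (for an EV) or between tool triggers never corrupts the running estimate with a sagging or recovering voltage.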
Temperature and Discharge Rate Effects
A battery’s usable capacity changes with temperature. Cold temperatures increase internal resistance and reduce the amount of energy you can actually extract. Batteries generally perform more efficiently at higher temperatures (within their safe operating range). If you’re calculating SoC in an environment with significant temperature swings, you need to account for this.
For precise systems, researchers model the capacity loss as a function of both discharge rate and temperature. One approach uses a linear correction: measure the capacity loss at two known temperatures (say 25°C and 35°C), then interpolate for any temperature in between. For most DIY applications, the practical takeaway is that your battery will read lower SoC in cold weather not because it lost charge, but because less of its stored energy is accessible at that temperature.
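That two-point correction is short enough to show directly. The 25°C and 35°C capacities below are hypothetical measurements, and the linear model is only an assumption that holds near those two points, not across the battery's full temperature range.

```python
def capacity_at_temperature(cap_25c_ah, cap_35c_ah, temp_c):
    """Linearly interpolate usable capacity from two measured points.

    cap_25c_ah and cap_35c_ah are capacities measured at 25 C and 35 C.
    The linear fit is an assumption; extrapolating far outside the
    measured range (e.g. below freezing) will be inaccurate.
    """
    slope = (cap_35c_ah - cap_25c_ah) / (35.0 - 25.0)   # Ah per degree C
    return cap_25c_ah + slope * (temp_c - 25.0)
```

With a hypothetical 100 Ah reading at 25°C and 102 Ah at 35°C, the model predicts 101 Ah at 30°C; the corrected value then replaces the rated capacity in the SoC denominator.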
Discharge rate matters too, especially for lead-acid batteries. Peukert’s law describes this relationship: the faster you drain a battery, the less total capacity you get. The formula is:
Cp = I^k × t

where Cp is the Peukert capacity (a constant for a given battery), I is the discharge current in amps, t is the discharge time in hours, and k is the Peukert constant, a value unique to each battery (typically between 1.1 and 1.4 for lead-acid). When k equals 1, discharge rate doesn’t matter. When it’s higher, fast discharge significantly reduces usable capacity. So if your 100Ah lead-acid battery is rated at a 20-hour discharge rate (5 amps), drawing 20 amps won’t give you 5 hours of runtime. It will be noticeably less. Lithium-ion batteries have a Peukert constant much closer to 1, which is one reason they hold up better under heavy loads.
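Rearranged for runtime, Peukert’s law gives t = H × (C / (I × H))^k, where C is the capacity at the rated H-hour discharge. A small sketch, with an assumed mid-range k for lead-acid:

```python
def peukert_runtime_h(capacity_ah, rated_hours, current_a, k):
    """Estimated runtime under Peukert's law.

    capacity_ah : capacity at the rated discharge time (e.g. 100 Ah at 20 h)
    rated_hours : the discharge time the rating assumes (e.g. 20)
    current_a   : the actual discharge current in amps
    k           : Peukert constant (~1.1-1.4 for lead-acid, ~1.0 for Li-ion)
    """
    return rated_hours * (capacity_ah / (current_a * rated_hours)) ** k

# The 100 Ah / 20-hour example from the text, with an assumed k of 1.2:
runtime = peukert_runtime_h(100.0, 20.0, 20.0, 1.2)
```

At the rated 5 A the formula returns the full 20 hours, as it should; at 20 A it predicts roughly 3.8 hours rather than the naive 5, which is the "noticeably less" the text warns about.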
How Battery Aging Changes the Calculation
Every SoC calculation uses the battery’s total capacity as the denominator. As batteries age, that capacity shrinks. A lithium-ion cell rated at 3,000 mAh when new might only hold 2,400 mAh after a few hundred cycles. If your system still uses 3,000 mAh as the reference, it will consistently overestimate how much charge remains.
This is where state of health (SOH) comes in. SOH represents the battery’s current maximum capacity as a percentage of its original capacity. To get accurate SoC over the life of a battery, you need to periodically update the capacity value in your calculation. One practical way to do this: fully charge the battery, then fully discharge it while counting coulombs. The total amp-hours extracted is your current actual capacity. Divide that by the original rated capacity, and you have your SOH. Then use that updated capacity for all future SoC calculations.
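The arithmetic is simple enough to show directly; the 2.4 Ah and 3.0 Ah figures below are the hypothetical values from the aging example above.

```python
def state_of_health(measured_capacity_ah, rated_capacity_ah):
    """SOH as a fraction of the original rated capacity."""
    return measured_capacity_ah / rated_capacity_ah

# Full charge, then full discharge while counting coulombs:
# 2.4 Ah extracted from a cell rated at 3.0 Ah when new.
soh = state_of_health(2.4, 3.0)          # 0.8, i.e. 80% SOH
effective_capacity = soh * 3.0           # use 2.4 Ah in future SoC math
```

From then on, every coulomb-counting step divides by the 2.4 Ah effective capacity rather than the original 3.0 Ah, which removes the systematic overestimate.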
More advanced systems estimate SOH automatically by analyzing patterns in charging behavior, like how long the battery spends in the constant-voltage phase of charging. This lets them adjust the SoC calculation on the fly without requiring a full discharge cycle.
Advanced Estimation With Kalman Filters
For applications where high accuracy is critical, such as electric vehicles or grid-scale energy storage, engineers use algorithms called Kalman filters. The most common variant for batteries is the extended Kalman filter (EKF).
The idea is straightforward even if the math is complex. A Kalman filter takes two imperfect sources of information: a mathematical model of how the battery should behave, and noisy real-world measurements from voltage and current sensors. It continuously blends these two sources, weighting each one based on how trustworthy it is at that moment. The result is an SoC estimate that’s more accurate than either source alone.
Think of it as a smarter version of the hybrid voltage-plus-coulomb-counting approach. Instead of only recalibrating during rest periods, a Kalman filter corrects its estimate constantly, even during active use. When the battery model is well-tuned to the specific cell chemistry, these filters achieve SoC accuracy within 1-2% error. The tradeoff is computational cost and the need for an accurate battery model, which is why this approach lives in dedicated BMS chips rather than simple monitoring setups.
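As an illustration only, here is a heavily simplified one-state (scalar) Kalman filter. A production EKF models the cell with an equivalent circuit and a nonlinear OCV curve; in this sketch the state is just SoC, the prediction step is plain coulomb counting, the measurement is a voltage-derived SoC estimate, and the noise parameters q and r are placeholder values you would tune against real hardware.

```python
class ScalarSocKalman:
    """One-state Kalman filter for SoC (simplified illustration).

    Prediction = coulomb counting; measurement = SoC inferred from
    voltage. q and r are assumed noise variances, not tuned values.
    """

    def __init__(self, capacity_ah, soc_initial, q=1e-7, r=1e-2):
        self.capacity_ah = capacity_ah
        self.x = soc_initial   # state estimate (SoC fraction)
        self.p = 1e-2          # estimate variance
        self.q = q             # process noise (current-sensor drift per step)
        self.r = r             # measurement noise (voltage-based SoC error)

    def step(self, current_a, soc_from_voltage, dt_s):
        # Predict: propagate SoC forward by coulomb counting.
        self.x += current_a * dt_s / 3600.0 / self.capacity_ah
        self.p += self.q
        # Update: blend in the noisy voltage-derived SoC.
        k = self.p / (self.p + self.r)        # Kalman gain, 0..1
        self.x += k * (soc_from_voltage - self.x)
        self.p *= (1.0 - k)
        return self.x
```

The gain k does the weighting described above: when the model's variance p is large relative to the measurement noise r, the filter leans on the voltage reading; as confidence in the state grows, p shrinks and the filter trusts its own prediction more.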
Choosing the Right Method
- Voltage lookup alone works for quick, rough estimates when the battery is at rest. Good for checking a car battery with a multimeter or monitoring a solar battery bank overnight. Accuracy drops during use and with flat-curve chemistries like LFP.
- Coulomb counting is best for real-time tracking during active charge and discharge. Requires a current sensor (shunt resistor or Hall effect sensor) and a microcontroller or BMS. Needs periodic recalibration to prevent drift.
- Combined voltage and coulomb counting is the practical standard for most battery management systems. Handles both active use and rest periods well, with periodic OCV-based correction keeping drift in check.
- Kalman filter algorithms are the gold standard for precision applications. Require more processing power and a well-characterized battery model, but deliver the highest accuracy across varying conditions.
For most hobbyist and DIY projects, a combined approach using coulomb counting with OCV recalibration at known charge states will get you within a few percent of the true SoC. The key is using the correct capacity value for your battery’s current age and accounting for temperature if you’re operating outside the 20-30°C range.

