State of charge (SoC) is the battery equivalent of a fuel gauge. It represents the amount of energy currently available in a battery, expressed as a percentage of its total rated capacity. A battery at 80% SoC has used 20% of its stored energy; one at 30% is 70% depleted. While the concept sounds simple, accurately measuring SoC is one of the trickiest problems in battery engineering, and understanding it matters whether you’re driving an electric vehicle, managing a phone battery, or sizing a solar storage system.
How SoC Is Calculated
SoC is defined as the available capacity of a battery, measured in amp-hours (Ah), divided by its rated capacity, then multiplied by 100 to get a percentage. A fully charged battery reads 100%. A fully depleted one reads 0%. The challenge is that you can’t simply peek inside a battery to see how much charge remains. Instead, battery management systems rely on indirect measurement techniques that each come with trade-offs.
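In code, the definition reduces to a single ratio. A minimal sketch, with function and variable names chosen for illustration rather than taken from any particular battery library:

```python
def state_of_charge(available_ah: float, rated_ah: float) -> float:
    """Available capacity as a percentage of rated capacity."""
    return 100.0 * available_ah / rated_ah

# A 100 Ah battery with 80 Ah still available reads 80% SoC.
print(state_of_charge(80.0, 100.0))  # 80.0
```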
The most common method is called coulomb counting. It works by continuously tracking how much current flows in and out of the battery over time, then comparing that to the battery’s known total capacity. Think of it like measuring how much water you’ve poured out of a jug. This approach works well over short periods, but small measurement errors accumulate with every reading. Over hours and days, those tiny inaccuracies stack up, causing the reported SoC to drift away from reality.
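Here is a minimal coulomb-counting sketch, with the sign convention, sampling interval, and numbers chosen purely for illustration. The second half shows how a small, constant current-sensor bias drifts the estimate, which is exactly the accumulation problem described above:

```python
def coulomb_count(soc_percent: float, current_a: float, dt_s: float,
                  capacity_ah: float) -> float:
    """Update SoC by integrating current over one time step.

    Sign convention (an assumption here): positive current = discharge.
    """
    delta_ah = current_a * dt_s / 3600.0            # amp-seconds -> amp-hours
    return soc_percent - 100.0 * delta_ah / capacity_ah

# Drift demo: a steady 2 A draw, measured by a sensor with a +0.02 A bias.
soc_true, soc_est = 100.0, 100.0
for _ in range(3600):                               # one hour of 1-second samples
    soc_true = coulomb_count(soc_true, 2.00, 1.0, capacity_ah=10.0)
    soc_est = coulomb_count(soc_est, 2.02, 1.0, capacity_ah=10.0)
print(round(soc_true, 2), round(soc_est, 2))        # 80.0 vs 79.8: drift after one hour
```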
The other widely used approach measures the battery’s open circuit voltage, which is the stable voltage a battery settles at when no current is flowing. Each voltage level corresponds to a specific charge level, so the system can map voltage to SoC using a known discharge curve. The catch: this relationship is strongly nonlinear in lithium-ion batteries, meaning small voltage differences can represent large SoC swings at certain charge levels. The battery also needs to rest with no load for the voltage to stabilize, which isn’t practical during active use.
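A voltage-based estimator is essentially a lookup table with interpolation. In the sketch below the curve values are made-up placeholders (real tables come from characterizing a specific cell), but the shape illustrates the nonlinearity: the two middle entries sit only 50 mV apart yet span 15 points of SoC.

```python
import bisect

# Hypothetical OCV-to-SoC table for a Li-ion cell; values are placeholders.
OCV_V = [3.00, 3.45, 3.60, 3.70, 3.75, 3.90, 4.05, 4.20]   # rest voltage
SOC_PC = [0.0, 10.0, 25.0, 45.0, 60.0, 80.0, 95.0, 100.0]  # matching SoC %

def soc_from_ocv(ocv: float) -> float:
    """Map a rested open-circuit voltage to SoC by linear interpolation."""
    if ocv <= OCV_V[0]:
        return SOC_PC[0]
    if ocv >= OCV_V[-1]:
        return SOC_PC[-1]
    i = bisect.bisect_right(OCV_V, ocv)
    frac = (ocv - OCV_V[i - 1]) / (OCV_V[i] - OCV_V[i - 1])
    return SOC_PC[i - 1] + frac * (SOC_PC[i] - SOC_PC[i - 1])

print(soc_from_ocv(3.70), soc_from_ocv(3.75))  # 45.0 60.0 — 50 mV, 15 SoC points
```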
SoC and Depth of Discharge
State of charge has a simple inverse relationship with another common term: depth of discharge (DoD). If you have a 100 amp-hour battery and use 20 amp-hours, your depth of discharge is 20% and your state of charge is 80%. Use 70 amp-hours, and your DoD is 70% with an SoC of 30%. The two always add up to 100%. You’ll see DoD used more often in discussions about battery lifespan, while SoC tends to appear in conversations about how much energy is available right now.
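In code the relationship is a single subtraction; a trivial sketch:

```python
def soc_from_dod(dod_percent: float) -> float:
    """SoC and DoD always sum to 100% of rated capacity."""
    return 100.0 - dod_percent

print(soc_from_dod(20.0))  # 80.0 — used 20 Ah of a 100 Ah battery
print(soc_from_dod(70.0))  # 30.0 — used 70 Ah
```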
Why Your Battery Percentage Isn’t Always Accurate
If your phone has ever died at 15% or jumped from 40% to 25% in minutes, you’ve experienced SoC drift. Because coulomb counting accumulates small errors over time, and because the battery’s actual capacity changes as it ages, the percentage on your screen gradually becomes less reliable. Calendar-based capacity loss isn’t linear either. Batteries typically lose about 5% of their capacity in the first year, then the rate slows, eventually reaching up to 20% loss over the battery’s full lifetime. If the system still thinks the battery holds its original capacity, it will overestimate how much charge remains.
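The arithmetic behind that overestimate is easy to see with hypothetical numbers:

```python
rated_ah = 10.0    # capacity the gauge still assumes (as-new rating)
actual_ah = 8.0    # true capacity after aging (20% fade, hypothetical)
used_ah = 4.0      # charge drawn since the last full charge

gauge_soc = 100.0 - 100.0 * used_ah / rated_ah   # 60.0 — what the screen shows
true_soc = 100.0 - 100.0 * used_ah / actual_ah   # 50.0 — what the cells hold
print(gauge_soc, true_soc)
```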
Aging adds another layer of complexity, and temperature is tangled up in it. Traditional methods that rely only on voltage and current show significant errors in older batteries because aging changes internal resistance, reduces capacity, and creates uneven heat generation inside the cells. In the middle of the charge range (roughly 30% to 70% SoC), the voltage curve is especially flat: voltage barely changes even as charge drops substantially. This makes voltage-based SoC estimates unreliable in exactly the range where most people use their batteries daily.
The fix for consumer devices is periodic calibration: running the battery through a full charge and discharge cycle so the system can measure its actual current capacity and reset the gauge. Once a year is generally sufficient; for heavily used batteries that age faster, twice a year is better. The discharge portion of the calibration cycle should ideally take less than six hours, since longer discharges give coulomb-counting errors more time to accumulate.
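In sketch form, all a calibration cycle really does is replace the gauge’s stale capacity figure with a freshly measured one (numbers hypothetical):

```python
def recalibrate(full_discharge_ah: float) -> float:
    """The charge delivered over one complete 100%-to-0% discharge *is* the
    battery's current capacity; store it as the new reference."""
    return full_discharge_ah

capacity_ah = recalibrate(8.7)  # a pack sold as 10 Ah now measures 8.7 Ah
# All future SoC calculations divide by 8.7 instead of the original 10.
```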
How Electric Vehicles Handle SoC
When your EV dashboard shows 100%, the physical battery cells aren’t actually at full charge. Manufacturers build in software-controlled safety buffers at both ends of the charge range. Lithium-ion cells degrade faster when charged to their absolute maximum or discharged to absolute zero, so the system reserves a portion of capacity at the top and bottom that you never access. Your “0%” still has charge left in the cells, protecting against deep discharge damage. Your “100%” stops short of the cells’ true maximum, protecting against overcharge stress.
This means the usable capacity of an EV battery is always less than its total, or nominal, capacity. A battery pack rated at 75 kWh might only let you use 70 kWh between the dashboard’s 0% and 100%. The size of these buffers varies by manufacturer and sometimes changes through software updates as the company gathers real-world data on battery longevity.
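A sketch of how a dashboard percentage might map onto true cell SoC, assuming hypothetical 4% buffers at each end (actual buffer sizes are manufacturer-specific and rarely published):

```python
BOTTOM_BUFFER = 4.0  # hypothetical % of cell capacity reserved at the bottom
TOP_BUFFER = 4.0     # hypothetical % reserved at the top

def displayed_soc(cell_soc: float) -> float:
    """Map true cell SoC onto the 0-100% range the driver sees."""
    usable_span = 100.0 - BOTTOM_BUFFER - TOP_BUFFER
    shown = 100.0 * (cell_soc - BOTTOM_BUFFER) / usable_span
    return max(0.0, min(100.0, shown))

print(displayed_soc(4.0))   # 0.0   — dashboard "empty", cells still at 4%
print(displayed_soc(96.0))  # 100.0 — dashboard "full", cells only at 96%
```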
SoC and Battery Lifespan
How you use your battery’s charge range directly determines how long it lasts. As a first approximation, one full cycle from 100% to 0% (100% depth of discharge) causes roughly the same wear as two cycles from 100% to 50%, ten cycles at 10% depth, or a hundred cycles at just 1% depth. In practice, shallow cycling is even gentler than that rule of thumb suggests: restricting depth of discharge to 30% instead of 100% can deliver four times as many charge cycles before the battery degrades. In one manufacturer’s testing, a cell rated for 4,000 cycles at full depth of discharge lasted over 16,000 cycles when limited to 30% depth.
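The payoff holds even after accounting for the smaller amount of energy each shallow cycle delivers. Converting both figures from that test into total throughput, expressed as full-cycle equivalents, makes the comparison concrete:

```python
def equivalent_full_cycles(rated_cycles: int, dod_fraction: float) -> float:
    """Total lifetime throughput, expressed in 100%-DoD cycle equivalents."""
    return rated_cycles * dod_fraction

deep = equivalent_full_cycles(4_000, 1.0)      # 4000.0 full-cycle equivalents
shallow = equivalent_full_cycles(16_000, 0.3)  # 4800.0 — ~20% more total energy
print(deep, shallow)
```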
This is why many EV manufacturers recommend keeping your daily charge between 20% and 80%, and why phones with “optimized charging” features stop at 80% overnight before topping off just before you wake up. Staying in the middle of the SoC range puts less electrochemical stress on the cells.
How Modern Systems Improve Accuracy
Because simple coulomb counting and voltage measurement both have significant limitations, modern battery management systems use more sophisticated approaches. The most common advanced technique is based on Kalman filtering, a mathematical method originally developed for navigation systems. It works by maintaining a running prediction of the battery’s state, then continuously correcting that prediction using real-time voltage and current measurements. When the prediction and the measurement disagree, the algorithm weights each according to its estimated uncertainty and blends them, reducing the cumulative error that plagues basic coulomb counting.
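A production battery-grade Kalman filter models several states at once, but the core predict-correct loop fits in a few lines. Below is a deliberately simplified one-state sketch: coulomb counting drives the prediction, a voltage-derived SoC reading (for example, from an OCV lookup) drives the correction, and the noise values are illustrative assumptions rather than tuned parameters:

```python
class SocKalman1D:
    """Minimal one-state Kalman filter for SoC estimation (a sketch)."""

    def __init__(self, soc0=50.0, p0=25.0, q=0.0001, r=9.0):
        self.x = soc0  # SoC estimate (%)
        self.p = p0    # variance of the estimate
        self.q = q     # process noise: per-step coulomb-counting error (assumed)
        self.r = r     # measurement noise: voltage-based SoC is coarse (assumed)

    def predict(self, current_a, dt_s, capacity_ah):
        """Propagate the estimate by coulomb counting; uncertainty grows."""
        self.x -= 100.0 * current_a * dt_s / 3600.0 / capacity_ah
        self.p += self.q

    def update(self, soc_from_voltage):
        """Blend in a voltage-derived SoC reading, weighted by uncertainty."""
        k = self.p / (self.p + self.r)  # Kalman gain: how much to trust the reading
        self.x += k * (soc_from_voltage - self.x)
        self.p *= (1.0 - k)             # corrected estimate is more certain

kf = SocKalman1D(soc0=80.0)
kf.predict(current_a=2.0, dt_s=1.0, capacity_ah=10.0)
kf.update(soc_from_voltage=78.5)  # noisy reading pulls the estimate partway down
print(round(kf.x, 2))
```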
Newer systems go further by incorporating temperature data from multiple points on the battery. In aging batteries, adding surface-temperature variation as an input alongside voltage data can reduce SoC estimation errors by roughly 18%. This matters because as batteries age, the relationship between their electrical signals and actual charge level becomes distorted. Temperature patterns, particularly differences between zones of the battery pack, capture aging-related changes that voltage and current alone miss entirely.
The latest generation of battery management systems is beginning to incorporate machine learning models, including neural networks that can adapt to a specific battery’s degradation patterns over time. These systems learn how a particular battery behaves under different temperatures, charge rates, and aging conditions, then adjust their SoC estimates accordingly. The main challenge is computational: these models demand significant processing power, which an EV can readily supply but which is harder to come by in smaller devices with limited resources.

