How to Measure Charge: Coulombs, Capacitors & More

Electric charge is measured in coulombs (C), and the method you use depends entirely on what you’re measuring: a static charge on a surface, current flowing through a wire, or energy stored in a capacitor or battery. Simple tools like an electroscope can detect charge qualitatively, while precision instruments called electrometers can measure tiny amounts directly. For most practical electronics work, you’ll calculate charge from quantities you can easily measure, like current and time or voltage and capacitance.

The Unit of Charge: Coulombs

One coulomb is the amount of charge transported in one second by a current of one ampere. That sounds modest, but a single coulomb contains roughly 6.24 × 10¹⁸ electrons. For perspective, a single electron carries a charge of about 1.6 × 10⁻¹⁹ coulombs, an almost unimaginably small number. In everyday electronics, you’ll typically work with microcoulombs (μC) or even nanocoulombs (nC) rather than full coulombs.

Calculating Charge From Current and Time

The most common way to determine charge in a circuit is to measure current and multiply by time. The relationship is straightforward:

Q = I × t

where Q is charge in coulombs, I is current in amperes, and t is time in seconds. If a device draws 25 milliamps for two seconds, it transfers 0.05 coulombs of charge. This formula works perfectly when current is steady. When current fluctuates, you need to integrate (essentially, add up tiny slices of current over time), which is exactly what battery fuel-gauge chips do when they track how much charge has entered or left a battery.
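Both cases can be sketched in a few lines of Python. This is a minimal illustration, not production fuel-gauge code: the function names are my own, and the fluctuating-current case uses a simple trapezoidal sum over sampled readings, which is one common way to approximate the integral.

```python
def charge_steady(current_a: float, seconds: float) -> float:
    """Q = I * t for a constant current, in coulombs."""
    return current_a * seconds

def charge_sampled(currents_a: list[float], dt_s: float) -> float:
    """Approximate Q for a fluctuating current by integrating
    evenly spaced samples with the trapezoidal rule."""
    q = 0.0
    for i1, i2 in zip(currents_a, currents_a[1:]):
        q += 0.5 * (i1 + i2) * dt_s
    return q

# The worked example above: 25 mA for two seconds -> 0.05 C.
print(charge_steady(0.025, 2.0))  # 0.05
```

A coulomb-counting chip does essentially what `charge_sampled` does, continuously and in hardware, adding or subtracting charge depending on the direction of current flow.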

Measuring Charge Stored in a Capacitor

Capacitors store charge on their plates, and you can calculate exactly how much using another simple formula:

Q = C × V

where C is the capacitance (in farads) and V is the voltage across the capacitor. You don’t need a specialized charge meter here. Measure the voltage with a standard multimeter, look up the capacitance value printed on the component, and multiply. A 100 microfarad capacitor charged to 5 volts, for example, holds 500 microcoulombs of charge.
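The calculation is a one-liner; a small sketch (with an illustrative function name of my own) reproduces the example above:

```python
def capacitor_charge(capacitance_f: float, volts: float) -> float:
    """Q = C * V: charge in coulombs stored on a capacitor."""
    return capacitance_f * volts

# 100 uF charged to 5 V -> 500 uC.
q = capacitor_charge(100e-6, 5.0)
print(f"{q * 1e6:.0f} uC")  # 500 uC
```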

This same principle explains how a gold-leaf electroscope works. A classic lab electroscope at UC Santa Barbara has a measured capacitance of about 29 picofarads. Charging it to 1,000 volts requires depositing roughly 29 nanocoulombs, which is about 181 billion individual electron charges (29 × 10⁻⁹ C divided by 1.6 × 10⁻¹⁹ C per electron). That’s a lot of electrons, but still a tiny amount of charge in absolute terms.
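The same Q = C × V arithmetic, plus a division by the elementary charge, turns the electroscope numbers into an electron count. A minimal sketch (variable names are my own):

```python
ELECTRON_CHARGE_C = 1.602e-19  # elementary charge in coulombs

cap_f = 29e-12       # electroscope capacitance: 29 pF
volts = 1000.0       # charged to 1,000 V
q = cap_f * volts    # 29 nC
electrons = q / ELECTRON_CHARGE_C

print(f"{q * 1e9:.0f} nC, about {electrons:.2e} electrons")
```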

Battery Capacity and Charge

Battery capacity is listed in milliamp-hours (mAh), which is really just a unit of charge in disguise. One milliamp-hour equals 3.6 coulombs. A phone battery rated at 4,000 mAh stores a total of 14,400 coulombs when fully charged. To track remaining charge in real time, most modern devices use “coulomb counting,” which is exactly the Q = I × t method applied continuously by a small sensor chip that monitors current flowing in and out of the battery.
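The mAh-to-coulombs conversion follows directly from the definitions: one milliamp-hour is 0.001 A flowing for 3,600 s. A quick sketch (the function name is illustrative):

```python
def mah_to_coulombs(mah: float) -> float:
    """Convert milliamp-hours to coulombs: 0.001 A * 3600 s = 3.6 C."""
    return mah * 3.6

# A 4,000 mAh phone battery, as in the example above.
print(mah_to_coulombs(4000))  # ~14400 coulombs
```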

Detecting Static Charge

Static electricity presents a different challenge because there’s no flowing current to measure. The simplest detection tool is an electroscope: two thin strips of gold foil hang from a metal rod, and when charge is transferred to the rod, both strips pick up the same charge and repel each other. The greater the charge, the wider they spread. This tells you charge is present and gives a rough sense of magnitude, but it’s not precise.

For non-contact measurement of surface charge, technicians use electrostatic voltmeters or Kelvin probes. These instruments measure the electric field near a charged surface and convert that reading into a surface charge density (charge per unit area). They’re commonly used in manufacturing to monitor static buildup on plastics, textiles, and semiconductor wafers.

A Faraday cup offers another approach for measuring total charge on an object. You place the charged object inside a conductive cup, and the charge transfers to the cup’s exterior where a connected meter reads it. This method is especially useful for measuring charge on small, non-conductive items that can’t be connected to a meter directly.

Why Standard Multimeters Fall Short

A regular digital multimeter works fine for measuring current and voltage, which you can then use to calculate charge. But if you need to measure very small charges directly, a multimeter’s input resistance is too low. A typical multimeter has an input impedance of about 1 gigaohm (10⁹ ohms). That sounds high, but when measuring a source with 10 megaohms of impedance, the multimeter introduces a 1% error because charge leaks through the meter itself.

An electrometer solves this problem. It’s essentially a highly refined multimeter with input impedance around 10¹⁴ ohms, roughly 100,000 times higher. That drops the measurement error to 0.00001%. Electrometers can also measure charge directly, without needing you to calculate it from other values. They’re standard equipment in physics labs, semiconductor testing, and any application where you’re dealing with picocoulombs or femtoamps of current.
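The error figures above follow from treating the source and the meter as a voltage divider: the fraction of signal lost is the source impedance over the total. A small sketch of that loading model (the function name is my own):

```python
def loading_error_pct(source_ohms: float, meter_ohms: float) -> float:
    """Percentage error from meter loading, modeling the source
    impedance and meter input impedance as a voltage divider."""
    return 100.0 * source_ohms / (source_ohms + meter_ohms)

print(loading_error_pct(10e6, 1e9))   # multimeter: ~0.99% (about 1%)
print(loading_error_pct(10e6, 1e14))  # electrometer: ~0.00001%
```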

Choosing the Right Method

  • Circuit current: Use Q = I × t. Measure current with a multimeter or current sensor and multiply by the time the current flows.
  • Capacitor charge: Use Q = C × V. Measure voltage across the capacitor and multiply by its known capacitance.
  • Battery charge: Convert mAh to coulombs by multiplying by 3.6, or use a coulomb-counting fuel gauge IC for real-time tracking.
  • Static charge on objects: Use a Faraday cup connected to an electrometer for precise readings, or an electroscope for quick qualitative detection.
  • Surface charge density: Use a non-contact electrostatic voltmeter or Kelvin probe to measure charge distribution across a material’s surface.
  • Very small charges: Use an electrometer instead of a standard multimeter to avoid charge leakage through the instrument itself.

For most practical purposes, you won’t measure charge directly. You’ll measure something easier to access, like voltage, current, or time, and calculate the charge from there. Direct charge measurement is reserved for situations involving static electricity, particle physics, or extremely sensitive electronics where even tiny charge imbalances matter.