An electrometer is an instrument that measures electric charge or very small electric currents with extreme sensitivity. While a basic version dates back centuries as a simple gold-leaf device, modern electrometers are sophisticated electronic instruments capable of detecting currents as small as a few femtoamperes (quadrillionths of an ampere). They play critical roles in radiation therapy, environmental monitoring, and scientific research where ordinary meters lack the sensitivity to register a reading.
How a Classic Electrometer Works
The original electrometer design is elegantly simple. A metal knob sits atop a conducting shaft connected to a flat, vertical metal plate. A very thin gold leaf, hinged at the top, hangs against the plate. When electric charge reaches the knob, it travels down the shaft and spreads across both the plate and the gold leaf. Because the plate and leaf now carry the same type of charge, they repel each other, and the leaf swings outward.
The angle of that deflection increases with the amount of charge on the knob. By calibrating the device against known charges, you can translate the angle into an actual charge measurement. This makes the electrometer quantitative, not just a yes-or-no indicator of whether charge is present.
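To make the calibration idea concrete, here is a minimal software sketch of translating a deflection angle into a charge value via an interpolated calibration table. The angles and charge values are invented purely for illustration; a real instrument would be calibrated against known reference charges.

```python
# Sketch: converting a gold-leaf deflection angle into a charge estimate
# via a calibration table. All table values are hypothetical.

# (deflection angle in degrees, charge in nanocoulombs)
CALIBRATION_TABLE = [
    (0.0, 0.0),
    (10.0, 1.2),
    (20.0, 2.9),
    (30.0, 5.1),
    (40.0, 8.0),
]

def angle_to_charge(angle_deg: float) -> float:
    """Linearly interpolate between calibration points."""
    points = CALIBRATION_TABLE
    if not points[0][0] <= angle_deg <= points[-1][0]:
        raise ValueError("angle outside calibrated range")
    for (a0, q0), (a1, q1) in zip(points, points[1:]):
        if a0 <= angle_deg <= a1:
            frac = (angle_deg - a0) / (a1 - a0)
            return q0 + frac * (q1 - q0)

print(angle_to_charge(25.0))  # 4.0 nC with this made-up table
```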
That distinction matters because a closely related device, the electroscope, looks almost identical but serves a different purpose. An electroscope tells you that charge exists and whether it’s positive or negative. An electrometer goes further: it tells you how much charge is there. The difference is calibration. An electroscope’s gold leaves spread apart, but without a calibrated scale, you can’t convert that deflection into a number. An electrometer can.
Modern Electronic Electrometers
Today’s electrometers bear little physical resemblance to the gold-leaf design, but the goal is the same: measure vanishingly small electrical quantities. A modern electrometer is a high-input-impedance voltmeter paired with specialized amplifier circuits. “High input impedance” means the instrument draws almost no current from the thing it’s measuring, so it doesn’t disturb the signal it’s trying to detect.
This matters because the currents and charges involved are tiny. A standard digital multimeter might measure milliamps comfortably, but it becomes useless in the femtoampere range. Modern electrometers can detect currents below 10 femtoamperes with better than 88% accuracy, and currents below 100 femtoamperes with over 86% accuracy. For context, a femtoampere is one millionth of a nanoampere. These are currents so small that the thermal noise in ordinary wiring can overwhelm them, which is why electrometers require carefully shielded cables and controlled environments to work properly.
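To see why thermal noise is the enemy at this scale, consider the standard Johnson–Nyquist formula for the noise current of a resistance. The sketch below uses illustrative values (a 10-megaohm source at room temperature over a 1 Hz bandwidth), and the result already dwarfs a 1-femtoampere signal.

```python
import math

# Johnson-Nyquist thermal noise current of a resistance R over
# bandwidth B: i_rms = sqrt(4 * k_B * T * B / R).
# Example values are illustrative, not from any particular instrument.

K_B = 1.380649e-23  # Boltzmann constant, J/K

def thermal_noise_current(resistance_ohms: float,
                          temperature_k: float = 300.0,
                          bandwidth_hz: float = 1.0) -> float:
    """RMS thermal noise current in amperes."""
    return math.sqrt(4 * K_B * temperature_k * bandwidth_hz / resistance_ohms)

# A 10-megaohm source resistance at room temperature, 1 Hz bandwidth:
i_n = thermal_noise_current(10e6)
print(f"{i_n * 1e15:.0f} fA")  # ~41 fA -- already far above a 1 fA signal
```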
Most modern electrometers can measure four related quantities: electric current, electric charge, voltage, and resistance. They switch between modes depending on the application. In charge mode, the instrument integrates incoming current over time and reports the total accumulated charge, typically in picocoulombs or nanocoulombs.
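Charge mode is, conceptually, numerical integration of current over time. Real electrometers do this with analog integrator circuits rather than software, but the following sketch with simulated samples shows the arithmetic:

```python
# Sketch: "charge mode" as numerical integration of current over time.
# Real electrometers integrate in analog hardware; this just shows the math.

def accumulated_charge(times_s, currents_a):
    """Trapezoidal integration: total charge in coulombs."""
    total = 0.0
    for (t0, i0), (t1, i1) in zip(zip(times_s, currents_a),
                                  zip(times_s[1:], currents_a[1:])):
        total += 0.5 * (i0 + i1) * (t1 - t0)
    return total

# A steady 5 pA current sampled every 0.1 s for 10 s:
times = [0.1 * n for n in range(101)]
currents = [5e-12] * len(times)
q = accumulated_charge(times, currents)
print(f"{q * 1e12:.1f} pC")  # 5 pA * 10 s = 50.0 pC
```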
Radiation Therapy and Medical Physics
One of the most consequential everyday uses of electrometers is in cancer treatment. Radiation therapy machines (linear accelerators) must deliver precisely controlled doses of radiation. Too little misses the tumor; too much damages healthy tissue. The standard system for verifying those doses consists of an ionization chamber, an electrometer, and connecting cables.
The ionization chamber sits in the radiation beam and collects the electric charge that radiation knocks loose from air molecules inside it. That charge is extremely small, so an electrometer reads and records it. From the charge measurement, medical physicists calculate the actual radiation dose delivered. The electrometer is assigned its own correction factor during calibration, and any undetected drift in that factor introduces a systematic error into every dose measurement at that facility. Hospitals typically calibrate their electrometers by comparing readings between two instruments using the same ionization chamber and the same radiation exposure, a process called cross-calibration.
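To make that pipeline concrete, here is a deliberately simplified sketch of the arithmetic. Real dosimetry protocols such as AAPM TG-51 or IAEA TRS-398 apply a longer list of corrections; every number below is invented for illustration.

```python
# Highly simplified sketch of converting an electrometer charge reading
# into an absorbed dose. Real protocols (e.g. AAPM TG-51, IAEA TRS-398)
# apply more correction factors; all values here are invented.

raw_reading_nc = 18.50   # charge reported by the electrometer, nC
k_elec = 1.002           # electrometer correction factor (from calibration)
k_tp = 1.015             # temperature/pressure correction for the chamber
n_dw = 5.35e-2           # chamber calibration coefficient, Gy/nC (hypothetical)

corrected_charge = raw_reading_nc * k_elec * k_tp
dose_gy = corrected_charge * n_dw
print(f"dose = {dose_gy:.3f} Gy")  # ~1.007 Gy with these made-up values
```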
The stakes are real. If an electrometer’s correction factor drifts even slightly, the calculated dose shifts for every patient treated on that machine until someone catches the error. This is why calibration standards follow strict traceability requirements: every measurement must connect through a documented, unbroken chain of calibrations back to a national reference standard.
Other Common Applications
Beyond hospitals, electrometers appear wherever tiny currents or charges need precise measurement:
- Environmental monitoring: Air quality sensors and particulate detectors use ionization methods that produce very small currents. Electrometers read those currents to calculate pollutant concentrations.
- Semiconductor testing: Evaluating the insulating properties of materials used in microchips requires measuring leakage currents in the picoampere or femtoampere range. Standard meters can’t resolve these signals.
- Nuclear and particle physics: Detectors in accelerators and nuclear facilities produce small ion currents that electrometers convert into meaningful measurements of particle flux or radiation intensity.
- High-resistance measurement: Materials like glass, ceramics, and certain polymers have resistances in the teraohm range (trillions of ohms). Measuring resistance that high requires applying a known voltage and detecting the resulting femtoampere-scale current, a job only an electrometer can do reliably (see the sketch after this list).
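The last item is just Ohm's law pushed to an extreme scale. A quick sketch, with hypothetical values:

```python
# Sketch: high-resistance measurement is Ohm's law at an extreme scale.
# Apply a known voltage, measure the tiny resulting current, divide.
# All values are hypothetical.

applied_voltage = 100.0      # volts, a typical test voltage
measured_current = 250e-15   # 250 femtoamperes, read by the electrometer

resistance = applied_voltage / measured_current
print(f"{resistance / 1e12:.0f} teraohms")  # 100 / 250e-15 = 400 teraohms
```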
Why Ordinary Meters Can’t Do the Job
A reasonable question is why you can’t just use a regular multimeter for these measurements. The answer comes down to two problems: input impedance and noise floor.
Input impedance determines how much current the meter itself pulls from the circuit. A typical handheld multimeter has an input impedance around 10 megaohms. That’s fine for measuring household wiring, but if you’re trying to measure a current of 1 femtoampere, the meter’s own current draw can be millions of times larger than the signal. It’s like trying to weigh a grain of sand on a bathroom scale. An electrometer’s input impedance can exceed 200 teraohms, more than 20 million times higher, so it barely disturbs the measurement.
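The loading effect is easy to quantify: the source resistance and the meter's input impedance form a voltage divider, so the meter sees only a fraction of the true voltage. The sketch below assumes a hypothetical 1-gigaohm source and reuses the impedance figures from the comparison above.

```python
# Sketch: how input impedance loads a high-impedance source. The meter
# and the source form a voltage divider; the meter sees only a fraction
# of the true voltage. The source resistance value is hypothetical.

def measured_fraction(source_ohms: float, meter_ohms: float) -> float:
    """Fraction of the true voltage the meter actually sees."""
    return meter_ohms / (meter_ohms + source_ohms)

source = 1e9           # a 1-gigaohm source, e.g. a high-resistance sensor
multimeter = 10e6      # 10 megaohms, typical handheld multimeter
electrometer = 200e12  # 200 teraohms, from the comparison above

print(f"multimeter:   {measured_fraction(source, multimeter):.1%}")    # ~1.0%
print(f"electrometer: {measured_fraction(source, electrometer):.4%}")  # ~99.9995%
```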
Noise floor is the smallest signal the instrument can distinguish from its own internal electronic noise. Ordinary meters have noise floors in the microampere or nanoampere range. Electrometers push that floor down into the femtoampere range through specialized amplifier designs, guarded input terminals, and shielding that blocks external electromagnetic interference. Even the cables connecting an electrometer to a sensor are triaxial (three-layer shielded) rather than standard coaxial, specifically to minimize leakage currents along the cable itself.
Calibration and Traceability
Because electrometers often measure quantities that directly affect patient safety or regulatory compliance, their accuracy must be verified regularly. Calibration follows internationally recognized traceability standards. The core idea is that every measurement result can be linked back to a reference standard through a documented, unbroken chain of comparisons, each with a known measurement uncertainty.
In practice, this means a hospital or laboratory sends its electrometer to an accredited calibration lab. The lab compares the instrument’s readings against a reference standard that has itself been calibrated against a higher-level standard, ultimately tracing back to a national metrology institute. After calibration, the electrometer returns with a certificate stating its correction factors and associated uncertainties. The facility then verifies that the instrument still performs within specification before putting it back into service.
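As a small illustration of what "correction factors and associated uncertainties" mean in practice, the sketch below applies a correction factor to a raw reading and combines the two relative uncertainties in quadrature, the standard simple treatment for uncorrelated multiplicative terms. All values are invented.

```python
import math

# Sketch: applying a calibration correction factor and combining the
# relative uncertainties of reading and factor in quadrature (the
# standard treatment for uncorrelated multiplicative terms).
# All values are invented.

reading = 20.00    # raw instrument reading, nC
u_reading = 0.10   # standard uncertainty of the reading, nC
k_cal = 1.004      # correction factor from the calibration certificate
u_k = 0.002        # standard uncertainty of the correction factor

corrected = reading * k_cal
rel_u = math.sqrt((u_reading / reading) ** 2 + (u_k / k_cal) ** 2)
print(f"{corrected:.2f} nC +/- {corrected * rel_u:.2f} nC")  # 20.08 +/- 0.11
```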

