The Volt-Ampere (VA) is the standard unit of measurement for apparent power in an alternating current (AC) electrical system. Apparent power represents the total electrical power flowing into an electrical circuit from a source, which is calculated simply by multiplying the circuit’s voltage, measured in volts, by its current, measured in amperes. In any AC system, the flow of electricity oscillates, and the VA rating describes the maximum load capacity a piece of equipment must be built to handle. This measurement is necessary because, unlike in simple direct current (DC) circuits where the volt-amp product is equal to the power used, AC systems introduce complexities that cause the total power supplied to differ from the power actually used to perform work.
Defining Apparent Power and Its Measurement
Apparent power, measured in volt-amperes, is the total power flowing in an electrical system, regardless of whether that power is performing useful work. This total power is the capacity required by the infrastructure, including wiring, transformers, and generators, to deliver electricity safely to a load. In AC systems, the voltage and current waveforms may not perfectly align, meaning that the peak voltage and peak current do not occur at the exact same moment in time.
This misalignment, known as a phase shift, is what necessitates the measurement of apparent power, as the total current flowing through the wires must be accounted for. The VA measurement is the product of the root mean square (RMS) voltage and the RMS current, which are calculated values that represent the effective strength of the oscillating AC waveforms. Because VA measures the entire electrical demand placed on the system, it is the fundamental rating used for sizing the physical components of the electrical infrastructure.
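As a quick sketch of the calculation just described, apparent power is simply the product of RMS voltage and RMS current (the values below are illustrative, not from any specific device):

```python
def apparent_power_va(v_rms: float, i_rms: float) -> float:
    """Apparent power S = V_rms * I_rms, expressed in volt-amperes."""
    return v_rms * i_rms

# Illustrative example: a 120 V RMS circuit drawing 5 A RMS
s = apparent_power_va(120.0, 5.0)
print(s)  # 600.0 VA
```

Note that this product describes the total demand on the wiring regardless of how much of it performs work; separating out the working portion requires the power factor discussed below.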
The Difference Between Volt-Amps and Watts
The distinction between Volt-Amps and Watts lies in what kind of power they are measuring within the circuit. Watts measure real power, sometimes called active power, which is the power component that is actually consumed by the equipment to perform physical work, such as generating heat, light, or mechanical motion. Real power is the energy that is truly dissipated by the load and is the energy for which consumers are billed by the utility company. Volt-Amps, in contrast, measure apparent power, which is the total power flowing in the circuit, including the power that does not perform work.
The discrepancy between the two is caused by reactive power, which is the third component of electrical power in an AC system. Reactive power is the energy that flows back and forth between the power source and the load, primarily due to the magnetic fields required to operate inductive devices like motors, transformers, and fluorescent lighting ballasts. This power is necessary to establish the magnetic fields that allow these devices to function, but it is not consumed and contributes nothing to the actual work output.
Because apparent power is the vector sum of both the real power and the reactive power, the VA rating is almost always larger than the Watt rating in any practical AC circuit containing inductive loads. The utility must supply the total current required to sustain both the useful work and the temporary magnetic fields.
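The vector sum mentioned above is often called the power triangle: real power P (watts) and reactive power Q (volt-amperes reactive) are perpendicular components, so apparent power is \(S = \sqrt{P^2 + Q^2}\). A minimal sketch with illustrative figures:

```python
import math

def apparent_from_components(p_watts: float, q_var: float) -> float:
    """Apparent power S = sqrt(P^2 + Q^2), per the power triangle."""
    return math.hypot(p_watts, q_var)

# Illustrative inductive load: 800 W real power, 600 VAR reactive power
s = apparent_from_components(800.0, 600.0)
print(s)  # 1000.0 VA -- larger than the 800 W actually doing work
```

This makes the claim in the text concrete: whenever reactive power is nonzero, the VA figure necessarily exceeds the watt figure.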
Understanding the Power Factor
The Power Factor (PF) mathematically links Watts and Volt-Amps, serving as a measure of electrical efficiency. It is defined as the ratio of real power (Watts) to apparent power (Volt-Amps), resulting in a value between zero and one. A power factor of 1, or unity, means that the Watts and Volt-Amps are exactly equal, indicating that all the power supplied is used for work with no reactive power present.
The formula for this relationship is \(W = VA \times PF\). When the power factor is less than 1, it indicates that a portion of the apparent power is reactive power, which means the utility and the infrastructure must supply more total current (VA) than is actually being converted into useful work (W). A low power factor is undesirable because it requires generators, transformers, and distribution wires to be rated for the higher apparent power, leading to increased heat losses and potentially higher operating costs for the consumer or utility.
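The relationship works in both directions: multiply VA by the power factor to find the real power delivered, or divide watts by the power factor to find the apparent power the supply must furnish. A small sketch, using illustrative numbers:

```python
def real_power_w(apparent_va: float, power_factor: float) -> float:
    """Real power: W = VA * PF."""
    return apparent_va * power_factor

def required_va(real_w: float, power_factor: float) -> float:
    """Apparent power the source must supply: VA = W / PF."""
    return real_w / power_factor

# A 1000 VA supply at power factor 0.8 delivers only 800 W of work,
# and conversely an 800 W load at PF 0.8 demands the full 1000 VA.
print(real_power_w(1000.0, 0.8))  # 800.0 W
print(required_va(800.0, 0.8))    # 1000.0 VA
```

At unity power factor the two functions are inverses that change nothing, matching the statement above that watts and volt-amperes coincide when PF = 1.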
Practical Applications of Volt-Amps
The Volt-Amp rating is the primary specification used for sizing electrical supply equipment like Uninterruptible Power Supplies (UPS), transformers, and generators. These pieces of equipment are rated in VA or kVA (kilovolt-amperes, or 1,000 VA) because their physical limitations, such as the thermal limits of the windings and the capacity of the circuit breakers, are defined by the total current flowing through them. The components of the supply equipment must be able to handle the total current draw, which is determined by the apparent power.
For instance, a UPS must be sized based on the VA rating to ensure it can supply the maximum current demanded by the attached devices, including the non-working reactive component. If a UPS were only rated by the Watt capacity, it might fail when trying to supply the total current required by a load with a low power factor, even if the real power consumption is below the Watt rating. Using the VA rating provides a safety margin, ensuring that the supply equipment can handle the entire electrical load and prevent overheating or failure.
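The sizing logic described here can be sketched as a small calculation. The 25% headroom figure below is an illustrative safety margin of my choosing, not a standard, and the load numbers are hypothetical:

```python
def ups_min_va(load_watts: float, power_factor: float,
               headroom: float = 1.25) -> float:
    """Minimum UPS apparent-power rating for a load.

    Converts the real-power draw to apparent power (W / PF), then adds
    a sizing margin. The 1.25 headroom is illustrative, not a standard.
    """
    return load_watts / power_factor * headroom

# Hypothetical load: 400 W of equipment at power factor 0.9
print(round(ups_min_va(400.0, 0.9)))  # 556 VA minimum rating
```

Sizing from the watt figure alone (400 W here) would understate the demand, which is exactly the failure mode the paragraph above warns against.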