The energy of a photon is determined entirely by its frequency. Higher-frequency light (like X-rays) carries more energy per photon, while lower-frequency light (like radio waves) carries less. The relationship is captured in one of the most important equations in physics: E = hf, where E is energy, h is Planck’s constant (6.626 × 10⁻³⁴ joule-seconds), and f is the frequency of the light.
The Core Equation
Every photon is a tiny packet of electromagnetic energy. Unlike a ball you can throw harder or softer, a photon’s energy is locked to its frequency. Double the frequency, and you exactly double the energy. There’s no way to have a “weak” blue photon or a “strong” red one. The color (frequency) dictates the energy, period.
The equation E = hf can also be written in terms of wavelength, since frequency and wavelength are inversely related through the speed of light (c = 3.0 × 10⁸ m/s). That gives you a second, equally useful form:
E = hc / λ
Here, λ is the wavelength. Shorter wavelengths mean higher energy. This version is often more convenient because wavelength is easier to measure directly for visible light, ultraviolet, and similar radiation.
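Both forms are easy to check numerically. Here is a minimal sketch in Python using the rounded constants from the text (the function names are illustrative):

```python
# Photon energy from frequency (E = h*f) and from wavelength (E = h*c/lambda).
H = 6.626e-34      # Planck's constant, J*s (rounded, as in the text)
C = 3.0e8          # speed of light, m/s

def energy_from_frequency(f_hz):
    """Photon energy in joules for a frequency in hertz."""
    return H * f_hz

def energy_from_wavelength(lambda_m):
    """Photon energy in joules for a wavelength in meters."""
    return H * C / lambda_m

# The two forms agree because f = c / lambda:
f = C / 500e-9
assert abs(energy_from_frequency(f) - energy_from_wavelength(500e-9)) < 1e-30
```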
A Sample Calculation
To make this concrete, consider green light with a wavelength of 500 nanometers (500 × 10⁻⁹ meters), roughly the middle of the visible spectrum. Plugging into the formula:
E = (6.626 × 10⁻³⁴ J·s)(3.0 × 10⁸ m/s) / (500 × 10⁻⁹ m) ≈ 3.98 × 10⁻¹⁹ joules
That number is absurdly small by everyday standards, which is why physicists often use a more convenient unit called the electron-volt (eV). One eV equals 1.6 × 10⁻¹⁹ joules. Dividing, the energy of that green photon comes out to about 2.48 eV. A single photon doesn’t carry much energy, but a typical light bulb emits on the order of 10²⁰ of them every second, and the effects add up quickly.
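The arithmetic above can be reproduced in a few lines (constant names are illustrative; values are the rounded ones used in the text):

```python
# Reproduce the worked example: a 500 nm green photon, in joules and in eV.
H = 6.626e-34        # Planck's constant, J*s
C = 3.0e8            # speed of light, m/s
J_PER_EV = 1.6e-19   # joules per electron-volt (rounded)

E_joules = H * C / 500e-9     # energy of one 500 nm photon, joules
E_ev = E_joules / J_PER_EV    # the same energy expressed in eV

print(f"{E_joules:.3g} J")    # ~3.98e-19 J
print(f"{E_ev:.2f} eV")       # ~2.48 eV
```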
Why Photon Energy Is “Quantized”
Before the early 1900s, physicists treated light as a continuous wave that could carry any amount of energy. The discovery that light actually comes in discrete packets, or quanta, was one of the founding insights of quantum mechanics. A photon is the smallest possible unit of electromagnetic radiation at a given frequency. You can’t have half a photon. Energy arrives in these fixed-size chunks, and the size of each chunk depends on frequency.
This idea was radical at the time. Albert Einstein used it in 1905 to explain the photoelectric effect, a puzzling experiment where shining light on a metal surface ejects electrons. Classical wave theory predicted that brighter light should always knock electrons loose, given enough time. But experiments showed something different: dim ultraviolet light ejected electrons instantly, while even blindingly intense red light ejected none at all.
Einstein’s explanation was simple. Each photon interacts with a single electron. If the photon’s energy (hf) is below a certain threshold for that material, the electron can’t break free, no matter how many low-energy photons hit the surface. The ejected electron’s kinetic energy equals the photon’s energy minus the energy needed to escape the material. This matched every experimental observation perfectly and earned Einstein the 1921 Nobel Prize in Physics.
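Einstein’s threshold condition can be sketched as a short function. The 2.3 eV work function below is an illustrative assumption (roughly that of sodium), not a value from the text:

```python
# Einstein's photoelectric relation: KE = h*f - phi, where phi is the
# material's "work function", the minimum energy an electron needs to escape.
HC_EV_NM = 1240.0    # h*c expressed in eV*nm (rounded)

def ejected_ke_ev(wavelength_nm, work_function_ev):
    """Kinetic energy (eV) of an ejected electron, or None if the photon
    energy is below the threshold and no electron is ejected."""
    photon_ev = HC_EV_NM / wavelength_nm
    ke = photon_ev - work_function_ev
    return ke if ke > 0 else None

phi = 2.3                           # assumed work function, eV
print(ejected_ke_ev(400, phi))      # violet photon: escapes with ~0.8 eV
print(ejected_ke_ev(700, phi))      # red photon: None, below threshold
```

No matter how many 700 nm photons arrive per second, each one individually falls short of the threshold, which is exactly what the experiments showed.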
Photon Energy Across the Spectrum
The electromagnetic spectrum spans an enormous range of photon energies. Radio wave photons have energies on the order of 10⁻⁵ eV or less. Visible light photons fall in the range of roughly 1.6 to 3.3 eV, with red at the low end and violet at the high end. Ultraviolet photons carry a few eV up to around a hundred eV, which is why UV radiation can damage skin cells in ways visible light cannot. X-ray photons range from about 100 eV to 100,000 eV, and gamma rays go higher still, into the millions of eV and beyond.
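These ranges all follow from E = hc/λ. A quick sketch, using illustrative wavelengths for each band:

```python
# Representative photon energies across the spectrum, from E = h*c/lambda
# with h*c ≈ 1240 eV*nm. The sample wavelengths are illustrative picks.
HC_EV_NM = 1240.0

bands = {
    "FM radio (3 m)":        3e9,   # wavelengths in nanometers
    "red light (700 nm)":    700,
    "violet light (400 nm)": 400,
    "ultraviolet (100 nm)":  100,
    "X-ray (1 nm)":          1,
}

for name, wavelength_nm in bands.items():
    print(f"{name}: {HC_EV_NM / wavelength_nm:.3g} eV")
```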
This energy hierarchy explains a lot about everyday life. Radio waves pass through walls because each photon carries so little energy that it doesn’t interact strongly with the atoms in the material. Gamma rays, on the other hand, carry enough energy per photon to break chemical bonds and ionize atoms, which is why they’re classified as ionizing radiation and require lead shielding.
Photons Also Carry Momentum
Despite having no mass, photons carry momentum. The momentum of a single photon is p = h/λ, directly related to its wavelength. This was confirmed experimentally in 1923 by Arthur Compton, who fired X-ray photons at stationary electrons. The photons bounced off the electrons and lost energy in the process, emerging with a longer wavelength. The scattered photons and recoiling electrons behaved exactly like two billiard balls colliding, with energy and momentum conserved in the collision. The change in wavelength depended only on the scattering angle, matching predictions perfectly.
This experiment was a turning point. It convinced many physicists that the photon model of light was not just a mathematical convenience but a physical reality. Light genuinely behaves as individual particles, each carrying a specific energy (hf) and a specific momentum (h/λ), even though it also exhibits wave-like properties like interference and diffraction.
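The angle-only dependence Compton measured is captured by the standard formula Δλ = (h / mₑc)(1 − cos θ). A minimal sketch, with rounded constant values:

```python
import math

H = 6.626e-34      # Planck's constant, J*s
M_E = 9.109e-31    # electron rest mass, kg
C = 3.0e8          # speed of light, m/s (rounded, as in the text)

# The "Compton wavelength" of the electron sets the scale of the shift.
COMPTON_WAVELENGTH = H / (M_E * C)   # ~2.42e-12 m

def compton_shift(theta_rad):
    """Increase in photon wavelength (meters) after scattering at angle theta."""
    return COMPTON_WAVELENGTH * (1 - math.cos(theta_rad))

print(compton_shift(math.pi / 2))    # 90°: shift of one Compton wavelength
print(compton_shift(math.pi))        # 180° (backscatter): maximum, twice that
```

Note that the photon's initial wavelength never appears: the shift depends only on the angle, just as Compton found.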
Converting Between Units
Photon energy calculations frequently require converting between joules and electron-volts. The conversion factor is straightforward: 1 eV = 1.6022 × 10⁻¹⁹ joules. To go from joules to eV, divide by that number. To go from eV to joules, multiply.
There’s also a handy shortcut. If you express Planck’s constant in eV·s rather than J·s, it becomes 4.136 × 10⁻¹⁵ eV·s. Using this version, you can calculate photon energy directly in eV without a separate conversion step. For the 500 nm example above: (4.136 × 10⁻¹⁵ eV·s)(3.0 × 10⁸ m/s) / (500 × 10⁻⁹ m) = 2.48 eV. Same answer, fewer steps.
An even faster mental shortcut: the product hc works out to approximately 1240 eV·nm. So the energy of any photon in electron-volts is roughly 1240 divided by the wavelength in nanometers. For 500 nm light, that’s 1240/500 = 2.48 eV. This is the version most physics students memorize.
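All three routes land on the same number, which is easy to verify side by side (function names are illustrative):

```python
# The full calculation in joules versus the 1240 eV*nm shortcut.
H_J = 6.626e-34        # Planck's constant in J*s
C = 3.0e8              # speed of light, m/s
J_PER_EV = 1.6022e-19  # joules per electron-volt

def energy_ev_exact(wavelength_nm):
    """E = h*c/lambda in joules, then converted to eV."""
    return H_J * C / (wavelength_nm * 1e-9) / J_PER_EV

def energy_ev_shortcut(wavelength_nm):
    """The memorized shortcut: hc ≈ 1240 eV*nm."""
    return 1240.0 / wavelength_nm

print(energy_ev_exact(500))      # ~2.48
print(energy_ev_shortcut(500))   # 2.48
```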