A radiometer is any instrument that measures electromagnetic radiation, whether that’s visible light, infrared heat, ultraviolet rays, or microwaves. Some radiometers measure the total power of radiation hitting a surface, while others (called spectroradiometers) break that radiation down by wavelength to show exactly which parts of the spectrum are present and how strong each one is. Radiometers show up everywhere from weather stations and satellites to physics classrooms and climate research labs.
How Radiometers Work
All radiometers share one basic job: converting incoming radiation into a signal that can be read and recorded. The simplest designs use a detector material that responds to radiation by generating an electrical current or changing temperature. The detector’s response is then calibrated against known standards so the output translates into a meaningful measurement of radiant power or intensity.
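In code, the core of that conversion is little more than dividing a detector reading by a calibrated responsivity. Here's a minimal Python sketch of the idea; the responsivity and photocurrent values below are hypothetical placeholders, not figures for any particular instrument.

```python
# Minimal sketch of the radiometer signal chain: detector output -> calibrated power.
# The responsivity and the photocurrent reading are hypothetical placeholders.

def signal_to_power(signal_amps: float, responsivity_a_per_w: float) -> float:
    """Convert a photodetector's photocurrent (A) to radiant power (W)
    using a responsivity established against a calibration standard."""
    return signal_amps / responsivity_a_per_w

# Example: a silicon photodiode with an assumed responsivity of 0.5 A/W
# producing 2.0 microamps of photocurrent.
power_watts = signal_to_power(2.0e-6, 0.5)
print(f"Radiant power: {power_watts * 1e6:.1f} microwatts")  # -> 4.0 microwatts
```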
Different detector materials respond to different parts of the spectrum. Silicon-based detectors work well for visible and near-infrared light, roughly 300 to 1,000 nanometers. Indium gallium arsenide detectors pick up where silicon leaves off, covering about 950 to 1,650 nanometers in the near-infrared. For longer wavelengths stretching into the mid- and far-infrared, pyroelectric detectors made from materials like lithium tantalate can reach out to 19 micrometers. This range of detector technologies means radiometers can be built to cover nearly any portion of the electromagnetic spectrum, from ultraviolet through visible light to deep infrared.
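To make that coverage concrete, the sketch below encodes the ranges quoted above and looks up which detector families could handle a given wavelength. The dictionary and helper function are illustrative only, and the lower bound for the pyroelectric entry is a placeholder, since only its long-wavelength limit is given above.

```python
# Detector families and the approximate spectral ranges quoted above, in nanometers.
# (1 micrometer = 1,000 nanometers, so 19 micrometers = 19,000 nm.)
DETECTOR_RANGES_NM = {
    "silicon": (300, 1_000),                  # visible and near-infrared
    "indium gallium arsenide": (950, 1_650),  # near-infrared
    "pyroelectric (lithium tantalate)": (2_000, 19_000),  # mid/far-infrared; lower bound is a placeholder
}

def detectors_for(wavelength_nm: float) -> list[str]:
    """Return the detector families whose nominal range covers a wavelength."""
    return [name for name, (lo, hi) in DETECTOR_RANGES_NM.items()
            if lo <= wavelength_nm <= hi]

print(detectors_for(550))     # visible green light -> silicon
print(detectors_for(1_550))   # near-infrared -> indium gallium arsenide
print(detectors_for(10_000))  # 10 micrometers, thermal infrared -> pyroelectric
```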
The Crookes Radiometer: A Famous Example
If you’ve ever seen a glass bulb with black-and-white vanes spinning inside it on a sunny windowsill, that’s a Crookes radiometer. Invented by William Crookes in the 1870s during spectral experiments with infrared light, it’s an evacuated glass bulb containing a set of four lightweight vanes mounted on a low-friction pin. Each vane is painted black on one side and silvered on the other.
When light hits the vanes, the black sides absorb more energy than the reflective sides. This heats the sparse residual gas near the black surfaces, making those molecules more energetic. The result is a process called thermal transpiration: gas inside the bulb creeps from the cooler side of each vane toward the hotter side, creating a pressure difference that pushes the black side away from the light source and sets the vanes spinning.
It’s a mesmerizing demonstration, but as a scientific instrument, the Crookes radiometer was essentially useless. You can’t translate the speed of spinning vanes into a precise measurement of radiant power. It became a popular novelty item instead. A separate device, the Nichols radiometer, was later built with much greater sensitivity to detect the actual pressure exerted by photons, confirming James Clerk Maxwell’s predictions about radiation pressure. The key difference: the Nichols radiometer used magnetic damping and careful pressure control, and its designers made the vanes fully reflective on one side so the momentum transfer from photons could be isolated from gas effects.
Types Used in Solar and Climate Measurement
In weather and climate science, radiometers are essential tools for tracking how much energy the sun delivers to Earth’s surface and how much the surface radiates back. Several specialized types handle different parts of this energy budget.
- Pyranometers measure broadband shortwave irradiance, meaning the total sunlight (direct plus scattered) reaching a horizontal surface. They’re the workhorses of solar monitoring stations.
- Pyrheliometers measure only the direct beam of sunlight, called direct normal irradiance. They track the sun across the sky and exclude scattered light.
- Pyrgeometers measure longwave (infrared) radiation, capturing the heat energy emitted by the Earth’s surface and atmosphere.
- Net radiometers combine all of these functions, measuring both incoming and outgoing radiation in shortwave and longwave bands simultaneously. This gives a single picture of whether a surface is gaining or losing energy overall.
Net radiometers used in climate research typically cover shortwave wavelengths from about 300 to 2,800 nanometers and longwave wavelengths from 4,500 to 42,000 nanometers. They record upwelling and downwelling radiation at 30-minute intervals, and these measurements feed into surface energy balance calculations that help scientists understand how heat moves between the ground, the air, and space.
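The bookkeeping behind those energy balance calculations starts with a simple difference of the four quantities a net radiometer measures. Here's a rough Python sketch with made-up readings; the numbers are illustrative, not data from a real station.

```python
# Net radiation from the four components a net radiometer measures (all in W/m^2).
# The readings below are made-up values for illustration.

def net_radiation(sw_down: float, sw_up: float, lw_down: float, lw_up: float) -> float:
    """Positive result: the surface is gaining energy; negative: it is losing energy."""
    return (sw_down - sw_up) + (lw_down - lw_up)

# A sunny midday example: strong incoming sunlight, some of it reflected back up,
# plus downwelling and upwelling longwave (infrared) radiation.
rn = net_radiation(sw_down=800.0, sw_up=160.0, lw_down=320.0, lw_up=420.0)
print(f"Net radiation: {rn:.0f} W/m^2")  # -> 540 W/m^2, so the surface is gaining energy
```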
Microwave Radiometers in Atmospheric Science
Not all radiometers measure light you can see or feel as heat. Microwave radiometers detect radiation at much longer wavelengths, in the gigahertz frequency range, and they’re widely used for atmospheric profiling from aircraft and satellites.
NASA’s Microwave Temperature Profiler is a passive instrument, meaning it doesn’t send out a signal. Instead, it listens for the natural thermal emissions from oxygen molecules in the atmosphere. By tuning to three specific frequencies (55.51, 56.65, and 58.80 GHz) and scanning at multiple elevation angles between straight up and straight down, it collects a set of 20 “brightness temperature” readings every 15 seconds. A statistical algorithm then converts those readings into a vertical profile of air temperature at different altitudes.
The trick that makes this work: each frequency penetrates the atmosphere to a different depth. Higher frequencies are absorbed more quickly, so they reveal conditions closer to the instrument. Lower frequencies see farther. By combining multiple frequencies with multiple viewing angles, the radiometer builds a layered picture of atmospheric temperature without needing to physically send a probe through the air.
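In its simplest form, the statistical step can be pictured as a linear retrieval: each altitude level's temperature is an offset plus a weighted sum of the brightness temperatures, with channels weighted by how deep they see. The sketch below illustrates that idea with made-up coefficients and readings; it is not NASA's actual retrieval code, and every number in it is a placeholder.

```python
# A toy linear retrieval: temperature at each altitude level is estimated as an
# offset plus a weighted sum of the measured brightness temperatures. In practice
# the coefficients come from statistics over past data; all values here are placeholders.

brightness_temps = [220.5, 218.2, 215.9]  # K, one reading per frequency channel

# One (offset, channel weights) pair per altitude level.
retrieval_coefficients = [
    (11.0, [0.005, 0.010, 0.935]),  # level far from the aircraft: leans on the deeper-seeing channel
    (10.5, [0.300, 0.400, 0.250]),
    (12.0, [0.930, 0.010, 0.005]),  # level near the aircraft: leans on the shallow-seeing channel
]

def retrieve_profile(tb, coefficients):
    """Estimate air temperature (K) at each altitude level from brightness temperatures."""
    return [offset + sum(w * t for w, t in zip(weights, tb))
            for offset, weights in coefficients]

print(retrieve_profile(brightness_temps, retrieval_coefficients))  # one temperature per level
```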
Calibration and Precision
A radiometer is only as good as its calibration. The National Institute of Standards and Technology (NIST) maintains reference-grade detector standards that other instruments are calibrated against. Their highest-precision standards, silicon tunnel-trap detectors, achieve uncertainties as low as 0.05% across the 400 to 960 nanometer range. For the near-infrared, indium gallium arsenide sphere-input detectors match that level of precision from 950 to 1,650 nanometers. Pyroelectric standards extend calibration capability from 250 nanometers in the ultraviolet all the way to 2.5 micrometers in the infrared, with uncertainties under 0.34%.
These numbers matter because radiometric measurements underpin everything from satellite climate records to industrial quality control. A small systematic error in a radiometer’s calibration can ripple through decades of data. That’s why national standards laboratories invest heavily in maintaining and improving reference instruments that anchor the entire measurement chain.
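To put those percentages in perspective, here's a quick back-of-the-envelope sketch that turns a relative calibration uncertainty into an absolute error band for a bright-sunlight irradiance reading; the 1,000 W/m² figure is just a convenient round number.

```python
# Turn a relative calibration uncertainty into an absolute error band.
# The 1,000 W/m^2 reading is just a round, illustrative value for bright sunlight.

def uncertainty_band(measured: float, relative_uncertainty: float) -> tuple[float, float]:
    """Return the (low, high) range implied by a fractional calibration uncertainty."""
    delta = measured * relative_uncertainty
    return measured - delta, measured + delta

irradiance = 1_000.0  # W/m^2

for label, u in [("silicon trap standard (0.05%)", 0.0005),
                 ("pyroelectric standard (0.34%)", 0.0034)]:
    lo, hi = uncertainty_band(irradiance, u)
    print(f"{label}: {lo:.1f} to {hi:.1f} W/m^2")
```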
Everyday and Industrial Uses
Beyond research labs and weather stations, radiometers appear in a surprising range of applications. Thermal imaging cameras used in building inspections are essentially radiometers tuned to the infrared wavelengths that warm objects emit. Solar panel installers use pyranometers to assess how much sunlight a rooftop receives before recommending a system size. In manufacturing, radiometers verify that UV curing lamps deliver the right dose to harden coatings and adhesives. Astronomers use radio-frequency radiometers to study distant galaxies and the cosmic microwave background.
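The UV curing case is a good example of how simple the underlying arithmetic can be: the quantity being verified is dose, which is just irradiance multiplied by exposure time. The sketch below works through one made-up example; the irradiance, exposure time, and resulting dose are illustrative, not a real process specification.

```python
# UV curing check: dose (energy per area) is irradiance multiplied by exposure time.
# The irradiance and exposure time below are illustrative only.

def uv_dose_mj_per_cm2(irradiance_mw_per_cm2: float, exposure_s: float) -> float:
    """Dose in mJ/cm^2 from irradiance in mW/cm^2 and exposure time in seconds."""
    return irradiance_mw_per_cm2 * exposure_s

measured_irradiance = 120.0  # mW/cm^2, as read by a UV radiometer under the lamp
exposure_time = 8.0          # seconds the part spends under the lamp

dose = uv_dose_mj_per_cm2(measured_irradiance, exposure_time)
print(f"Delivered dose: {dose:.0f} mJ/cm^2")  # -> 960 mJ/cm^2
```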
What ties all of these together is the same core principle: detecting electromagnetic radiation and converting it into a number. Whether the instrument sits on a weather tower, orbits Earth on a satellite, or spins lazily on your desk in the sunlight, it’s doing some version of the same job that defines every radiometer.

