Emissivity is a measure of how efficiently a surface radiates heat energy compared to a perfect radiator. It’s expressed as a value between 0 and 1, where 1 means the surface emits the maximum possible thermal radiation at a given temperature, and 0 means it emits none. Every material has its own emissivity, and that number matters whenever you need to measure temperature remotely, design energy-efficient buildings, or understand how heat moves between objects.
How the Scale Works
The baseline for emissivity is something physicists call a “blackbody,” a hypothetical perfect radiator that absorbs all energy hitting its surface and re-emits all of it. Because it reflects nothing, it would appear completely black, hence the name. A blackbody has an emissivity of 1.0. Every real material falls somewhere below that mark, because real surfaces always reflect or transmit some energy rather than radiating all of it.
A material with an emissivity of 0.90 radiates 90% as much thermal energy as a blackbody at the same temperature. One with an emissivity of 0.03 radiates only 3%. That gap has enormous practical consequences: a dull, oxidized surface and a mirror-polished surface at the exact same temperature will send out wildly different amounts of infrared energy.
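That scaling is easy to see numerically. As a minimal sketch using the Stefan-Boltzmann law (the emissivity values are the illustrative ones from this article; the function name is ours), the following compares the radiation leaving a dull oxidized surface and a polished one at the same temperature:

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiated_power(emissivity: float, temp_kelvin: float) -> float:
    """Power radiated per square meter (W/m^2) by a surface at temp_kelvin."""
    return emissivity * SIGMA * temp_kelvin ** 4

temp = 373.15  # both surfaces at ~100 C
oxidized = radiated_power(0.80, temp)  # dull oxidized steel
polished = radiated_power(0.03, temp)  # mirror-polished aluminum

print(f"oxidized: {oxidized:.0f} W/m^2")
print(f"polished: {polished:.0f} W/m^2")
print(f"ratio: {oxidized / polished:.1f}x")  # same temperature, ~27x the radiation
```

Same temperature, wildly different infrared output: exactly the gap an infrared sensor has to contend with.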
What Determines a Material’s Emissivity
The biggest factor is the material itself. Metals, especially polished ones, tend to have very low emissivity. Polished aluminum sits around 0.03, meaning it radiates only about 3% of what a blackbody would at the same temperature. Non-metals behave differently. Wood falls in the 0.80 to 0.90 range. Human skin has an emissivity of about 0.98, making it nearly a perfect radiator in the infrared spectrum, regardless of skin pigmentation (a study of 65 participants found a mean of 0.972, ranging from 0.96 to 0.99).
Surface condition changes things dramatically. Oxidized steel has an emissivity around 0.80, far higher than its polished counterpart, and research indicates that oxidation has a more dominant effect on emissivity than roughness alone. As an oxide layer grows thicker, emissivity climbs: aluminum coated with a layer of aluminum oxide, for instance, shows a measurable increase in emissivity as the oxide thickens. Roughening a surface also raises emissivity, because the texture creates tiny cavities that trap and re-emit radiation, effectively adding surface area for radiating infrared energy, an effect that holds across wavelengths.
There’s also a useful physical law at work here: at any given wavelength, a material’s emissivity equals its absorptivity. In plain terms, a surface that’s good at absorbing thermal radiation is equally good at emitting it, and a surface that reflects most incoming energy (like polished metal) emits very little. This is why dark, rough surfaces both heat up quickly in sunlight and cool down quickly by radiating heat away.
Total vs. Spectral Emissivity
Emissivity isn’t always a single number. A material can emit differently at different wavelengths of infrared light. “Spectral emissivity” describes how well a surface radiates at one specific wavelength, while “total emissivity” averages the radiation across all wavelengths. Engineers often simplify calculations by treating a material as a “gray body,” assuming its emissivity stays roughly constant across the wavelengths they care about. For most everyday applications, total emissivity is the number you’ll encounter. But in precision work like semiconductor manufacturing or remote sensing, the wavelength-by-wavelength picture matters.
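One way to see the relationship is to compute total emissivity as a Planck-weighted average of spectral emissivity. The material below is hypothetical, a step profile (0.9 below 8 µm, 0.4 above) chosen only to show that the same surface can have different total emissivities at different temperatures, because the blackbody spectrum shifts toward shorter wavelengths as temperature rises:

```python
import math

# Physical constants (SI units)
h = 6.626e-34   # Planck constant, J*s
c = 2.998e8     # speed of light, m/s
k = 1.381e-23   # Boltzmann constant, J/K

def planck(wavelength_m: float, temp_k: float) -> float:
    """Blackbody spectral radiance at one wavelength (Planck's law)."""
    return (2 * h * c**2 / wavelength_m**5) / (
        math.exp(h * c / (wavelength_m * k * temp_k)) - 1)

def spectral_emissivity(wavelength_m: float) -> float:
    # Hypothetical step profile: emits well below 8 um, poorly above it.
    return 0.9 if wavelength_m < 8e-6 else 0.4

def total_emissivity(temp_k: float, lo=1e-6, hi=100e-6, n=20000) -> float:
    """Planck-weighted average of spectral emissivity over [lo, hi] (midpoint rule)."""
    dw = (hi - lo) / n
    num = den = 0.0
    for i in range(n):
        w = lo + (i + 0.5) * dw
        b = planck(w, temp_k)
        num += spectral_emissivity(w) * b * dw
        den += b * dw
    return num / den

print(f"total emissivity at  300 K: {total_emissivity(300):.3f}")
print(f"total emissivity at 1000 K: {total_emissivity(1000):.3f}")
```

At 300 K most of the radiation falls beyond 8 µm, so the low-emissivity band dominates the average; at 1000 K the spectrum has shifted below 8 µm and the total emissivity comes out much higher. A true gray body, by contrast, would give the same number at both temperatures.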
Common Emissivity Values
A few reference points help make the scale intuitive:
- Polished aluminum: 0.03
- Water: 0.67
- Oxidized steel: 0.80
- Wood: 0.80 to 0.90
- Human skin: 0.96 to 0.99
The pattern is clear: shiny metals are poor emitters, while organic materials, oxides, and water are strong emitters. This is why a metal pan on a stove feels scorching hot to your hand (it conducts heat well) but doesn't glow nearly as brightly on a thermal camera as your own skin does at a much lower temperature.
Why Emissivity Matters for Temperature Measurement
Infrared thermometers and thermal cameras don't measure temperature directly. They detect the infrared energy leaving a surface and calculate temperature from that signal. The calculation only works if the device knows the emissivity of the surface it's pointed at. If you aim an infrared thermometer set to 0.95 at a polished aluminum surface with an actual emissivity of 0.03, the reading will be wildly wrong: the device expects far more radiation than the surface actually emits, so it reports a temperature far below the true one.
Most consumer infrared thermometers ship with a default emissivity setting around 0.95, which works well for skin, food, drywall, and most household surfaces. For metals, plastics, or specialty materials, you need to adjust the setting or apply a high-emissivity tape or coating to the surface before measuring. In thermal imaging, this same principle affects temperature maps: high-emissivity surfaces appear brighter and register accurate temperatures, while low-emissivity materials can appear misleadingly cool without proper calibration.
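To see how large the error can get, here is a deliberately simplified model of such a device. The `indicated_temperature` function is hypothetical: it assumes an idealized total-radiation thermometer that captures all wavelengths and ignores reflected ambient radiation, both effects a real instrument must handle.

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def indicated_temperature(true_temp_k: float, actual_eps: float,
                          setting_eps: float) -> float:
    """Temperature (K) an idealized total-radiation IR thermometer reports.

    Simplified model: the detector sees actual_eps * sigma * T^4, and the
    device back-solves for temperature assuming setting_eps. Reflected
    ambient radiation and the detector's finite wavelength band are ignored.
    """
    detected = actual_eps * SIGMA * true_temp_k ** 4
    return (detected / (setting_eps * SIGMA)) ** 0.25

true_t = 373.15  # surface really at ~100 C
# Polished aluminum (eps ~0.03) measured with the default 0.95 setting:
reading = indicated_temperature(true_t, actual_eps=0.03, setting_eps=0.95)
print(f"reported: {reading - 273.15:.0f} C instead of 100 C")  # absurdly low
```

In this idealized model the reported value comes out far below freezing. A physical instrument would also pick up room radiation reflected off the shiny surface, which typically drags the reading toward ambient temperature instead, but either way the number bears little relation to the true 100 °C.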
Emissivity in Energy Efficiency
Low-emissivity coatings, commonly called “low-E” coatings, are one of the most widespread commercial applications of this property. The thin metallic coatings applied to window glass reduce the amount of heat the glass radiates, keeping warmth inside during winter and outside during summer. The principle is straightforward: a surface with low emissivity holds onto its thermal energy instead of radiating it away.
The same idea applies to roofing and insulation. In cold climates, a low-emissivity roof adds resistance to heat flowing out of the building, reducing heating costs. Reflective foil insulation in attics works partly for the same reason: its shiny, low-emissivity surface radiates very little heat compared to standard building materials. In hot climates, the goal flips. Roofs with higher emissivity radiate absorbed solar heat more efficiently, helping the building cool down after sunset.
The Core Formula
The total power radiated by a surface is governed by the Stefan-Boltzmann law, in which emissivity acts as a simple multiplier: the energy emitted per square meter equals the emissivity times the Stefan-Boltzmann constant times the absolute temperature raised to the fourth power. That fourth-power relationship means small changes in temperature cause large changes in radiated energy, and emissivity scales the whole output up or down. A surface with an emissivity of 0.50 radiates exactly half as much energy as a blackbody at the same temperature. When a surface exchanges radiation with much larger surroundings, the net heat flow depends on the difference of the fourth powers of the two temperatures, multiplied by the emissivity of the radiating surface; two surfaces facing each other at close range require a combined emissivity factor instead.
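The small-surface-in-large-surroundings case can be sketched in a few lines. The example numbers are illustrative: skin emissivity from the values above, a 33 °C skin surface in a 20 °C room.

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def net_radiative_flux(eps: float, t_surface_k: float,
                       t_surroundings_k: float) -> float:
    """Net radiated power per m^2, small surface in large surroundings:
    q = eps * sigma * (Ts^4 - Tsurr^4). Closely facing parallel surfaces
    need a combined-emissivity factor instead of a single eps."""
    return eps * SIGMA * (t_surface_k ** 4 - t_surroundings_k ** 4)

# Skin (eps ~0.98) at ~33 C in a 20 C room:
q = net_radiative_flux(0.98, 306.15, 293.15)
print(f"net loss: {q:.0f} W/m^2")
```

Because both temperatures enter at the fourth power, raising either one by a few degrees shifts the net flow noticeably, and at furnace temperatures the same emissivity difference moves kilowatts rather than tens of watts.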
This is why emissivity becomes increasingly important at high temperatures. In a steel furnace or a spacecraft re-entering the atmosphere, even small differences in surface emissivity translate into large differences in heat radiated, affecting both structural integrity and thermal management.

