Lumens per watt (lm/W) is a measure of how efficiently a light source converts electrical power into visible light. The higher the number, the more light you get for each watt of electricity consumed. A standard incandescent bulb produces roughly 12 to 18 lm/W, while a modern LED can deliver 80 to 150 lm/W or more, meaning it creates the same brightness using a fraction of the energy.
How Lumens Per Watt Works
The concept is straightforward: divide the total light output (measured in lumens) by the total electrical power consumed (measured in watts). A bulb that produces 800 lumens while drawing 10 watts has an efficacy of 80 lm/W. One that produces the same 800 lumens but draws 60 watts has an efficacy of about 13 lm/W. Both give you the same brightness, but the first uses far less electricity to do it.
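In code form the calculation is just one division. A minimal sketch, using the bulb figures from the paragraph above:

```python
def luminous_efficacy(lumens: float, watts: float) -> float:
    """Luminous efficacy in lumens per watt: light output divided by power draw."""
    return lumens / watts

# Same brightness, very different efficiency.
print(luminous_efficacy(800, 10))  # 80.0 lm/W (typical LED)
print(luminous_efficacy(800, 60))  # ~13.3 lm/W (typical incandescent)
```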
The technical term for this measurement is “luminous efficacy.” You may also see a related term, “luminous efficiency,” which expresses the same idea as a percentage of the theoretical maximum rather than a raw number. For shopping and comparing bulbs, lumens per watt is the figure that matters.
Why It’s Based on How Your Eyes Work
Not all light is equally visible. Your eyes are most sensitive to yellow-green light at 555 nanometers and progressively less sensitive toward the red and blue ends of the spectrum. The lumen itself is defined around this sensitivity curve. A light source emitting purely at 555 nm, the peak of eye sensitivity, would achieve the theoretical maximum of 683 lm/W, because every bit of its energy lands where your vision responds most strongly.
Real white light sources need to emit across a broad range of wavelengths so that colors look natural. That spread of wavelengths inevitably includes some that your eyes are less sensitive to, which lowers the lumens per watt. Researchers have calculated that an ideal white light source, one that wastes no energy as heat and emits only visible wavelengths, could achieve roughly 250 to 370 lm/W depending on its color temperature and spectral spread.
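To see why a broad spectrum costs efficacy, here is a rough sketch that weights a spectrum by an approximation of the eye's sensitivity curve. The Gaussian stand-in for the real CIE curve and the example spectra are illustrative assumptions, not measured data:

```python
import math

# Rough stand-in for the eye's photopic sensitivity curve V(lambda):
# a Gaussian peaking at 555 nm. (The real CIE curve is tabulated data;
# this approximation is only for illustration.)
def v_lambda(nm: float) -> float:
    return math.exp(-0.5 * ((nm - 555.0) / 45.0) ** 2)

def efficacy_of_radiation(spectrum: dict) -> float:
    """Lumens per radiated watt for a spectrum given as {wavelength_nm: relative power}.

    683 lm/W is the defined ceiling, reached only by pure 555 nm light.
    """
    total_power = sum(spectrum.values())
    weighted = sum(power * v_lambda(nm) for nm, power in spectrum.items())
    return 683.0 * weighted / total_power

# Pure 555 nm light hits the theoretical maximum of 683 lm/W.
print(efficacy_of_radiation({555: 1.0}))
# A crude "white" mix spread across the visible range lands far lower,
# because some of its power falls where the eye is less sensitive.
print(efficacy_of_radiation({450: 1.0, 555: 1.0, 630: 1.0}))
```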
Typical Efficacy by Bulb Type
Different lighting technologies fall into fairly predictable ranges:
- Incandescent bulbs: 12 to 18 lm/W. About 90% of their energy becomes heat rather than light.
- Halogen bulbs: 15 to 25 lm/W. A modest improvement over standard incandescents.
- Compact fluorescents (CFLs): 40 to 70 lm/W. They replaced incandescents in many homes but had drawbacks like slow warm-up times and mercury content.
- LEDs (consumer): 80 to 150 lm/W in typical retail products. High-performance commercial LEDs push even higher.
- Laboratory LEDs: Up to 220 lm/W in the best demonstrations reported to date.
To see the practical difference, consider that a 40-watt incandescent produces about 450 lumens. An LED delivers that same 450 lumens from just 4 to 5 watts. At the other end, replacing a 150-watt incandescent (about 2,600 lumens) with a 25 to 28 watt LED cuts the power draw by more than 120 watts per bulb, which adds up to over 120 watt-hours of energy saved for every hour the light is on.
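To put a dollar figure on that kind of swap, a small sketch like the following works; the electricity rate and daily hours are placeholder assumptions you'd replace with your own:

```python
def replacement_savings(old_watts, new_watts, hours_per_day, price_per_kwh=0.15):
    """Rough yearly savings from swapping one bulb, assuming a flat electricity rate.

    price_per_kwh is an illustrative figure, not a quoted rate; use your own tariff.
    """
    saved_watts = old_watts - new_watts
    kwh_per_year = saved_watts * hours_per_day * 365 / 1000
    return kwh_per_year, kwh_per_year * price_per_kwh

# The 150 W incandescent vs. a 27 W LED from the example above, lit 4 hours a day.
kwh, dollars = replacement_savings(150, 27, hours_per_day=4)
print(f"{kwh:.0f} kWh and ${dollars:.2f} saved per year")
```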
The Trade-Off With Color Quality
There’s a built-in tension between high lumens per watt and accurate color rendering. Because your eyes are less sensitive to the deep reds and blues at the edges of the visible spectrum, a light source can boost its lm/W rating by simply not emitting much light at those wavelengths. The catch is that objects lit by such a source look washed out or slightly off-color.
This trade-off is measured by the Color Rendering Index (CRI), which rates how faithfully a light source reveals colors compared to natural daylight. A score of 100 is perfect. Dropping from a CRI of 95 to a CRI of 80 typically increases luminous efficacy by about 15 to 17%, because the source skips wavelengths your eyes don’t respond to strongly. For a living room or retail space where color accuracy matters, a slightly lower lm/W at a higher CRI is often the better choice. For a parking garage, maximizing lm/W makes more sense.
What Lowers Efficacy in Practice
The lumens per watt figure on a product box reflects testing under specific lab conditions. In real-world use, several factors can reduce actual performance.
Heat is the biggest one for LEDs. As the internal temperature of an LED chip rises, its ability to convert electricity into light drops. Higher temperatures increase the share of energy lost as heat rather than light, creating a feedback loop that also shortens the bulb’s lifespan. This is why LED fixtures with good heat sinks and ventilation consistently outperform cheap ones crammed into enclosed fixtures with no airflow.
Dimming, power supply losses, and the optical design of the fixture (how much light gets absorbed or trapped before reaching the room) also reduce the effective lumens per watt you experience. A 120 lm/W LED chip inside a poorly designed fixture might deliver the equivalent of 70 or 80 lm/W to the space you’re trying to light.
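A back-of-the-envelope way to estimate that delivered figure is to multiply the chip's rating by a few loss factors. The factors below are illustrative guesses, not measurements of any particular fixture:

```python
def delivered_efficacy(chip_lm_per_w, driver_eff=0.88, optical_eff=0.85, thermal_derate=0.90):
    """Estimate the fixture-level lumens per watt actually delivered to the room.

    The three loss factors are assumed example values:
    driver_eff     - power-supply losses converting mains power to LED drive current
    optical_eff    - light absorbed or trapped by lenses, diffusers, and the housing
    thermal_derate - output lost as the chip runs hotter than in lab testing
    """
    return chip_lm_per_w * driver_eff * optical_eff * thermal_derate

# A 120 lm/W chip in a mediocre enclosed fixture ends up in the 70-80 lm/W range.
print(round(delivered_efficacy(120)))  # ~81 lm/W
```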
How to Use This Number When Shopping
When comparing bulbs, lumens tell you how bright the light will be. Watts tell you how much electricity it will consume. Lumens per watt tells you how efficiently the bulb does its job. Two LED bulbs from different brands might both claim 800 lumens, but if one draws 8 watts and the other draws 12 watts, the first is significantly more efficient and will cost less to run over time.
For certified efficiency, Energy Star requires downlight fixtures to hit at least 82 lm/W. Most quality LED bulbs on store shelves today meet or exceed that threshold. If a product doesn’t list its lm/W directly, you can calculate it yourself by dividing the lumens on the box by the watts.
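If you want to go one step beyond that division, a short sketch like this compares two equally bright bulbs on both efficacy and running cost; the yearly hours and electricity rate are assumptions to swap for your own:

```python
def compare_bulbs(lumens, watts_a, watts_b, hours_per_year=1000, price_per_kwh=0.15):
    """Compare two equally bright bulbs: efficacy from the box figures,
    plus the yearly running-cost gap under the assumed hours and rate."""
    efficacy_a = lumens / watts_a
    efficacy_b = lumens / watts_b
    extra_kwh = (watts_b - watts_a) * hours_per_year / 1000
    return efficacy_a, efficacy_b, extra_kwh * price_per_kwh

# Two 800-lumen LEDs: one draws 8 W, the other 12 W.
eff_a, eff_b, extra_cost = compare_bulbs(800, 8, 12)
print(f"{eff_a:.0f} lm/W vs {eff_b:.0f} lm/W; the thirstier bulb costs ${extra_cost:.2f} more per year")
```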
Keep CRI in mind alongside lm/W. For task lighting, reading areas, kitchens, and anywhere color matters, look for a CRI of 90 or above even if it means slightly lower efficacy. For utility spaces, garages, and outdoor security lighting, prioritizing lm/W over CRI is a reasonable call.

