Scientists use infrared to image Earth because it reveals information that visible light simply cannot capture. Infrared wavelengths expose plant health, surface temperatures, mineral compositions, water content in soil and crops, and active wildfires, all from orbit. Different slices of the infrared spectrum serve different purposes, making it one of the most versatile tools in Earth observation.
What Infrared Actually Shows
Human eyes see a narrow band of wavelengths. Infrared light sits just beyond the red end of that visible range; the portion used in Earth observation stretches from about 0.7 micrometers out to around 12.5 micrometers. Scientists divide this range into three useful categories: near-infrared (roughly 0.7 to 1.1 micrometers), shortwave infrared (roughly 1.5 to 2.5 micrometers), and thermal infrared (roughly 10 to 12.5 micrometers). Each interacts with Earth's surface and atmosphere differently, which is why satellites like Landsat 8 and 9 carry sensors covering all three.
Visible-light satellite images look a lot like photographs. They show what your eyes would see from space. Infrared sensors, by contrast, detect energy that objects reflect or emit in wavelengths we can’t see, then translate that energy into data scientists can map and analyze. The result is a richer, more detailed picture of what’s actually happening on the ground.
Tracking Plant Health Before Problems Are Visible
This is one of the most widely used applications of infrared imaging. Healthy plants absorb red and blue light to power photosynthesis, which is why leaves look green to our eyes. But those same healthy leaves also strongly reflect near-infrared light, thanks to the spongy internal cell structure of the leaf. When a plant becomes stressed or starts to die, that cell structure deteriorates and its near-infrared reflectance drops, even while the leaf may still look green.
This difference is invisible to the naked eye. Two fields can look identically green in a normal photograph, but a near-infrared image will reveal that one is thriving while the other is beginning to decline. Scientists exploit this contrast using vegetation indexes that compare how much red light a plant absorbs versus how much near-infrared it reflects. The bigger the gap, the healthier the plant. This works at enormous scales, letting researchers monitor forests, croplands, and ecosystems across entire continents from a single satellite pass.
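The most common of these vegetation indexes is NDVI, the Normalized Difference Vegetation Index: the gap between near-infrared and red reflectance, divided by their sum. A minimal sketch, with illustrative (made-up) reflectance values for a healthy and a stressed plant:

```python
import numpy as np

def ndvi(red, nir):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).

    Values near +1 indicate dense, healthy vegetation; values near 0
    indicate bare soil or stressed plants. Inputs are reflectance (0-1).
    """
    red = np.asarray(red, dtype=float)
    nir = np.asarray(nir, dtype=float)
    # Small epsilon avoids division by zero over water or shadow pixels.
    return (nir - red) / (nir + red + 1e-10)

# Hypothetical reflectances: healthy leaves absorb red strongly and
# reflect NIR strongly; stressed leaves do both less distinctly.
healthy = ndvi(red=0.05, nir=0.50)   # high index, large red/NIR gap
stressed = ndvi(red=0.15, nir=0.25)  # lower index, gap has narrowed
```

The same function works unchanged on whole satellite bands: pass two NumPy arrays of red and near-infrared reflectance and it returns an NDVI map, pixel by pixel.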
Detecting Drought Stress in Crops
Shortwave infrared wavelengths are particularly sensitive to water content in leaves and soil. As a plant loses moisture, its shortwave infrared reflectance changes in measurable ways. Scientists have developed spectral water indexes using combinations of near-infrared and shortwave infrared data to estimate how much water vegetation contains, effectively diagnosing drought stress from space before a farmer might notice wilting in the field.
This matters for precision agriculture. Rather than irrigating an entire farm uniformly, satellite-derived moisture maps can pinpoint which sections need water most urgently. Research has shown strong correlations between these shortwave infrared-based stress indexes and actual fuel moisture content measured on the ground, validating the approach as a practical tool rather than just a laboratory concept.
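One widely used index of this kind is the Normalized Difference Moisture Index (NDMI), which pairs a near-infrared band with a shortwave infrared band; as leaves dry out, SWIR reflectance rises and the index falls. A sketch with illustrative (made-up) reflectance values:

```python
import numpy as np

def ndmi(nir, swir):
    """Normalized Difference Moisture Index: (NIR - SWIR) / (NIR + SWIR).

    Water in leaf tissue absorbs shortwave infrared, so well-watered
    vegetation has low SWIR reflectance and a high index. Inputs are
    reflectance (0-1); works on scalars or whole band arrays.
    """
    nir = np.asarray(nir, dtype=float)
    swir = np.asarray(swir, dtype=float)
    return (nir - swir) / (nir + swir + 1e-10)  # epsilon guards zero pixels

# Hypothetical field pixels: moisture suppresses SWIR reflectance.
well_watered = ndmi(nir=0.45, swir=0.20)
drought_stressed = ndmi(nir=0.35, swir=0.33)
```

Mapping this index across a farm is what turns a single satellite pass into an irrigation priority map: pixels where the index has fallen fastest are the sections losing water first.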
Measuring Surface Temperature From Orbit
Every object warmer than absolute zero emits thermal infrared radiation. The hotter the object, the more it emits. Thermal infrared sensors on satellites detect this emitted energy and convert it into precise temperature readings. The latest Landsat 9 thermal sensor measures temperature differences as small as 0.1°C, covering a 185-kilometer-wide strip of Earth in each pass at roughly 100-meter resolution. That’s sharp enough to distinguish the temperature of a single city block from the park next to it.
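The conversion from detected energy to temperature inverts Planck's law: the sensor measures spectral radiance, and two published calibration constants turn that into a "brightness temperature." A sketch using the calibration constants for Landsat 8's thermal Band 10 (K1 ≈ 774.89, K2 ≈ 1321.08; the radiance value below is illustrative, not from a real scene):

```python
import math

# Landsat 8 TIRS Band 10 calibration constants (from the product metadata).
K1 = 774.8853   # W / (m^2 * sr * um)
K2 = 1321.0789  # kelvin

def brightness_temp_c(radiance):
    """Convert top-of-atmosphere spectral radiance to brightness
    temperature in degrees Celsius: T = K2 / ln(K1 / L + 1) - 273.15."""
    return K2 / math.log(K1 / radiance + 1.0) - 273.15

# Hypothetical pixel radiance of ~9.6 W/(m^2 sr um) corresponds to a
# surface near room temperature; higher radiance means a hotter pixel.
temp = brightness_temp_c(9.6)
```

Brightness temperature assumes the surface emits like a perfect blackbody; real land-surface temperature products apply a further emissivity correction on top of this step.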
This capability is essential for tracking urban heat islands. Paved surfaces and dark rooftops absorb and re-emit far more heat than trees and grass. Thermal infrared imagery from satellites and aircraft reveals those temperature differences at fine scales, showing, for example, that a parking lot may be 20°C hotter than a nearby green space. The EPA uses Landsat thermal data specifically for mapping and studying urban heat islands across U.S. cities.
Ocean scientists rely on the same principle to measure sea surface temperatures globally. Current infrared and microwave satellite sensors achieve accuracy between 0.3°C and 0.6°C, which is precise enough to track ocean currents, monitor El Niño events, and feed data into weather and climate models.
Seeing Through Smoke and Haze
Visible light scatters easily when it hits tiny particles in the atmosphere. Smoke from wildfires, industrial haze, and dust storms can all block a satellite’s view in visible wavelengths, turning images into a white or gray blur. Shortwave infrared wavelengths are longer and pass through these particles with much less scattering and absorption. This lets scientists image the ground beneath smoke plumes and hazy skies where visible-light cameras would be useless.
During large wildfire events, this capability is critical. Emergency responders need to know where the fire actually is and what it’s burning, not just where the smoke has drifted. Shortwave infrared cuts through the smoke to reveal the landscape underneath.
Spotting Active Wildfires
Thermal and mid-wave infrared sensors do more than see through smoke. They detect the fires themselves. An active fire is dramatically hotter than the surrounding landscape. Smoldering fires typically range from 175°C to 575°C, while intense blazes reach 525°C to 925°C. The mid-wave infrared channel around 4 micrometers is especially sensitive to these temperatures, making fire pixels stand out sharply against a background that might only be 35°C.
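Planck's law explains why the 4-micrometer channel is the fire channel of choice: radiance at shorter infrared wavelengths grows far more steeply with temperature. A quick comparison for a 600 K smoldering fire against a 35°C (308 K) background, using physical constants (the temperatures are illustrative):

```python
import math

H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s
KB = 1.381e-23  # Boltzmann constant, J/K

def planck_radiance(wavelength_m, temp_k):
    """Blackbody spectral radiance (W sr^-1 m^-3) from Planck's law."""
    a = 2.0 * H * C**2 / wavelength_m**5
    b = H * C / (wavelength_m * KB * temp_k)
    return a / math.expm1(b)

fire_k, background_k = 600.0, 308.0  # smoldering fire vs 35 C landscape

# How many times brighter the fire pixel is than its surroundings:
mwir_contrast = planck_radiance(4e-6, fire_k) / planck_radiance(4e-6, background_k)
tir_contrast = planck_radiance(11e-6, fire_k) / planck_radiance(11e-6, background_k)
```

At 4 micrometers the fire outshines the background by a factor of a few hundred, while at 11 micrometers the contrast is under a factor of ten, which is why even a fire filling a small fraction of a pixel still stands out in the mid-wave channel.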
The VIIRS sensor, which is replacing the aging MODIS instruments (scheduled for shutdown in late 2026 or early 2027), detects active fires at roughly 750-meter resolution. Geostationary satellites like GOES can spot fires too, at coarser 2-kilometer resolution but with the advantage of scanning the same area every few minutes. Together, these infrared tools give fire managers near-real-time maps of where fires are burning, how intensely, and in which direction they’re spreading.
Identifying Minerals and Rock Types
Different minerals reflect and absorb infrared wavelengths in characteristic patterns. Clay minerals, for instance, have distinct absorption features in the shortwave infrared range that don’t appear in visible light. Infrared spectroscopy from satellites can distinguish between minerals like kaolinite, smectite, and illite, and can even detect chemical variations within a single mineral group or identify minerals that form together in thin layers versus separate grains.
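One common way to match a pixel's spectrum against a reference mineral spectrum is the spectral angle: treat each spectrum as a vector and measure the angle between them, which compares spectral shape while ignoring overall brightness. A minimal sketch with made-up five-band SWIR reflectances, not real library spectra:

```python
import math

def spectral_angle(a, b):
    """Angle in radians between two spectra treated as vectors.

    A small angle means similar spectral shape (likely the same
    material) regardless of illumination differences.
    """
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    # Clamp for floating-point safety before taking the arccosine.
    return math.acos(max(-1.0, min(1.0, dot / (norm_a * norm_b))))

# Illustrative spectra: the dip in the middle band stands in for a
# clay absorption feature in the shortwave infrared.
reference_clay = [0.42, 0.40, 0.28, 0.35, 0.33]
pixel_match = [0.38, 0.36, 0.25, 0.31, 0.30]    # same shape, darker pixel
pixel_other = [0.30, 0.31, 0.33, 0.34, 0.35]    # no absorption feature

match_angle = spectral_angle(reference_clay, pixel_match)
other_angle = spectral_angle(reference_clay, pixel_other)
```

Because the angle ignores vector length, a shaded hillside and a sunlit slope of the same clay map to nearly the same angle, which is what makes this kind of matching robust across a real scene.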
Geologists use this to map rock types and alteration zones across large, remote areas without needing to collect samples on foot. It’s particularly valuable for geothermal exploration, where specific temperature-indicator minerals signal the presence of subsurface heat. Infrared imaging identifies most of these indicator minerals from both neutral and acidic geothermal environments, making it a first-pass survey tool that narrows down where to invest in expensive ground-level exploration.
Why Not Just Use Visible Light?
Visible light captures what things look like. Infrared captures what things are doing: how hot they are, how much water they contain, what they’re made of, and how vigorously they’re growing. A green forest and a green-painted parking lot may look similar in visible wavelengths, but they’re completely different in near-infrared and thermal infrared. One reflects near-infrared strongly and stays cool; the other absorbs it and radiates heat.
Infrared also works at night. Thermal sensors don’t need sunlight because they detect emitted heat rather than reflected light. This makes overnight temperature monitoring, nighttime fire detection, and 24-hour ocean temperature tracking possible. Visible-light cameras are blind once the sun sets, but infrared sensors keep collecting data around the clock.

