What Remote Sensing Satellites Are and How They Work

Remote sensing satellites are spacecraft equipped with sensors that collect information about Earth’s surface, oceans, and atmosphere without making physical contact. They do this by detecting electromagnetic energy, whether it’s sunlight bouncing off a forest canopy, heat radiating from an ocean current, or microwave signals penetrating through cloud cover. Hundreds of these satellites currently orbit Earth, feeding data to scientists, farmers, emergency responders, and governments worldwide.

How Remote Sensing Satellites Work

Everything on Earth reflects, absorbs, or emits electromagnetic energy. Remote sensing satellites carry instruments designed to detect that energy across a wide range of wavelengths, from visible light to infrared to microwaves. The primary energy source is the Sun. When sunlight hits a surface, part of it bounces back toward space, and satellites capture that reflected energy to build detailed images and measurements.

Different surfaces interact with energy in distinct ways, which is what makes the technology so powerful. Snow reflects up to 90% of incoming solar radiation, while the ocean reflects only about 6% and absorbs the rest. That absorbed energy gets re-emitted as infrared radiation, which satellites can also detect. By measuring how much energy comes back at specific wavelengths, instruments can distinguish between soil types, vegetation health, water bodies, rock formations, and human-made structures.
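The reflect-versus-absorb split can be sketched numerically. The albedo values below are the approximate figures cited above (snow ~90%, ocean ~6%); the incoming irradiance is an illustrative round number, not a measured value.

```python
# Illustrative sketch: how surface albedo (the fraction of sunlight
# reflected) splits incoming solar energy into reflected and absorbed
# parts. Albedo figures are the approximate values cited in the text.

ALBEDO = {"snow": 0.90, "ocean": 0.06}  # fraction of sunlight reflected

def energy_split(surface: str, incoming_wm2: float) -> tuple[float, float]:
    """Return (reflected, absorbed) irradiance in W/m^2 for a surface."""
    reflected = incoming_wm2 * ALBEDO[surface]
    return reflected, incoming_wm2 - reflected

# With an illustrative ~1000 W/m^2 of sunlight reaching the surface:
for surface in ALBEDO:
    r, a = energy_split(surface, 1000.0)
    print(f"{surface}: reflects {r:.0f} W/m^2, absorbs {a:.0f} W/m^2")
```

The absorbed portion is what later re-emerges as infrared radiation, which thermal sensors pick up.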

The number of wavelength bands an instrument can detect determines how much detail it can pull from a scene. Some instruments capture 3 to 10 bands and are considered multispectral. Others capture hundreds or even thousands of bands, earning the label hyperspectral. With that level of detail, researchers can tell the difference between individual mineral types or pick out specific species of vegetation.

Passive vs. Active Sensors

Remote sensing instruments fall into two categories based on where their energy comes from. Passive sensors rely on naturally occurring energy, primarily sunlight. They work well during the day for detecting reflected light, and they can pick up heat emissions at night as long as the thermal signal is strong enough. Most optical imaging satellites use passive sensors.

Active sensors generate their own energy. They send out a pulse of radiation toward the ground and then measure what bounces back. This gives them a major advantage: they can collect data at any time of day, in any season, and through cloud cover. Synthetic aperture radar (SAR) is the most widely used active sensor in remote sensing. Because microwave energy passes through clouds, SAR satellites can image flood zones, ice sheets, and tropical forests that are frequently hidden under cloud layers.
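The basic geometry behind any pulse-and-echo sensor can be shown with the standard ranging relation: the instrument times the round trip of its own pulse, and half the round-trip distance gives the range to the surface. This is a simplified illustration, not the full SAR imaging process.

```python
# Minimal sketch of active-sensor ranging: the instrument emits a pulse,
# times the echo, and computes range = (speed of light * round trip) / 2.

C = 299_792_458.0  # speed of light, m/s

def range_from_echo(round_trip_s: float) -> float:
    """Distance to the reflecting surface, in meters."""
    return C * round_trip_s / 2.0

# A satellite roughly 700 km above the surface hears a straight-down
# echo after about 4.67 milliseconds:
print(range_from_echo(4.67e-3) / 1000, "km")
```

Real SAR goes much further, combining many such pulses along the flight path to synthesize a fine-resolution image, but the timing principle is the same.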

Orbits Shape What Satellites Can See

The orbit a satellite follows determines how often it revisits a location, how much of Earth it can cover, and what kind of detail it captures. Two orbit types dominate remote sensing.

Geostationary satellites orbit at 35,786 km above the equator, matching Earth’s rotation so they hover over the same spot continuously. This makes them ideal for weather monitoring, where you need nearly constant coverage of one region. The GOES series of weather satellites, developed by NASA and operated by NOAA, uses geostationary orbits to track storm systems across the Americas in near real time.

Polar-orbiting satellites fly much lower, typically around 705 km, and pass near both poles on each revolution. Most are sun-synchronous, meaning they cross the same location at the same local solar time on every pass. This consistency in lighting conditions makes it easier to compare images taken days, weeks, or months apart. Because they sweep across different strips of Earth with each orbit, polar-orbiting satellites gradually build complete global coverage. NASA’s Aqua satellite, for instance, orbits at about 705 km and collects data on water cycles, sea surface temperatures, and atmospheric properties across the entire planet.
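The link between altitude and behavior falls out of basic orbital mechanics. A back-of-the-envelope calculation with the standard two-body period formula, T = 2π√(a³/μ), shows why a 705 km orbit circles Earth in well under two hours while a geostationary orbit takes a full day. (This assumes a circular orbit and ignores perturbations.)

```python
# Back-of-the-envelope orbital period from altitude, using the standard
# two-body formula T = 2*pi*sqrt(a^3 / mu). Circular orbit assumed.
import math

MU_EARTH = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6_371_000.0       # mean Earth radius, m

def orbital_period_minutes(altitude_m: float) -> float:
    a = R_EARTH + altitude_m  # semi-major axis of a circular orbit
    return 2 * math.pi * math.sqrt(a**3 / MU_EARTH) / 60

print(orbital_period_minutes(705_000))     # ~99 min: ~14-15 orbits per day
print(orbital_period_minutes(35_786_000))  # ~1436 min: about one day (geostationary)
```

Roughly 14 to 15 orbits per day is what lets a polar orbiter sweep a fresh strip of Earth on each pass and build up global coverage.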

Four Types of Resolution

Resolution is the key concept for understanding what any remote sensing dataset can and cannot do. There are four kinds, and they often involve trade-offs.

  • Spatial resolution refers to the size of each pixel in the image and the ground area it represents. A 10-meter resolution means each pixel covers a 10-by-10-meter patch of Earth. The finer the resolution (the lower the number), the more detail you can see.
  • Spectral resolution describes how many wavelength bands the sensor captures and how narrow those bands are. More and narrower bands mean finer spectral resolution, which lets researchers distinguish between materials that look similar in visible light but differ in other wavelengths.
  • Temporal resolution is how often the satellite revisits the same spot. Weather monitoring demands high temporal resolution because conditions change by the hour. Studying seasonal vegetation changes might not need frequent revisits but benefits from higher spatial or spectral detail instead.
  • Radiometric resolution measures how many brightness levels the sensor can distinguish within each pixel. An 8-bit instrument records 256 levels of brightness. Higher radiometric resolution helps detect subtle differences, like slight variations in ocean color that indicate changes in water quality.
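Two of these definitions reduce to simple arithmetic, which is worth making concrete: an n-bit sensor distinguishes 2^n brightness levels, and pixel size determines how many pixels cover a given ground area.

```python
# Quick numeric illustrations of two of the resolution concepts above.

def brightness_levels(bit_depth: int) -> int:
    """Radiometric resolution: an n-bit sensor records 2**n brightness levels."""
    return 2 ** bit_depth

def pixels_per_km2(pixel_size_m: float) -> float:
    """Spatial resolution: pixels needed to cover one square kilometer."""
    return (1000.0 / pixel_size_m) ** 2

print(brightness_levels(8))    # 256 levels, as for the 8-bit instrument above
print(brightness_levels(12))   # 4096 levels: finer radiometric detail
print(pixels_per_km2(10.0))    # 10,000 pixels per km^2 at 10 m resolution
print(pixels_per_km2(30.0))    # ~1,111 pixels per km^2 at 30 m resolution
```

The 10 m versus 30 m comparison shows how quickly data volume grows with spatial detail, which is one reason the trade-offs between resolution types exist at all.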

Major Satellite Missions

The Landsat program is the longest-running continuous Earth observation effort, with data stretching back to 1972. The latest in the series, Landsat 9, launched on September 27, 2021, and carries two instruments: one that captures visible and near-infrared light and another that measures thermal infrared emissions. Landsat 9 has a 16-day repeat cycle on its own, but because it’s offset by 8 days from its partner Landsat 8, together the two satellites image the same location every 8 days. Both orbit at 705 km.

Europe’s Sentinel-2 satellites, part of the Copernicus program, carry instruments that sample 13 spectral bands. Four of those bands capture data at 10-meter spatial resolution, six at 20 meters, and three at 60 meters. This combination of fine spatial and spectral detail makes Sentinel-2 especially useful for agriculture, forestry, and land-use mapping.

Precision Agriculture

Farming is one of the most practical, everyday applications of remote sensing data. Satellites like Sentinel-2 and Landsat monitor crop health across vast areas by measuring vegetation indices. The most common is NDVI (Normalized Difference Vegetation Index), which compares how much red and near-infrared light plants reflect. Healthy, photosynthetically active vegetation reflects more near-infrared light, so NDVI provides a quick snapshot of plant vigor across an entire field or region.
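NDVI is a simple normalized difference: (NIR − Red) / (NIR + Red), which ranges from −1 to +1 and rises toward +1 for healthy vegetation. A minimal sketch, with illustrative reflectance values rather than readings from a real scene:

```python
# NDVI as described above: healthy vegetation reflects strongly in the
# near-infrared and absorbs red light for photosynthesis, so the
# normalized difference climbs toward +1 for vigorous plants.
# Reflectance values below are illustrative, not from a real scene.

def ndvi(nir: float, red: float) -> float:
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    return (nir - red) / (nir + red)

print(round(ndvi(nir=0.50, red=0.08), 2))  # dense, healthy crop: high NDVI
print(round(ndvi(nir=0.30, red=0.20), 2))  # sparse or stressed vegetation
print(round(ndvi(nir=0.15, red=0.12), 2))  # bare soil: low NDVI
```

In practice the same formula is applied per pixel across an entire satellite image, turning raw reflectance bands into a field-by-field map of plant vigor.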

These measurements are surprisingly accurate. Studies using Sentinel-2 data and machine learning models have estimated wheat grain yield before harvest with an accuracy of 86% and a statistical fit (R²) of 0.83. For banana crops, vegetation indices showed strong correlations with yield during flowering stages, with R² values of 0.71 for NDVI and 0.77 for a related index called EVI2. Satellites also track soil moisture, nutrient content, and the timing of growth stages across entire landscapes, helping farmers apply water and fertilizer precisely where and when it’s needed rather than blanketing whole fields.

Disaster Monitoring and Response

Remote sensing satellites have become essential tools during natural disasters, particularly flooding. NASA’s near real-time flood detection system produces daily, nearly global flood maps at approximately 250-meter resolution using data from instruments aboard polar-orbiting satellites. A more advanced system using VIIRS (a sensor aboard newer weather satellites) generates global flood maps with just a one-hour delay.

The International Charter on Space and Major Disasters coordinates satellite imagery from multiple agencies during emergencies. Since November 2000, it has been activated over 800 times, and more than half of those activations (428) have been for flood events. Europe’s Copernicus Emergency Management Service adds another layer, using Sentinel-1 SAR data to map flooded areas even when clouds block optical sensors.

Speed remains the biggest challenge. By the time satellite data is downloaded, processed, and delivered to first responders, conditions on the ground may have changed. A 2021 experimental mission demonstrated that machine learning algorithms can run directly on a satellite’s onboard processors, potentially cutting the delay between image capture and usable flood maps from hours to minutes.

From Raw Data to Usable Information

Satellite instruments produce raw digital signals, not ready-made maps. That raw data goes through a standardized pipeline of processing levels. At Level 0, the data is unprocessed with communication artifacts stripped out. Level 1 adds time stamps, location information, and calibration so the data corresponds to actual physical units of energy. Level 2 translates instrument readings into meaningful measurements like sea surface temperature or vegetation cover at the same spatial detail as the original observations. Level 3 places those measurements onto uniform grids, often summarized over weekly or monthly periods. Level 4 combines satellite data with models to produce higher-level products, such as climate forecasts or carbon budget estimates.
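The level ladder described above can be summarized as a simple lookup. The one-line descriptions below paraphrase this section's definitions; they are a mnemonic sketch, not an official agency specification.

```python
# Sketch of the standard processing-level ladder described in the text.
# Descriptions paraphrase the level definitions above.

PROCESSING_LEVELS = {
    0: "Raw instrument data, communication artifacts removed",
    1: "Time-stamped, geolocated, and calibrated to physical units of energy",
    2: "Derived measurements (e.g. sea surface temperature) at sensor resolution",
    3: "Measurements mapped onto uniform grids, often weekly or monthly summaries",
    4: "Satellite data combined with models into higher-level products",
}

def describe(level: int) -> str:
    return f"Level {level}: {PROCESSING_LEVELS[level]}"

for lvl in sorted(PROCESSING_LEVELS):
    print(describe(lvl))
```

A user asking "which level do I need?" is usually choosing between Level 2 (per-observation detail) and Level 3 (gridded, analysis-ready summaries), as the next paragraph notes.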

Most researchers and applications use Level 2 or Level 3 data. The raw levels are mainly relevant to specialists building new algorithms or calibrating instruments. Major data archives from NASA, the European Space Agency, and the U.S. Geological Survey make processed data freely available, which has driven an explosion of remote sensing applications over the past two decades.