What Is Rain Fade and How Can You Fix It?

Rain fade is the loss of satellite or wireless signal strength caused by rain along the signal’s path. When radio waves travel through falling rain, individual water droplets absorb and scatter the electromagnetic energy, weakening the signal before it reaches its destination. The effect becomes significant at frequencies above about 10 GHz, which includes most modern satellite TV, satellite internet, and emerging 5G wireless services.

How Rain Weakens a Signal

Radio waves carry energy through the atmosphere as electromagnetic radiation. When those waves encounter a raindrop, two things happen simultaneously. First, the water absorbs some of the wave’s energy and converts it to heat. Second, the droplet scatters part of the remaining energy in random directions, diverting it away from the receiving antenna. The combined result is a weaker signal arriving at your dish or receiver.

The severity depends heavily on how large the raindrops are and how many of them the signal passes through. Light drizzle with small, sparse droplets causes minimal loss. A tropical downpour with large, dense drops can attenuate a signal dramatically. Rainfall rates above 40 mm per hour, the kind you’d see in a heavy thunderstorm, cause substantially more attenuation than moderate rain in the 10 to 40 mm per hour range. Tropical regions like Southeast Asia regularly experience rates above 100 mm per hour, which is why rain fade is a much bigger engineering challenge near the equator than in temperate climates.
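Engineers commonly model this dependence with a power law: specific attenuation γ (in dB per kilometer) equals k · R^α, where R is the rain rate in mm per hour and the coefficients k and α depend on frequency and polarization (ITU-R Recommendation P.838 tabulates them). A minimal sketch, using placeholder coefficients rather than looked-up table values:

```python
def specific_attenuation(rain_rate_mm_per_hr, k, alpha):
    """Specific attenuation (dB/km) from the power-law model gamma = k * R**alpha.

    k and alpha depend on frequency and polarization; the values used
    below are illustrative placeholders, not real ITU-R table entries.
    """
    return k * rain_rate_mm_per_hr ** alpha

# Same hypothetical coefficients, two very different rain rates:
k, alpha = 0.02, 1.2  # placeholder values for a Ku-band-like frequency
drizzle = specific_attenuation(2.0, k, alpha)      # light drizzle
downpour = specific_attenuation(100.0, k, alpha)   # tropical downpour
```

Because α is greater than 1, doubling the rain rate more than doubles the loss, which is why heavy tropical rain is disproportionately damaging.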

Why Higher Frequencies Are More Vulnerable

The relationship between frequency and rain fade is straightforward: the higher the frequency, the worse the problem. Signals below about 10 GHz pass through rain with relatively little trouble. Once you cross that threshold, attenuation climbs steeply.

Satellite communications use several frequency bands, each with different vulnerability levels:

  • Ku band (12 to 18 GHz): The most common band for satellite TV and many internet services. Moderate rain fade during heavy storms, typically requiring a built-in signal margin of 3 to 5 decibels to maintain 99.9% annual uptime.
  • Ka band (27 to 40 GHz): Used by newer, higher-throughput satellite internet services. Significantly more susceptible to rain fade than Ku band.
  • V band (40 to 75 GHz): Used in some next-generation satellite and terrestrial links. Extremely vulnerable to precipitation of any kind.

To put the numbers in perspective, at 12 GHz during a very heavy rainstorm (around 237 mm per hour), a signal loses roughly 4.7 decibels for every kilometer it travels through rain. At 38 GHz, that loss jumps to about 18.4 dB per kilometer even in less extreme conditions. Since a satellite signal typically passes through several kilometers of rain-bearing lower atmosphere on its way to the ground, these losses add up quickly.
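The total loss is just the per-kilometer figure multiplied by the length of the path that actually lies inside the rain. A quick sketch using the figures quoted above, with the 4 km effective rain path being an assumption for illustration:

```python
def total_rain_loss_db(db_per_km, rain_path_km):
    """Total rain attenuation over the portion of the path inside rain."""
    return db_per_km * rain_path_km

# Figures from the text: 4.7 dB/km at 12 GHz, 18.4 dB/km at 38 GHz.
# The 4 km effective rain path is an assumed value for illustration.
loss_12ghz = total_rain_loss_db(4.7, 4.0)   # 18.8 dB
loss_38ghz = total_rain_loss_db(18.4, 4.0)  # 73.6 dB
```

Even with a generous link margin, a 70-plus dB hit at 38 GHz is unrecoverable, which is why the highest bands are reserved for short paths or fair-weather service.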

What It Looks Like for You

If you have satellite TV, rain fade is the reason your picture pixelates, freezes, or cuts out entirely during a heavy downpour. The signal from the satellite, traveling roughly 36,000 kilometers from geostationary orbit, hits a band of heavy rain in the last few kilometers before your dish and loses enough power that your receiver can no longer decode it cleanly. Light rain might cause brief pixelation. A severe thunderstorm directly overhead can knock out your signal completely for minutes at a time.

Satellite internet users experience the same issue as increased latency, slower speeds, or total connection drops during storms. The newer Ka-band services that offer faster speeds are, somewhat ironically, more prone to rain fade than older Ku-band systems precisely because they operate at higher frequencies.

For 5G wireless networks, rain fade matters in a different way. The millimeter-wave frequencies that 5G uses for its fastest speeds (typically 26 to 40 GHz) fall squarely in the range most affected by rain. At 26 GHz, a signal can lose roughly 20 dB over just one kilometer in heavy rain. Since 5G small cells are designed for short-range coverage, the distances involved are much smaller than satellite links, but the attenuation per kilometer is high enough that network planners have to account for it.

Rain vs. Snow vs. Other Precipitation

Rain is not the only type of precipitation that degrades signals, but it is the most common culprit at the frequencies used by most consumer satellite and wireless services. Wet snow, which contains a higher proportion of liquid water, causes attenuation through both absorption and scattering. Dry snow primarily scatters the signal rather than absorbing it, and its impact doesn’t change much whether the temperature is right at freezing or down to minus 20°C.

At very high frequencies (above 200 GHz, well beyond current consumer technology), snow actually causes more signal loss than rain. But for the Ku and Ka bands that satellite TV and internet operate on, rain remains the dominant concern. Fog and heavy clouds can also contribute, though their effect is generally much smaller than moderate-to-heavy rain.

How Engineers Fight Rain Fade

Satellite and wireless system designers use a toolbox of techniques to keep services running during storms. These fall into two broad categories: boosting the signal power, and making the signal more resilient.

On the power side, the simplest approach is building in extra signal strength, called a “rain margin,” so that the link can absorb several decibels of loss before the receiver loses lock. For Ku-band satellite systems, a margin of 3 to 5 dB is typically enough to guarantee service availability 99.9% of the year, which translates to a maximum of about 8.8 hours of rain-related outage per year. More sophisticated systems use power control, where the satellite or ground station temporarily increases transmit power when it detects that rain is weakening the link.
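The availability-to-outage arithmetic behind that 99.9% figure is simple: the allowed downtime is the fraction of a year the link may fall below threshold.

```python
def annual_outage_hours(availability_percent):
    """Hours per year a link may be down at a given availability target."""
    hours_per_year = 365 * 24  # 8760
    return hours_per_year * (1 - availability_percent / 100)

# 99.9% availability allows about 8.76 hours of outage per year,
# matching the figure quoted above. Each extra "nine" costs roughly
# a tenfold reduction in allowed downtime.
```

Going from 99.9% to 99.99% availability shrinks the budget from about 8.8 hours to under an hour per year, which is why higher availability targets demand much larger rain margins.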

Site diversity is another power-side technique: ground stations are placed far enough apart that a localized storm affecting one site is unlikely to affect the other simultaneously. The system routes traffic to whichever station has the clearer path at any given moment.
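The benefit of site diversity is easiest to see probabilistically. If each site alone is rained out some small fraction of the time, and the sites were far enough apart for their weather to be independent, the chance of both being down at once is the product of the two probabilities. A sketch under that idealized independence assumption:

```python
def joint_outage_probability(p_site_a, p_site_b):
    """Probability both ground stations are rained out simultaneously,
    under the idealized assumption that their weather is independent."""
    return p_site_a * p_site_b

# Two sites, each unavailable 0.1% of the time, would jointly be down
# only 0.0001% of the time if truly independent. Real sites share some
# weather correlation, so the actual gain is smaller but still large.
```

This is why diversity sites are separated by tens of kilometers: far enough that a single storm cell rarely covers both.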

On the signal resilience side, adaptive coded modulation is the most widely used modern technique. When the receiver detects signal degradation, it automatically switches to a more robust (though slower) encoding scheme that can tolerate more noise and interference. You might notice your satellite internet speed drop during a storm but keep working. That’s adaptive modulation doing its job, trading throughput for reliability in real time. Other approaches include adaptive error correction, which adds more redundancy to the data stream during fades, and bandwidth reduction, which temporarily narrows the signal to concentrate its energy.
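The core of adaptive coded modulation is a lookup from measured signal quality to the fastest scheme that still decodes reliably. The thresholds and scheme names below are hypothetical, chosen only to illustrate the trade: lower signal-to-noise ratio means a more robust but slower encoding.

```python
def pick_modcod(snr_db):
    """Choose a modulation/coding scheme from the measured SNR.

    The thresholds, scheme names, and throughput ratios here are
    hypothetical illustrations, not values from any real standard.
    """
    table = [                    # (minimum SNR dB, scheme, relative throughput)
        (12.0, "16APSK 3/4", 1.0),
        (6.0,  "QPSK 3/4",   0.5),
        (2.0,  "QPSK 1/4",   0.17),
    ]
    for min_snr, scheme, throughput in table:
        if snr_db >= min_snr:
            return scheme, throughput
    return None, 0.0             # below every threshold: the link drops

# Clear sky yields the fast scheme; a rain fade that cuts SNR pushes
# the link down the table, trading throughput for reliability.
```

In a real system the receiver reports its signal quality back to the transmitter continuously, so the scheme tracks the fade as it deepens and recedes.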

What You Can Do on Your End

If rain fade regularly disrupts your satellite service, a few practical factors are worth checking. Your dish alignment matters: even a slight misalignment that’s tolerable in clear weather can push you below the decoding threshold during rain. A professional realignment can help. The dish surface itself should be clean and free of corrosion, since any degradation in the reflector reduces the signal reaching your receiver.

Dish size also plays a role. A larger dish collects more signal energy, giving you a bigger effective rain margin. Some satellite internet providers offer larger dish options for customers in high-rainfall areas. If you’re in a tropical or subtropical region where heavy afternoon thunderstorms are routine, this upgrade can meaningfully reduce the number of outages you experience.
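The gain of a parabolic dish scales with its aperture area, so doubling the diameter quadruples the collected energy. In decibels, the improvement from a larger dish works out to 20 × log₁₀ of the diameter ratio. A quick sketch with hypothetical dish sizes:

```python
import math

def dish_gain_increase_db(diameter_old_m, diameter_new_m):
    """Extra antenna gain (dB) from a larger parabolic dish.

    Gain scales with aperture area (diameter squared), so the change
    in dB is 20 * log10(D_new / D_old).
    """
    return 20 * math.log10(diameter_new_m / diameter_old_m)

# Upgrading from a 0.6 m to a 0.9 m dish (hypothetical sizes) adds
# roughly 3.5 dB of margin, comparable to the 3 to 5 dB Ku-band
# rain margin discussed earlier.
```

In other words, a modestly larger dish can buy back roughly as much margin as the entire rain allowance designed into a typical Ku-band link.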

Water pooling on the dish surface or accumulating on the feedhorn (the small antenna at the dish’s focal point) can itself cause additional signal loss on top of the atmospheric rain fade. Some installations include feedhorn covers or hydrophobic coatings to reduce this effect.