What Is Aliasing in Audio, Video, and Medical Imaging

Aliasing is a distortion that happens when a system tries to capture a continuous signal (like sound, light, or motion) by taking periodic snapshots, but doesn’t take those snapshots fast enough. The result is a false version of the original: jagged edges in images, phantom tones in audio, or wheels that appear to spin backward on film. It shows up anywhere a continuous, real-world phenomenon gets converted into discrete digital data.

The core principle is simple. To accurately capture any signal, you need to sample it at more than twice the rate of the highest frequency present. This threshold is called the Nyquist rate. Fall below it, and the data you collect doesn’t just lose detail. It actively creates patterns that weren’t there in the original.

The Sampling Problem Behind Aliasing

Think of sampling as taking snapshots at regular intervals. If something changes slowly relative to how often you’re snapping, you capture it accurately. But if it changes quickly, your snapshots miss entire cycles, and the reconstructed version looks nothing like the original.

The Nyquist–Shannon sampling theorem puts a hard number on this: your sampling rate must be strictly greater than twice the highest frequency in the signal. For a signal that cycles 100 times per second, you need more than 200 samples per second. Exactly twice isn’t enough, because at that rate the motion becomes ambiguous. You can’t tell if the signal is going “up” or “down” between samples.

When you violate this rule, something specific happens. High-frequency components don’t just disappear. They fold down into lower frequencies, producing signals that genuinely weren’t in the original. This folding is what makes aliasing so deceptive: the resulting artifact looks or sounds like legitimate data, not like random noise or a gap.
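The folding behavior can be sketched numerically. The helper below is a minimal illustration (not part of any standard library): it folds an input frequency into the band below half the sampling rate, which is where every sampled tone must land.

```python
def alias_frequency(f, fs):
    """Frequency (Hz) at which a tone of f Hz appears when sampled at
    fs Hz: fold f into the representable band [0, fs/2]."""
    f = f % fs            # whole multiples of the sampling rate vanish
    if f > fs / 2:        # the upper half of the band reflects downward
        f = fs - f
    return f

print(alias_frequency(100, 1000))   # below Nyquist: unchanged -> 100
print(alias_frequency(700, 1000))   # folds down -> 300
```

Note that the folded result is a perfectly ordinary in-band frequency, which is why the artifact is indistinguishable from real data after the fact.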

Aliasing in Audio

In digital audio, aliasing turns inaudible frequencies into audible ones. Here’s a concrete example: if an audio signal contains an ultrasonic tone at 45 kHz (well above human hearing) and you sample it at 44.75 kHz, the resulting recording contains a 250 Hz tone, close to the B just below middle C on a piano. That tone was never in the original sound. It was created entirely by undersampling.

Without protection against this, digital music recording would be unusable. Any ultrasonic harmonics from instruments or microphones would fold down into the audible range, producing phantom pitches layered on top of the actual music. This is why every analog-to-digital converter in a recording chain includes a filter that strips out frequencies above half the sampling rate before the signal is digitized. CD-quality audio samples at 44,100 Hz, so the filter removes everything above roughly 22,050 Hz, which conveniently sits just above the upper limit of human hearing.
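The 45 kHz example is easy to verify numerically. The sketch below (using NumPy) samples both the ultrasonic tone and a 250 Hz tone at 44,750 Hz; the two sequences of samples come out identical, which is exactly why the recorder cannot tell them apart:

```python
import numpy as np

fs = 44_750                    # sampling rate from the example above (Hz)
t = np.arange(2048) / fs       # 2048 sample instants

ultrasonic = np.sin(2 * np.pi * 45_000 * t)   # 45 kHz tone, above Nyquist
audible    = np.sin(2 * np.pi * 250 * t)      # 250 Hz alias

print(np.allclose(ultrasonic, audible))       # True: sample-for-sample identical
```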

Aliasing in Images and Video

Visual aliasing takes several forms, depending on whether you’re dealing with still images, screens, or moving footage.

Jagged Edges

The most familiar form is “jaggies,” the staircase pattern along diagonal or curved lines in digital images. A pixel grid is a spatial sampling system. When a smooth edge falls between pixel boundaries, the system is forced to round each point to the nearest pixel, creating a zigzag pattern instead of a smooth line. This is why anti-aliasing in video games and graphics software exists: it blends the colors of edge pixels to simulate the smoothness that the pixel grid can’t directly represent.
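The staircase itself is just rounding. A minimal sketch, tracing a shallow line across a hypothetical 12-pixel-wide grid:

```python
# Trace a shallow diagonal line (slope 1/4) across a 12-pixel-wide grid,
# rounding each point to the nearest pixel row.
width, slope = 12, 0.25
rows = [int(x * slope + 0.5) for x in range(width)]
print(rows)   # -> [0, 0, 1, 1, 1, 1, 2, 2, 2, 2, 3, 3]
```

An anti-aliased rasterizer would instead give each edge pixel a brightness proportional to how much of it the line covers, blending the steps away.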

Moiré Patterns

Moiré patterns are the swirling, wavy interference you sometimes see when photographing a brick wall, a fabric pattern, or a screen. They appear when a repetitive pattern in the real world has a frequency close to the sampling frequency of the sensor or display. Near the Nyquist limit, strange new patterns emerge that exist in neither the original scene nor the sensor grid. They’re pure artifacts of the interaction between the two frequencies.
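A one-dimensional sketch shows the same interaction: sampling stripes whose period is slightly longer than the pixel pitch produces a slow beat that belongs to neither pattern. The specific numbers here are illustrative.

```python
import numpy as np

sensor_pitch = 1.0        # distance between pixel samples (arbitrary units)
stripe_period = 1.05      # stripes just slightly wider than one pixel

x = np.arange(210) * sensor_pitch
sampled = np.cos(2 * np.pi * x / stripe_period)

# The sampled stripes vary slowly, repeating every 21 pixels -- a period
# present in neither the stripes nor the pixel grid.
spec = np.abs(np.fft.rfft(sampled))
print(spec[1:].argmax() + 1)   # dominant bin: 10, i.e. 210/10 = 21 pixels/cycle
```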

This problem is old enough that television performers have long avoided wearing striped clothing on camera, especially horizontal stripes. Analog TV built images using horizontal scan lines, effectively sampling the image in the vertical direction. Stripes at the wrong spacing created visible moiré interference. The print industry faces the same challenge, since printed images are made of ink dots, and those dots are themselves a form of spatial sampling.

The Wagon-Wheel Effect

Temporal aliasing is what makes wheels appear to spin backward in movies and under fluorescent lights. A camera shooting at a fixed frame rate is sampling motion over time. If a wheel spoke moves nearly one full spoke-width between frames, it looks like it moved slightly backward instead. At exactly one spoke-width per frame, the wheel appears frozen.

A UC Davis teaching example makes this intuitive: imagine photographing a clock hand that completes one revolution every 60 minutes. If you take a photo every 5 minutes, you accurately track the hand’s clockwise motion. But if you take a photo every 55 minutes, the hand appears to move counterclockwise, because it’s nearly completed a full revolution between each shot. The Nyquist theorem predicts exactly this. One cycle per 60 minutes requires a sampling rate greater than 2 frames per 60 minutes. Every 5 minutes works; every 55 minutes does not.
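The clock example reduces to a small calculation: fold the true per-frame rotation into the range the eye interprets as the nearest apparent motion. A sketch:

```python
def apparent_step(true_step_deg):
    """Fold a per-frame rotation into [-180, 180) degrees -- the smallest
    apparent motion consistent with two consecutive snapshots."""
    return (true_step_deg + 180) % 360 - 180

# Clock hand: 360 degrees per 60 minutes.
print(apparent_step(360 * 5 / 60))    # every 5 min:  30.0 deg clockwise
print(apparent_step(360 * 55 / 60))   # every 55 min: -30.0 deg (backward!)
```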

Aliasing in Medical Imaging

Aliasing isn’t just a media quality issue. In MRI scans, it produces what radiologists call “wrap-around” artifacts. When body structures extend beyond the scanner’s selected field of view, those structures don’t simply get cropped. They appear mapped onto the opposite side of the image, overlapping with anatomy that’s actually there. An MRI of the brain, for instance, might show tissue from one side of the head ghosted onto the other side if the field of view is too narrow. Technicians fix this by widening the field of view or applying filters, but if the artifact goes unnoticed, it can obscure diagnostic details.
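The wrap-around behaves like modular arithmetic on position. A deliberately simplified one-dimensional model (real MRI encodes position in signal phase, which is where the modulo comes from):

```python
def wrapped_position(x, fov):
    """Simplified 1-D model: where a structure at position x appears in
    an image whose field of view spans [0, fov)."""
    return x % fov

fov = 20.0                            # field of view in cm (illustrative)
print(wrapped_position(8.0, fov))     # inside the FOV: appears at 8.0
print(wrapped_position(23.0, fov))    # 3 cm past the edge: wraps to 3.0
```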

How Anti-Aliasing Filters Work

The universal fix for aliasing is to remove problematic frequencies before sampling occurs. Most analog-to-digital converters are preceded by a low-pass filter, a circuit that allows frequencies below a chosen cutoff to pass through while blocking higher ones. By stripping out everything above half the sampling rate, the filter ensures that no frequency in the incoming signal can violate the Nyquist limit. The aliasing simply never gets a chance to happen.
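A minimal digital analogue of such a filter is a windowed-sinc low-pass. The sketch below (illustrative parameters, not a production filter design) builds one and applies it to a signal containing a 50 Hz tone and a 400 Hz tone; only the low tone survives:

```python
import numpy as np

fs = 1000                               # sampling rate (Hz), illustrative
t = np.arange(1000) / fs
signal = np.sin(2 * np.pi * 50 * t) + np.sin(2 * np.pi * 400 * t)

# Windowed-sinc low-pass with a 100 Hz cutoff.
cutoff, ntaps = 100, 101
k = np.arange(ntaps) - ntaps // 2
h = np.sinc(2 * cutoff / fs * k) * np.hamming(ntaps)
h /= h.sum()                            # normalize for unity gain at DC

filtered = np.convolve(signal, h, mode="same")
# The 400 Hz tone is attenuated by roughly 50 dB; the 50 Hz tone passes
# almost untouched, so the result can safely be resampled at a lower rate.
```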

In audio, this filter sits between the microphone and the digital recorder. In digital cameras, a thin optical filter in front of the sensor slightly blurs the image to eliminate spatial frequencies the pixel grid can’t resolve. Some high-end cameras omit this filter for sharper images, accepting the risk of moiré in certain scenes. In graphics rendering, anti-aliasing algorithms work after the fact, using techniques like supersampling (rendering at a higher resolution, then averaging down) to smooth out jagged edges.
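Supersampling’s average-down step is simple to sketch. Assuming a 2× render, each 2×2 block of high-resolution samples becomes one output pixel:

```python
import numpy as np

def downsample_2x2(img):
    """Average each 2x2 block of a supersampled image into one pixel."""
    h, w = img.shape
    return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# A hard black/white edge rendered at 2x resolution...
hi_res = np.array([
    [0, 0, 1, 1],
    [0, 1, 1, 1],
    [0, 0, 1, 1],
    [0, 1, 1, 1],
], dtype=float)

# ...becomes a softened edge at final resolution, with intermediate grey
# where the edge crossed a pixel: values [[0.25, 1.0], [0.25, 1.0]].
print(downsample_2x2(hi_res))
```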

The tradeoff is always the same: filtering removes real information along with the problematic frequencies. A sharper low-pass filter preserves more of the signal you want but is harder to build. A gentler filter is simpler but sacrifices more usable data. Getting this balance right is a core design challenge in any system that converts analog signals to digital.