An interlaced refresh rate describes how many times per second a display draws half of its image. Instead of refreshing every pixel in one pass, interlaced displays split each frame into two “fields,” one containing the odd-numbered lines and the other containing the even-numbered lines. A signal labeled 1080i at 60 Hz, for example, delivers 60 fields per second but only 30 complete frames, because it takes two fields to build one full picture.
How Interlaced Scanning Works
Every image on a screen is made up of horizontal lines stacked from top to bottom. In a progressive system (like 1080p), the display draws all of those lines in a single pass, refreshing the entire image at once. Interlaced scanning takes a different approach: it draws the odd lines first (1, 3, 5, 7…) in one pass, then comes back and fills in the even lines (2, 4, 6, 8…) in a second pass. Each of those half-image passes is called a field.
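The odd/even split described above can be sketched in a few lines of Python. This is a toy model, not real video code: a frame is treated as a simple list of scan lines, with `frame[0]` standing in for line 1.

```python
# Toy model: splitting one frame into its two interlaced fields.
# frame[0] is line 1, frame[1] is line 2, matching the 1-based
# line numbering used in broadcast terminology.

def split_into_fields(frame):
    """Return (odd_field, even_field) from a list of scan lines."""
    odd_field = frame[0::2]   # lines 1, 3, 5, ...
    even_field = frame[1::2]  # lines 2, 4, 6, ...
    return odd_field, even_field

# A six-line "frame" where each line is just a label.
frame = [f"line {n}" for n in range(1, 7)]
odd, even = split_into_fields(frame)
# odd  -> ['line 1', 'line 3', 'line 5']
# even -> ['line 2', 'line 4', 'line 6']
```

Transmitting `odd` in one pass and `even` in the next is, in miniature, exactly what an interlaced signal does.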
Because each field is captured at a slightly different moment in time, the two halves of a single frame don’t represent the exact same instant. This matters when something on screen is moving quickly, but it also creates the illusion of smoother motion. Your eye sees 60 updates per second (in the US standard) even though only 30 of those are complete pictures. European systems use a 50 Hz field rate, producing 25 full frames per second.
Why Interlacing Was Invented
Interlacing was a bandwidth compromise. Early analog television systems in the 1940s and 1950s couldn’t transmit enough data to refresh every line of the picture 60 times per second. Interlacing solved the problem elegantly: it doubled the perceived refresh rate while using no more bandwidth than a progressive scan at half the frame rate. In practical terms, each field carries half the lines of a full frame, so 60 fields per second consume the same bandwidth as 30 progressive frames per second, half of what 60 full progressive frames would require. This meant broadcasters could deliver a smooth, relatively flicker-free picture over limited airwaves without doubling the required signal capacity.
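The bandwidth arithmetic can be made concrete with a rough, illustrative calculation in Python. It counts only scan lines per second; real analog bandwidth also depends on horizontal resolution, blanking intervals, and color encoding.

```python
# Back-of-the-envelope line budget (illustrative, not a real
# broadcast bandwidth calculation).

LINES_PER_FRAME = 1080

progressive_60 = LINES_PER_FRAME * 60        # full frame, 60x/sec
progressive_30 = LINES_PER_FRAME * 30        # full frame, 30x/sec
interlaced_60 = (LINES_PER_FRAME // 2) * 60  # half-frame field, 60x/sec

# Interlacing at 60 fields/sec moves exactly as many lines as
# progressive at 30 frames/sec, and half as many as 60p.
print(interlaced_60 == progressive_30)   # True
print(interlaced_60 * 2 == progressive_60)  # True
```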
Field Rate vs. Frame Rate
This is where the terminology gets confusing. When you see “1080i/60,” the 60 refers to the field rate, not the frame rate. The actual frame rate is 30, because two fields combine to form one frame. Compare that to 1080p/60, where the display draws all 1,080 lines sixty times every second.
So a 1080i signal at 60 Hz and a 1080p signal at 30 Hz deliver the same number of complete frames per second. The difference is that the interlaced version feels smoother to the eye because it updates half the screen twice as often. A 1080p signal at 60 Hz, on the other hand, genuinely refreshes every pixel 60 times per second, delivering both smoothness and full-frame clarity.
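The field-rate/frame-rate relationship boils down to a single division. A small Python helper, using the figures from the paragraphs above:

```python
def frames_per_second(field_or_frame_rate, interlaced):
    """Complete pictures per second for a given signal.

    For interlaced signals the advertised rate is the field rate,
    and two fields combine into one complete frame.
    """
    return field_or_frame_rate / 2 if interlaced else field_or_frame_rate

# "1080i/60": 60 fields/sec -> 30 complete frames/sec
print(frames_per_second(60, interlaced=True))    # 30.0
# "1080p/60": every refresh is a full frame
print(frames_per_second(60, interlaced=False))   # 60
# European 50 Hz field rate -> 25 complete frames/sec
print(frames_per_second(50, interlaced=True))    # 25.0
```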
Visible Artifacts in Interlaced Video
Because each field captures a different slice of time, fast-moving objects can shift position between the odd and even fields. When those two fields are combined into a single frame, the result is a “combing” effect: horizontal lines that look like tiny teeth along the edges of anything that moved. You’ve probably seen this on older sports broadcasts or news tickers, where edges appear jagged or torn.
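Combing can be demonstrated with a toy Python model: a one-pixel-wide object moves to the right between the two field captures, so when the fields are woven back into one frame, the odd and even rows disagree about where it is. The widths and positions below are arbitrary.

```python
# Toy illustration of combing on a moving object.

WIDTH = 8

def row_with_object(x):
    """One scan line: '#' marks the object, '.' is background."""
    return "".join("#" if i == x else "." for i in range(WIDTH))

# Odd field captured with the object at x=2; by the time the even
# field is captured, the object has moved to x=4.
odd_rows = [row_with_object(2)] * 3   # lines 1, 3, 5
even_rows = [row_with_object(4)] * 3  # lines 2, 4, 6

# Weave the two fields into one frame.
frame = []
for o, e in zip(odd_rows, even_rows):
    frame.extend([o, e])

for line in frame:
    print(line)
# ..#.....
# ....#...
# ..#.....
# ....#...   <- alternating edges: the "comb teeth"
# ..#.....
# ....#...
```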
A subtler artifact is called interline twitter. This happens when fine detail exists on one set of lines but is completely absent from the other set. The detail appears to flicker or shimmer because it shows up in one field and vanishes in the next. Interline twitter is a normal consequence of interlacing and is one reason interlaced video has an effective vertical resolution of only about 70% of its total line count. A 1080i signal has 1,080 lines on paper, but during motion or with fine detail, the usable vertical resolution drops closer to roughly 756 lines.
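The arithmetic behind that figure is straightforward. The ~70% reduction is often associated with the Kell factor (an assumption worth noting: the exact percentage varies by source and content), and a quick Python check reproduces the number cited above:

```python
# Effective vertical resolution under the ~70% factor cited above
# (often associated with the Kell factor; the exact value varies).
EFFECTIVE_FACTOR = 0.7
nominal_lines = 1080

effective_lines = round(nominal_lines * EFFECTIVE_FACTOR)
print(effective_lines)  # 756
```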
How Modern Displays Handle Interlaced Signals
Flat-panel TVs, computer monitors, and phone screens are all progressive displays. They draw every line in every refresh cycle. When they receive an interlaced signal, they have to convert it to progressive through a process called deinterlacing. There are two basic approaches, and most modern displays use a combination of both.
The first is called “bob” deinterlacing. It takes each field and stretches it to fill the full screen, either duplicating lines or averaging neighboring lines to fill the gaps. This produces a full-screen image 60 times per second, preserving smooth motion, but it cuts vertical resolution in half and can create jagged edges or a subtle flickering effect as the interpolated lines alternate.
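A minimal Python sketch of bob deinterlacing with line averaging, assuming a field is a list of rows of grayscale pixel values (real implementations work on full video planes, often in hardware):

```python
# Sketch of "bob" deinterlacing: expand one field to full height
# by interpolating the missing lines between its real lines.

def bob_deinterlace(field):
    """Expand one field to a full-height frame by line averaging."""
    frame = []
    for i, row in enumerate(field):
        frame.append(row)
        if i + 1 < len(field):
            nxt = field[i + 1]
            # Missing line: average of the rows above and below.
            frame.append([(a + b) // 2 for a, b in zip(row, nxt)])
        else:
            frame.append(row[:])  # bottom edge: duplicate the last line
    return frame

field = [[0, 0], [100, 100], [200, 200]]
full = bob_deinterlace(field)
# full -> [[0, 0], [50, 50], [100, 100], [150, 150], [200, 200], [200, 200]]
```

Note how the output has twice the height of the input but no new information: the interpolated rows are guesses, which is the resolution cost described above.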
The second is “weave” deinterlacing, which simply stitches the two fields together into one frame. This preserves full resolution for static scenes but produces the combing artifacts described above whenever anything moves. More advanced deinterlacers use motion-adaptive processing: they detect which parts of the image are still and which are moving, then apply weave to the still areas and bob to the moving areas. Edge-directional methods go further by estimating the angle of edges and interpolating along them to reduce jaggedness.
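Weave, plus a crude motion-adaptive switch, can be sketched the same way. The motion detector here is a naive whole-field absolute difference with an arbitrary threshold; real processors detect motion per region and blend the two strategies far more carefully.

```python
# Sketch of "weave" and a simplistic motion-adaptive fallback,
# assuming fields are lists of rows of grayscale pixel values.

def weave(odd_field, even_field):
    """Interleave two fields into one full-resolution frame."""
    frame = []
    for o, e in zip(odd_field, even_field):
        frame.extend([o, e])
    return frame

def motion_adaptive(prev_odd, odd_field, even_field, threshold=10):
    """Weave when the scene is still; bob the newest field when it moves.

    Motion is estimated (very crudely) as the total pixel difference
    between the current odd field and the previous one.
    """
    motion = sum(
        abs(a - b)
        for ra, rb in zip(prev_odd, odd_field)
        for a, b in zip(ra, rb)
    )
    if motion <= threshold:
        return weave(odd_field, even_field)  # still: full resolution
    # Moving: duplicate each line of the even field (simple bob),
    # trading resolution for a comb-free image.
    frame = []
    for row in even_field:
        frame.extend([row, row])
    return frame
```

With identical odd fields the function weaves; with a large difference it bobs, which is the weave-where-still, bob-where-moving idea in miniature.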
The quality of deinterlacing varies widely between devices. A high-end TV with a good video processor will handle 1080i content with minimal visible artifacts. A cheap monitor or media player may produce noticeable combing or shimmer, especially during sports or action scenes.
1080i vs. 1080p in Practice
Both formats contain 1,080 horizontal lines, but the viewing experience differs. In scenes with little movement, 1080i and 1080p at the same frame rate look nearly identical. The gap widens with motion. A 1080p signal at 60 frames per second delivers every line of every frame in a single pass, so fast-moving objects stay sharp. A 1080i signal at 60 fields per second splits the image across two time slices, so those same objects can appear blurry, combed, or glitchy.
For static or slow-moving content like news desks, talk shows, or slideshows, 1080i is perfectly adequate. For sports, gaming, or action-heavy video, 1080p provides noticeably cleaner motion.
Where Interlaced Video Still Exists
Much over-the-air HD broadcasting in the United States still uses 1080i (other networks use 720p), and cable and satellite providers pass those signals through as well. However, the next generation of broadcast standards focuses entirely on progressive scanning. The ATSC 3.0 standard (marketed as NextGen TV) supports Ultra HD services using modern compression codecs like HEVC and VVC, and its framework is built around progressive delivery. Streaming services, Blu-ray discs, gaming consoles, and computer displays all use progressive scan exclusively.
Interlacing remains relevant mainly for legacy broadcast content and for anyone converting older analog video (VHS tapes, standard-definition DVDs, archived TV footage). If you’re buying a new display or choosing settings on a streaming service, progressive is always the better option when available. The bandwidth savings that made interlacing necessary in the 1940s simply don’t apply to modern digital systems.