Interlacing is a video display technique that builds each frame of video from two separate passes, or “fields,” instead of drawing every line at once. The first field draws all the odd-numbered lines on screen (1, 3, 5, 7…), and the second field fills in the even-numbered lines (2, 4, 6, 8…). These two fields combine to create one complete picture. It was the standard way television worked for decades, and it still shows up in some broadcast signals today.
How Interlacing Works
A standard interlaced video signal refreshes the screen 60 times per second in the US (strictly 59.94, under the NTSC standard) or 50 times per second in Europe (PAL), but each refresh only updates half the image. On one pass, the display draws odd lines. On the next pass, it draws even lines. Two consecutive fields, taken together, make up a single frame. So while the field rate is 60 per second, the actual frame rate is 30 complete pictures per second.
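The field-to-frame assembly described above can be sketched in a few lines of code. This is a toy illustration, not a real video pipeline: each "line" is just a string, and the `weave` function is a hypothetical name for the interleaving step.

```python
def weave(odd_field, even_field):
    """Interleave an odd-line field and an even-line field into one frame."""
    frame = []
    for odd_line, even_line in zip(odd_field, even_field):
        frame.append(odd_line)   # lines 1, 3, 5, ...
        frame.append(even_line)  # lines 2, 4, 6, ...
    return frame

# A tiny 4-line "image": field 1 carries lines 1 and 3, field 2 carries 2 and 4.
field1 = ["line 1", "line 3"]
field2 = ["line 2", "line 4"]
print(weave(field1, field2))  # ['line 1', 'line 2', 'line 3', 'line 4']
```

Two half-height fields come in; one full-height frame comes out, which is exactly the 2:1 trade the next section explains.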
This happens fast enough that your eye perceives smooth, continuous motion rather than two flickering half-images. The brain blends the two fields together, and the result looks like a full picture updating at a high refresh rate.
Why It Was Invented
Early television engineers faced a tough constraint: the available broadcast bandwidth couldn’t carry a full picture refreshed 60 times per second. But refreshing at only 30 full frames per second produced noticeable, annoying flicker on screen. Interlacing was the compromise. By sending half the image 60 times a second instead of the whole image 30 times a second, broadcasters got the smooth appearance of a 60 Hz refresh while using roughly half the bandwidth that a full progressive signal would require. It was an elegant trick that made broadcast television viable with 1940s and 1950s technology.
Interlaced vs. Progressive Scan
The alternative to interlacing is progressive scanning, where every line of the image is drawn in a single pass, top to bottom. A progressive signal at 60 frames per second delivers 60 complete pictures every second, while an interlaced signal at the same field rate delivers only 30 complete frames (each assembled from two fields). Progressive scan produces a sharper, cleaner image, especially during fast motion, but it requires roughly twice the bandwidth or data to transmit.
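The "roughly twice the bandwidth" claim is easy to verify with back-of-the-envelope arithmetic. The sketch below counts raw lines of picture per second for a 1080-line signal, ignoring compression, color encoding, and blanking intervals (all simplifying assumptions):

```python
# Raw pixel throughput, luma only -- an illustrative simplification.
WIDTH, HEIGHT = 1920, 1080
LINES_PER_FIELD = HEIGHT // 2  # each field carries half the lines

progressive_60p = WIDTH * HEIGHT * 60           # 60 full frames per second
interlaced_60i = WIDTH * LINES_PER_FIELD * 60   # 60 half-height fields per second

print(progressive_60p / interlaced_60i)  # 2.0 -- progressive moves twice the pixels
```

Real broadcast systems compress both formats heavily, so the practical gap is smaller, but the 2:1 ratio in raw picture data is where the bandwidth savings come from.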
You can usually tell which format you’re looking at from the label. “1080i” means 1,080 lines of resolution, interlaced. “1080p” means 1,080 lines, progressive. “720p” is 720 lines, progressive. The “i” and “p” designations are the giveaway.
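The label convention is regular enough to parse mechanically. The function below is a hypothetical helper, not part of any standard library, that splits a label like "1080i" into its line count and scan type:

```python
import re

def parse_format(label):
    """Split a label like '1080i' or '720p' into (line count, scan type)."""
    match = re.fullmatch(r"(\d+)([ip])", label)
    if not match:
        raise ValueError(f"unrecognized format label: {label}")
    lines = int(match.group(1))
    scan = "interlaced" if match.group(2) == "i" else "progressive"
    return lines, scan

print(parse_format("1080i"))  # (1080, 'interlaced')
print(parse_format("720p"))   # (720, 'progressive')
```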
The Combing Effect and Other Artifacts
Because the two fields in an interlaced frame are captured at slightly different moments in time, fast-moving objects can appear in different positions in each field. When both fields are displayed together (or when interlaced footage is converted to a progressive format without proper processing), this mismatch creates a distinctive visual glitch called “combing.” It looks like horizontal lines or teeth along the edges of anything that moved between the two fields, similar to the teeth of a comb.
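A small simulation makes the comb visible. Below, a vertical bar sits at one column when the first field is captured, then moves before the second field is captured; weaving the two fields produces alternating rows that disagree. This is a toy model (8-character rows, `#` for the bar), purely for illustration:

```python
def capture_field(x, rows):
    """Render a 1-pixel-wide vertical bar at column x, on the given number of rows."""
    return ["." * x + "#" + "." * (7 - x) for _ in range(rows)]

# The bar is at column 2 when field 1 is captured, but has moved to
# column 5 by the time field 2 is captured a fraction of a second later.
field1 = capture_field(2, 2)  # odd lines
field2 = capture_field(5, 2)  # even lines

frame = []
for odd, even in zip(field1, field2):
    frame.extend([odd, even])

for row in frame:
    print(row)
# ..#.....   <- odd line: old position
# .....#..   <- even line: new position
# ..#.....      alternating rows disagree: the "comb"
# .....#..
```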
Interlaced video can also produce interline flicker, where fine horizontal details (like thin text or narrow stripes) flicker because they only appear in one field and vanish in the next. Vertical jitter is another artifact: when interlaced fields are converted to progressive frames one field at a time, edges can appear to bounce up and down by one scan line from field to field. These artifacts are most visible on large, high-resolution screens where individual pixel-level imperfections are easier to spot.
How Modern Displays Handle It
Modern flat-panel TVs, whether LCD or OLED, are natively progressive displays. They draw every frame as a single complete image rather than as alternating fields, so they cannot display an interlaced signal in its original form. Instead, they run the signal through a process called deinterlacing, which converts the two alternating fields into a single progressive frame before putting anything on screen. The quality of this conversion varies by display. Good deinterlacing algorithms minimize combing and preserve detail, while cheaper processing can introduce softness or motion artifacts.
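The simplest deinterlacing strategy, often called "bob," builds a full frame from a single field by doubling each line. The sketch below shows the idea on a toy field of labeled lines; real deinterlacers interpolate between lines or use motion compensation rather than crude duplication:

```python
def bob_deinterlace(field):
    """Build a full progressive frame from one field by doubling each line.
    Crude on purpose: real deinterlacers interpolate or track motion."""
    frame = []
    for line in field:
        frame.append(line)
        frame.append(line)  # duplicate to fill the missing scan line
    return frame

field = ["A", "B", "C"]
print(bob_deinterlace(field))  # ['A', 'A', 'B', 'B', 'C', 'C']
```

Line doubling avoids combing entirely (each output frame uses only one moment in time) at the cost of halving vertical resolution, which is the softness trade-off mentioned above.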
This is one reason why progressive content generally looks better on a modern TV. The signal arrives in the format the display already uses, so no conversion step is needed.
Where Interlacing Still Exists
Interlacing has largely faded from consumer technology, but it hasn’t disappeared entirely. In the United States, several cable and over-the-air broadcasters still transmit in 1080i. Many local TV stations and subchannels broadcast in 480i or 720p, with 1080i as the higher-end option on cable. Major broadcast networks chose either 720p or 1080i as their HD standard years ago, and some haven’t upgraded since.
If you’re watching live sports, news, or network TV through an antenna or basic cable package, there’s a reasonable chance the signal reaching your TV is interlaced. Your television handles the conversion silently, so you may never notice unless you check the signal info in your TV’s settings menu.
Interlacing in Video Editing
If you work with video, interlacing matters when you’re mixing footage from different sources. Importing interlaced clips into a progressive timeline without deinterlacing them first is the most common way people encounter combing artifacts. Most modern editing software can handle the conversion automatically, but you need to tell it the footage is interlaced so it applies the right processing. Exporting interlaced footage as progressive without enabling the deinterlace filter will bake those combing lines permanently into your final video.
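When footage is unlabeled, one common heuristic for spotting baked-in interlacing is to measure "comb energy": if adjacent rows differ far more than rows two apart, the frame likely contains woven fields from different moments. The sketch below applies that idea to a toy grayscale image stored as lists of pixel values; `comb_energy` is a hypothetical name, and real tools use more robust per-block statistics.

```python
def comb_energy(frame):
    """Score how comb-like a frame is. frame is a list of rows,
    each a list of pixel values (a toy grayscale image)."""
    adjacent = sum(abs(a - b)                      # rows 1 apart
                   for r1, r2 in zip(frame, frame[1:])
                   for a, b in zip(r1, r2))
    skip_one = sum(abs(a - b)                      # rows 2 apart
                   for r1, r2 in zip(frame, frame[2:])
                   for a, b in zip(r1, r2))
    return adjacent / (skip_one + 1)  # +1 avoids division by zero

progressive = [[0, 0], [0, 0], [9, 9], [9, 9]]  # smooth vertical change
combed = [[0, 0], [9, 9], [0, 0], [9, 9]]       # alternating rows disagree
print(comb_energy(combed) > comb_energy(progressive))  # True
```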
For anyone shooting new footage today, progressive is the default on virtually all consumer and professional cameras. You’re unlikely to encounter interlaced source material unless you’re digitizing old tapes, working with broadcast feeds, or pulling from older security camera systems.

