Plasma TVs didn’t fail because of picture quality. They actually produced some of the best images of any flat-panel technology. Plasma failed because LCD panels got dramatically cheaper to manufacture while plasma couldn’t keep up, and a combination of higher power consumption, burn-in risk, and shifting consumer priorities sealed the deal. By December 2013, even Panasonic, the last major holdout, confirmed it was ending plasma production.
LCD Manufacturing Costs Collapsed
The single biggest reason plasma lost was economics. LCD panel manufacturing benefited from a scaling advantage that plasma simply couldn’t match. LCD panels are cut from large sheets of “mother glass,” and as factories upgraded to larger sheets, the cost savings were enormous. Going from one generation of glass to the next cut the cost per diagonal inch of display by 50%. Over just a few generations, equipment costs per unit of LCD panel area dropped by 80%.
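The compounding effect of that 50%-per-generation figure is easy to underestimate, so here is a minimal sketch of the math. The halving rate comes from the paragraph above; the starting cost and the number of generations are hypothetical placeholders for illustration.

```python
# Illustrative compounding of the 50% cost cut per mother-glass generation.
# The starting cost (100 units) is a made-up baseline, not a real figure.
base_cost = 100.0  # hypothetical cost per diagonal inch, arbitrary units

for gen in range(5):
    cost = base_cost * (0.5 ** gen)
    print(f"generation jump {gen}: {cost:.1f} units per diagonal inch")

# Two generation jumps already cut the cost to a quarter of the baseline;
# a few more and the cumulative drop passes 90%.
```

Plasma had no comparable curve to ride, which is why the price gap widened rather than stabilized.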
Yields improved dramatically too. Early LCD production lines wasted about half their output, but modern operations eventually hit 90% or better. Every one of those improvements made LCD TVs cheaper at the store. Plasma panels, built from thousands of tiny gas-filled cells sandwiched between glass layers, had no equivalent scaling trick. The manufacturing process was fundamentally more complex and more expensive, and the gap widened every year.
By the late 2000s, a consumer could buy a decent LCD TV for significantly less than a comparable plasma set. Price wins in consumer electronics, almost every time.
Power Consumption Was a Real Problem
Plasma TVs were energy hogs. A 50-inch plasma drew around 300 watts, while an LCD of the same size used roughly 150 watts. At 60 inches, the gap was even worse: 500 watts for plasma versus 200 watts for LCD. That difference adds up on an electricity bill, especially for a device that runs several hours a day.
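To put those wattage figures in dollar terms, here is a back-of-the-envelope annual cost comparison. The 300 W and 150 W numbers come from the paragraph above; the viewing hours and electricity rate are assumptions chosen only to make the arithmetic concrete.

```python
# Rough annual electricity cost for a 50-inch set.
HOURS_PER_DAY = 5      # assumed daily viewing time
RATE_PER_KWH = 0.15    # assumed electricity price, dollars per kWh

def annual_cost(watts: float) -> float:
    """Yearly running cost in dollars for a device drawing `watts`."""
    kwh_per_year = watts / 1000 * HOURS_PER_DAY * 365
    return kwh_per_year * RATE_PER_KWH

plasma = annual_cost(300)  # 50-inch plasma, per the figures above
lcd = annual_cost(150)     # 50-inch LCD
print(f"plasma: ${plasma:.0f}/yr, LCD: ${lcd:.0f}/yr, "
      f"difference: ${plasma - lcd:.0f}/yr")
```

Under these assumptions the plasma set costs roughly twice as much to run, and the gap grows with screen size and daily hours.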
This mattered beyond just cost. As governments introduced stricter energy efficiency standards, plasma was increasingly at a disadvantage. The European Union rolled out ecodesign regulations and energy labeling requirements that made it nearly impossible for power-hungry displays to earn a respectable rating. Under the strictest modern EU rules, even efficient TVs struggle to rate higher than “F” on the label. Plasma would have been essentially unlabelable. Manufacturers saw the regulatory writing on the wall years before the final standards took effect.
Burn-In Scared Buyers Away
Plasma panels use phosphors, compounds that glow when struck by ultraviolet light from electrically excited gas. When the same image sits on screen for too long, those phosphors wear and lose luminosity unevenly. The result is a faint “ghost” of the static image that remains visible no matter what the TV displays afterward. This is burn-in, and plasma panels were especially vulnerable to it because they ran hotter than the older CRT screens that shared the same weakness.
Earlier plasma models, particularly those made before 2007, also suffered from phosphor degradation over time. The screen gradually lost brightness even with normal use. Later generations improved significantly, but the reputation stuck. Consumers worried about leaving a news ticker, video game HUD, or channel logo on screen too long. LCD panels, which use a backlight and liquid-crystal shutters rather than phosphors, had no comparable burn-in risk. That peace of mind was a powerful selling point, even if the average plasma buyer would never actually experience severe burn-in.
Size and Weight Worked Against Plasma
Plasma panels were heavy, thick, and only available in larger screen sizes. You couldn’t buy a plasma TV smaller than about 37 inches, which locked the technology out of the bedroom and kitchen TV market entirely. LCD panels could be made in virtually any size, from small desktop monitors to massive wall displays, giving manufacturers a single technology to scale across their entire product lineup.
Plasma sets were also noticeably heavier, making wall mounting more difficult and shipping more expensive. As TVs got thinner and lighter with each product cycle, plasma couldn’t keep pace with the slim profiles that LCD and later LED-backlit panels achieved.
There was even an altitude limitation that affected a small but real number of buyers. Because plasma cells contain gas sealed at a fixed internal pressure, the panels struggled at high elevations, where the pressure difference against the thinner outside air made them buzz audibly and run less efficiently. Samsung rated its plasmas for a maximum of about 6,900 feet, Panasonic for around 7,200 feet, and LG topped out at roughly 9,500 feet. Anyone living in a mountain city like Breckenridge, Colorado (9,600 feet) or La Paz, Bolivia (nearly 12,000 feet) needed an LCD instead.
LCD Closed the Picture Quality Gap
For years, plasma enthusiasts had a legitimate argument: plasma produced deeper blacks, more natural motion, and wider viewing angles than LCD. Early LCD panels looked washed out from the side, struggled with fast-moving content, and couldn’t produce true black because the backlight was always partially on. Home theater fans and videophiles genuinely preferred plasma, and they were right to.
But LCD technology kept improving. LED backlighting replaced the older fluorescent lamps, boosting contrast and cutting power use further. Local dimming allowed parts of the backlight to turn off independently, dramatically improving black levels. In-plane switching panels widened viewing angles. Higher refresh rates smoothed out motion. By 2012 or so, the average consumer standing in a store couldn’t tell the difference, and LCD’s advantages in price, weight, brightness, and energy use made the choice obvious.
The Final Exit
Samsung and LG quietly wound down their plasma lines in 2013 and 2014. Panasonic, which had invested more heavily in plasma than any other company, held on the longest. In October 2013, The Verge reported that Panasonic confirmed it was exiting the plasma market “with almost immediate effect,” ending production of new units in December and wrapping all related operations by March 2014.
The timing wasn’t coincidental. 4K resolution was arriving, and building a 4K plasma panel at a competitive price was essentially impossible given the manufacturing limitations. LCD was the only realistic path to higher resolutions at scale, and every major manufacturer redirected resources accordingly. OLED eventually picked up where plasma left off in terms of picture quality, offering perfect blacks and wide viewing angles without the burn-in vulnerability or power consumption that doomed plasma. Ironically, OLED would later develop its own mild burn-in issues, but by then the technology had enough other advantages to survive the stigma that helped kill plasma a decade earlier.

