Gamma controls how bright or dark the mid-tones of your TV’s picture appear. It doesn’t change the brightest whites or the deepest blacks. Instead, it reshapes everything in between, determining whether a dimly lit scene looks rich and detailed or muddy and hard to read. Most TVs default to a gamma of 2.2, and adjusting this number even slightly can dramatically change how your picture looks.
How Gamma Shapes Your Picture
Your TV receives a video signal as a series of numerical values for each pixel, ranging from 0 (black) to 255 (white). Gamma determines how those numbers translate into actual light on your screen. The relationship isn’t a straight line. It follows a curve, and the steepness of that curve is what the gamma number describes.
A higher gamma value (like 2.4 or 2.6) makes the curve steeper, which pushes mid-tones darker. Shadows become deeper and contrast looks more dramatic. A lower gamma value (like 1.8 or 2.0) flattens the curve, making mid-tones brighter and lifting shadow detail so you can see more in dark areas. The highlights and absolute black point stay roughly the same either way.
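The curve described above is just a power function applied to the normalized signal. The sketch below (the function name and the 50%-signal example are illustrative, not from any standard) shows how the same mid-tone signal produces less light as the gamma exponent rises, while 0 and 255 map to the same endpoints regardless:

```python
def signal_to_light(value, gamma=2.2, max_value=255):
    """Map an 8-bit signal value to relative light output (0.0 = black, 1.0 = peak)."""
    return (value / max_value) ** gamma

mid = 128  # a roughly 50% mid-tone signal

for g in (1.8, 2.0, 2.2, 2.4, 2.6):
    print(f"gamma {g}: mid-tone light output = {signal_to_light(mid, g):.3f}")
```

Running this shows the mid-tone dropping steadily as gamma increases, which is exactly the "darker, punchier mid-tones" effect described above; the endpoints (`signal_to_light(0)` and `signal_to_light(255)`) are unaffected.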
This matters because your eyes don’t perceive light the way a camera records it. You’re far more sensitive to differences between dark shades than between bright ones. Gamma encoding takes advantage of this by dedicating more of the signal to the darker tones your eyes care about most, then letting your display “expand” those tones back into visible light along its curve.
Why Gamma Exists in the First Place
Gamma correction dates back to old cathode ray tube (CRT) televisions. CRTs had a natural quirk: the relationship between the voltage they received and the light they produced wasn’t linear. It followed a power curve with a gamma typically between 2.35 and 2.55. If you sent a signal at 50% intensity, a CRT wouldn’t produce 50% brightness. It would produce something much dimmer.
To compensate, broadcasters applied the inverse curve to the signal before transmission. The signal was deliberately distorted so that after the CRT distorted it again, the viewer saw correct brightness. This system worked so well, and aligned so neatly with how human vision perceives brightness, that it became the foundation for all video standards. Modern flat-panel TVs (LCD, OLED, and others) have completely different internal electronics, but they’re specifically designed to emulate that same CRT-style gamma curve so that all existing content looks correct.
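The encode-then-decode round trip can be sketched as two mirrored power curves. This is a simplification: real camera standards such as BT.709 add a small linear segment near black, and the 2.4 exponent here is just an illustrative display gamma.

```python
CRT_GAMMA = 2.4  # illustrative display gamma, not a specific standard

def encode(linear_light):
    """Apply the inverse (1/gamma) curve before transmission."""
    return linear_light ** (1 / CRT_GAMMA)

def display(signal):
    """The display's power curve undoes the encoding."""
    return signal ** CRT_GAMMA

scene = 0.5                 # 50% linear scene brightness
encoded = encode(scene)     # deliberately "too bright" in the signal
shown = display(encoded)    # the display's curve restores the original 50%
print(f"encoded: {encoded:.3f}, shown: {shown:.3f}")
```

Note that the encoded value sits well above 0.5: the signal spends more of its range on dark tones, which is the perceptual efficiency described earlier.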
The Standard Numbers: 2.2 vs. 2.4
Two gamma values dominate the TV world, and which one looks best depends largely on your room.
Gamma 2.2 is the standard for the computing world and the sRGB color space used by Windows, macOS, and virtually all web content. It’s also the better choice for TVs in rooms with ambient light, like a living room with lamps on or daylight coming through windows. The slightly brighter mid-tones help fight the room light that would otherwise wash out shadow detail.
Gamma 2.4 is the reference standard for high-definition video, specified by the international broadcast standard known as BT.1886 (which defines an exponent of 2.4). Blu-ray discs and premium streaming services are mastered to this target. Images at gamma 2.4 look visibly darker with higher contrast, producing a punchier, more cinematic feel. This only works well in a dimly lit or completely dark room. In a bright space, you’ll lose too much shadow detail.
Movie theaters push even further, using gamma 2.6 for their blacked-out environments. At that level, a 2.2 or 2.4 image would look flat and dim on a cinema screen.
What Happens When Gamma Is Wrong
Setting gamma too high causes “black crush,” where dark shades that should be slightly different all collapse into pure black. You lose detail in shadows, dark hair blends into dark clothing, and nighttime scenes become impossible to read. At gamma 2.6 on a home TV, shades below about RGB value 6 (out of 255) become invisible.
Setting gamma too low creates the opposite problem. The picture looks washed out and flat. Dark scenes lose their atmosphere because shadows are lifted to a milky gray. Blacks never feel truly black, and the image lacks punch. A gamma of 1.8 will reveal more shadow detail, but at the cost of making lighter shades unnaturally bright and reducing the overall sense of contrast.
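Black crush can be roughly quantified. The sketch below assumes a display whose darkest distinguishable level is 0.01% of peak light; that floor is an arbitrary assumption for illustration, not a measured figure, so the exact counts will differ on a real screen.

```python
VISIBILITY_FLOOR = 1e-4  # fraction of peak luminance -- an assumed threshold

def crushed_values(gamma, max_value=255):
    """Return the 8-bit values whose light output falls below the visibility floor."""
    return [v for v in range(1, max_value + 1)
            if (v / max_value) ** gamma < VISIBILITY_FLOOR]

for g in (2.2, 2.4, 2.6):
    lost = crushed_values(g)
    top = lost[-1] if lost else 0
    print(f"gamma {g}: signal values 1-{top} fall below the floor")
```

Even with this crude model, the trend matches the description above: each step up in gamma pushes a few more near-black shades below the point where they can be told apart.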
How to Check and Adjust Your Gamma
Most TVs bury gamma in picture settings, sometimes under “advanced” or “expert” options. You’ll typically see preset values like 2.0, 2.2, and 2.4, or labels like “light,” “medium,” and “dark.” Some TVs use names like BT.1886 for their 2.4 preset.
The simplest way to check your gamma without professional equipment is to use a test pattern. Online tools like the Lagom LCD test display a set of vertical bars with alternating light and dark bands. You step back from your screen (or squint) until you can’t see individual pixels, then look at where the bands blend together into a smooth tone. They should merge at the line labeled 2.2 if your gamma is set correctly for a room with normal lighting. The test works across red, green, and blue channels individually, which helps reveal if one color is off even when the overall picture looks acceptable.
For a quick real-world check, pull up a movie scene you know well that has both bright highlights and deep shadows. If you can’t see detail in dark areas (faces in dimly lit rooms, textures in dark clothing), your gamma is probably too high. If the image looks flat and washed out with weak blacks, it’s too low.
Gamma for Gaming
Competitive gamers often want to see into dark corners and shadows where opponents might hide, which tempts them to lower gamma. This works tactically but makes the overall image look worse. Rather than adjusting your TV’s gamma globally, it’s better to use the in-game brightness or gamma calibration screen that most games include in their settings menus. These calibration tools are tailored to each game’s specific rendering engine, so they can brighten shadows without destroying the rest of the picture the way a global TV adjustment would.
Gamma and HDR Content
If your TV supports HDR (High Dynamic Range), the traditional gamma curve gets replaced by a completely different system called PQ (Perceptual Quantizer), standardized as SMPTE 2084 and used by both Dolby Vision and HDR10. Unlike a simple power curve that applies the same shape across all brightness levels, PQ changes its shape depending on whether it’s encoding dark or bright tones. It packs more precision into darker shades (where your eyes are most sensitive) and less into the very brightest highlights.
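For the curious, the PQ curve is fully defined by SMPTE ST 2084. The sketch below implements its EOTF (signal to light) using the constants from the standard; unlike SDR gamma, which is relative to whatever your TV's peak brightness is, PQ maps the signal to absolute luminance up to 10,000 nits.

```python
# Constants from SMPTE ST 2084 (PQ)
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_to_nits(signal):
    """Map a normalized PQ signal (0.0-1.0) to absolute luminance in nits (cd/m^2)."""
    p = signal ** (1 / M2)
    num = max(p - C1, 0.0)
    den = C2 - C3 * p
    return (num / den) ** (1 / M1) * 10000

for s in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"PQ signal {s:.2f} -> {pq_to_nits(s):.2f} nits")
```

Notice how non-linear the mapping is: a 50% PQ signal lands under 100 nits, leaving the entire upper half of the signal range for highlights between roughly 100 and 10,000 nits. That is the "more precision in darker shades" behavior described above.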
When your TV switches into HDR mode, the gamma setting in your picture menu typically grays out or disappears entirely, because HDR content carries its own tone-mapping instructions. Your TV’s gamma setting only applies to SDR (standard dynamic range) content: broadcast TV, older Blu-rays, and standard streaming.

