Gamma controls how bright or dark the mid-tones on your monitor appear. It doesn’t change the brightest whites or the darkest blacks. Instead, it reshapes everything in between, making the overall image look lighter or darker without touching the extremes. If you’ve ever noticed a photo looking washed out on one screen but rich and contrasty on another, gamma is often the reason.
How Gamma Works
Your eyes don’t perceive light in a straight line. You’re far more sensitive to differences between dark shades than between bright ones. A room going from pitch black to dimly lit feels like a dramatic change, but going from bright to slightly brighter barely registers. Gamma takes advantage of this by applying a curve to the relationship between the signal your computer sends and the light your screen produces.
Technically, gamma is a power-law exponent. The monitor takes an input value (say, a pixel that’s 50% of maximum signal) and raises it to the power of gamma before displaying it. With a gamma of 2.2, that 50% input doesn’t produce 50% brightness. It produces roughly 22% brightness. That looks right rather than too dark because images are encoded with the inverse curve before they ever reach the display, and that encoding spends more of its limited precision on the dark tones your eyes distinguish most finely. Without the display’s decoding curve, that encoded content would look unnaturally bright in the shadows and flat overall.
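A quick sketch of the math, assuming a simple pure power-law display (real monitors, and the sRGB standard, deviate slightly from this):

```python
# Pure power-law display gamma: output luminance = input ** gamma.
# A simplified model; real displays and the sRGB curve deviate slightly.

def display_luminance(signal: float, gamma: float = 2.2) -> float:
    """Fraction of maximum luminance produced for a normalized 0-1 input signal."""
    return signal ** gamma

print(display_luminance(0.5))   # ~0.218 -> a 50% signal yields roughly 22% luminance
print(display_luminance(0.0))   # 0.0 -> black is untouched
print(display_luminance(1.0))   # 1.0 -> white is untouched
```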
This system is a leftover from CRT displays, which naturally produced light in a non-linear way. The voltage going into a CRT and the brightness coming out followed a curve with a gamma typically between 2.35 and 2.55. Modern LCD and OLED screens don’t behave this way physically, but they emulate the same curve because every piece of existing content, from photos to movies to websites, was encoded with that curve in mind.
The Standard: Gamma 2.2
A gamma of 2.2 is the default for nearly all computer displays. It’s baked into the sRGB color space, which is the standard for Windows, macOS, web browsers, and the vast majority of digital content. When someone says a monitor is “calibrated,” they usually mean it’s been set to display gamma 2.2 accurately across the full tonal range.
For home theater and cinematic content, there’s a second standard: gamma 2.4. This is the curve specified in the ITU-R BT.1886 recommendation and is the target used for mastering Blu-ray discs, streaming video, and broadcast television. The higher gamma value produces deeper shadows and more contrast, which looks better in a dark room. In a bright office, though, gamma 2.4 can make shadows look crushed and hard to read.
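To see why gamma 2.4 reads as “deeper,” compare what the two curves do to the same signals. This sketch also includes the sRGB standard’s piecewise decode, which closely tracks a pure 2.2 power law across most of the range:

```python
def pure_gamma(signal: float, gamma: float) -> float:
    """Pure power-law decode: normalized signal -> fraction of max luminance."""
    return signal ** gamma

def srgb_decode(signal: float) -> float:
    """The sRGB standard's piecewise decode, which approximates gamma 2.2."""
    if signal <= 0.04045:
        return signal / 12.92
    return ((signal + 0.055) / 1.055) ** 2.4

for s in (0.1, 0.25, 0.5, 0.75):
    print(f"signal {s:.2f}:  sRGB {srgb_decode(s):.3f}   "
          f"gamma 2.2 {pure_gamma(s, 2.2):.3f}   gamma 2.4 {pure_gamma(s, 2.4):.3f}")
# Every mid-tone comes out darker under 2.4, which reads as deeper shadows and more contrast.
```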
Gamma vs. Brightness
These two settings seem similar but work very differently. Your monitor’s brightness control adjusts the overall light output, on most LCDs by changing the backlight intensity. Crank it up and every pixel gets brighter; turn it down and everything dims, shadows and highlights alike. The whole tonal range shifts together, and at extreme settings some monitors will clip detail at one end or the other.
Gamma only redistributes the mid-tones. Raising the gamma value (say, from 2.2 to 2.4) makes mid-tones darker, producing a punchier, more contrasty image. Lowering it (from 2.2 to 1.8) brightens mid-tones, making shadows more visible but potentially washing out the image. In both cases, pure black stays black and pure white stays white. This is why gamma feels more like a “richness” or “depth” control than a simple light switch.
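A small sketch of the difference, treating “brightness” as a uniform scale on light output and gamma as the power-law curve from earlier (both are simplifications of what real monitor controls do):

```python
def apply_gamma(signal: float, gamma: float) -> float:
    """Gamma reshapes mid-tones but leaves 0.0 and 1.0 exactly where they are."""
    return signal ** gamma

def apply_brightness(luminance: float, scale: float) -> float:
    """A uniform brightness scale moves everything, shadows and highlights alike."""
    return luminance * scale

for s in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"signal {s:.2f}:  gamma 2.2 -> {apply_gamma(s, 2.2):.3f}   "
          f"gamma 2.4 -> {apply_gamma(s, 2.4):.3f}   "
          f"70% backlight -> {apply_brightness(apply_gamma(s, 2.2), 0.7):.3f}")
# Changing gamma only redistributes values between the fixed endpoints;
# changing brightness scales the whole range, endpoints included.
```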
What Wrong Gamma Looks Like
When gamma is too high for your content, dark areas lose detail. Shadows that should show texture and depth collapse into solid black, a problem called “crushed blacks.” You might notice this in a dark movie scene where you can’t make out faces, or in a game where shadowed corners become impenetrable voids.
When gamma is too low, you get the opposite: a grey haze over dark areas. Blacks look raised and milky instead of deep, and the image loses its sense of contrast. Colors can appear flat and washed out, especially in scenes with a wide range of light and dark. This is particularly noticeable on OLED screens, where the contrast between true black and a slightly raised grey is impossible to miss.
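A rough numeric illustration of crushed blacks, assuming a pure power-law model and content expecting gamma 2.2 shown on a display running closer to 2.8:

```python
def display_luminance(signal: float, gamma: float) -> float:
    """Fraction of maximum luminance for a normalized signal under a pure power-law gamma."""
    return signal ** gamma

# Shadow detail: a handful of distinct near-black signals the content expects you to tell apart.
for s in (0.05, 0.10, 0.15, 0.20):
    print(f"signal {s:.2f}:  gamma 2.2 -> {display_luminance(s, 2.2) * 100:5.2f}% of max   "
          f"gamma 2.8 -> {display_luminance(s, 2.8) * 100:5.2f}%")
# At gamma 2.8 the same shadow signals are squeezed into a much narrower band of light output,
# so steps the content relies on become hard or impossible to distinguish: crushed blacks.
```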
Gamma mismatches between how content was created and how your monitor displays it cause real problems. Some game developers, for example, master their titles assuming a specific gamma curve, and if your display doesn’t match, the visual balance breaks. The game can simultaneously have pitch-black shadows in some scenes and washed-out greys in others. This has been a recurring issue in several major game franchises where developers used non-standard gamma handling during development.
Gamma Settings for Gaming
In games, gamma directly controls how much you can see in dark environments. Many monitors include a “Black Stabilizer” or “Shadow Boost” feature, which is essentially a gamma adjustment targeting the darkest tones. Lowering the gamma (or enabling these features) makes shadows brighter, giving you a visibility advantage in dark areas of competitive games. The tradeoff is a flatter, less cinematic image.
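A sketch of what a shadow-boost style control might do under the hood, assuming it simply blends in a lower gamma for the darkest signals; actual implementations vary by manufacturer and aren’t published, so the function and values here are purely illustrative:

```python
def shadow_boost(signal: float, base_gamma: float = 2.2,
                 boost_gamma: float = 1.6, threshold: float = 0.25) -> float:
    """Hypothetical shadow boost: use a lower gamma below a threshold,
    fading back to the base gamma above it, so mid-tones and highlights stay put."""
    if signal >= threshold:
        return signal ** base_gamma
    # Blend between the boosted and base curves so there is no visible seam at the threshold.
    t = signal / threshold              # 0.0 at black, 1.0 at the threshold
    boosted = signal ** boost_gamma
    normal = signal ** base_gamma
    return (1.0 - t) * boosted + t * normal

for s in (0.05, 0.10, 0.20, 0.50):
    print(f"signal {s:.2f}: normal {s ** 2.2:.4f}   boosted {shadow_boost(s):.4f}")
# The darkest signals come out brighter, while 0.50 and above are untouched.
```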
For single-player games where atmosphere matters, sticking closer to gamma 2.2 preserves the look the developers intended. Most games include an in-game brightness or gamma slider that shows a reference image, usually a logo or symbol that should be “barely visible” at the correct setting. This slider adjusts the game’s output to compensate for your display’s gamma, so it’s worth setting it carefully rather than just cranking it up for visibility.
If you play on an OLED monitor, gamma mismatches are more visible than on LCD screens. Any lifted blacks show as a noticeable grey haze because OLEDs can produce true zero-light black. When a game sends slightly raised black levels, whether intentionally or due to a gamma bug, the result is immediately obvious. Adjusting your display’s gamma or shadow settings can help, but overcorrecting will crush detail in legitimately dark scenes.
Gamma for Photo and Video Work
For any color-accurate work, gamma 2.2 and the sRGB color space are your baseline. Photos edited on a monitor with incorrect gamma will look wrong on everyone else’s screen. If your gamma is too low during editing, you’ll think images look fine, but other viewers will see them as overly dark because their standard-gamma displays apply more contrast to the mid-tones than yours did.
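A sketch of that mismatch under a pure power-law model: the same image value produces noticeably less light on a standard 2.2 display than it did on a too-low-gamma editing display.

```python
def luminance(signal: float, gamma: float) -> float:
    """Fraction of max luminance for a normalized image value under a power-law display gamma."""
    return signal ** gamma

editing_gamma = 1.8   # a display calibrated too low
viewer_gamma = 2.2    # the standard everyone else is (roughly) looking at

for v in (0.2, 0.4, 0.6):
    print(f"value {v:.1f}: editor sees {luminance(v, editing_gamma):.3f}, "
          f"viewer sees {luminance(v, viewer_gamma):.3f}")
# Every mid-tone is darker on the viewer's display than it looked during editing,
# which is why images edited at too-low gamma come back looking murky on other screens.
```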
Professional monitors often include hardware calibration tools that measure the actual gamma curve at multiple points across the tonal range, not just at a single brightness level. This matters because cheap monitors may hit gamma 2.2 on average but deviate significantly in specific shadow or highlight regions. The resulting uneven tonal reproduction is hard to spot without measurement, but it affects print output and cross-device consistency.
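The per-point check a calibration tool performs amounts to solving for the exponent at each measured patch. A minimal sketch, assuming measured luminance is normalized so black is 0.0 and full white is 1.0; the patch readings below are hypothetical:

```python
import math

def effective_gamma(signal: float, measured: float) -> float:
    """Solve measured = signal ** gamma for gamma at one measured patch."""
    return math.log(measured) / math.log(signal)

# Hypothetical measurements from a display that averages out near 2.2
# but drifts in the shadows and highlights.
patches = {0.10: 0.0045, 0.25: 0.045, 0.50: 0.218, 0.75: 0.540}
for signal, measured in patches.items():
    print(f"{int(signal * 100)}% patch: effective gamma {effective_gamma(signal, measured):.2f}")
# A single mid-grey reading can look spot-on while the shadow end is visibly off.
```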
How Gamma Changes With HDR
HDR content doesn’t use the traditional gamma curve at all. Standard dynamic range (SDR) content uses the familiar gamma curve, typically with 8 bits per channel and a reference white of around 100 nits, which keeps the usable brightness and contrast range modest. HDR replaces this with a different transfer function called PQ (for “perceptual quantizer”), standardized as SMPTE ST 2084. PQ is an absolute curve built around how the eye perceives contrast: it covers luminance from near black up to 10,000 nits, handling much brighter highlights and deeper shadows simultaneously, and it requires at least 10 bits per channel to do so without visible banding.
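For reference, the PQ curve is published with exact constants. A minimal sketch of the ST 2084 EOTF, mapping a normalized code value to absolute luminance in nits:

```python
def pq_eotf(code: float) -> float:
    """SMPTE ST 2084 (PQ) EOTF: normalized code value (0.0-1.0) -> luminance in cd/m^2 (nits)."""
    m1 = 2610 / 16384          # 0.1593017578125
    m2 = 2523 / 4096 * 128     # 78.84375
    c1 = 3424 / 4096           # 0.8359375
    c2 = 2413 / 4096 * 32      # 18.8515625
    c3 = 2392 / 4096 * 32      # 18.6875

    p = code ** (1 / m2)
    luminance_fraction = (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)
    return 10000.0 * luminance_fraction   # PQ is an absolute curve topping out at 10,000 nits

print(pq_eotf(0.0))   # 0.0 nits
print(pq_eotf(0.5))   # ~92 nits -- near SDR reference white, at only half the code range
print(pq_eotf(1.0))   # 10000.0 nits
```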
When your monitor switches to HDR mode, the gamma setting in your display’s menu typically stops applying. The HDR signal carries its own tone-mapping information, and the display handles it differently than SDR content. This is why your carefully tuned gamma 2.2 setting for desktop use doesn’t affect how an HDR movie looks. If HDR content appears washed out, the fix usually isn’t in the gamma setting but in the display’s HDR-specific controls like shadow detail or tone mapping.