Dynamic range in video is the difference between the brightest white and the darkest black a camera can capture or a display can show. It’s measured as a ratio, and the wider that ratio, the more detail you see in highlights and shadows at the same time. Think of a sunset scene where you want to see both the bright sky and a person standing in shade. A camera or screen with limited dynamic range forces you to choose one or the other. A wider dynamic range lets you keep both.
How Dynamic Range Is Measured
The standard unit for dynamic range is the “stop,” borrowed from photography. Each stop represents a doubling of light. A camera with 10 stops of dynamic range can distinguish brightness levels across a ratio of 1,024:1 (2 to the power of 10). A camera with 13 stops covers a ratio of 8,192:1. The more stops, the more room you have between pure black and pure white before the image loses detail.
You’ll sometimes see dynamic range expressed as a contrast ratio instead. A display spec of 1,000,000:1 is saying the same kind of thing: the brightest point on screen is one million times brighter than the darkest. Stops and contrast ratios are just two ways of describing the same measurement.
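The conversion between the two is a simple power of two. A quick sketch (function names are my own, for illustration):

```python
import math

def stops_to_ratio(stops: float) -> float:
    """Contrast ratio covered by a given number of stops.
    Each stop doubles the light, so the ratio is 2**stops."""
    return 2.0 ** stops

def ratio_to_stops(ratio: float) -> float:
    """Number of stops needed to span a given contrast ratio."""
    return math.log2(ratio)

print(stops_to_ratio(10))                    # 1024.0 -> the 1,024:1 ratio above
print(stops_to_ratio(13))                    # 8192.0 -> 8,192:1
print(round(ratio_to_stops(1_000_000), 1))   # ~19.9 stops for a 1,000,000:1 display
```

A 1,000,000:1 display spec, then, corresponds to roughly 20 stops between its darkest and brightest points.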
What Happens When You Run Out
Every real-world scene has its own dynamic range, and it often exceeds what your camera can handle. When it does, something has to give. Overexposed highlights “clip” to pure white, losing all color and texture. Blown-out clouds, for example, become flat white patches with no way to recover detail in editing. Underexposed shadows clip to pure black, turning dark areas into featureless voids.
You can check this on a histogram. If the graph slams into the left or right edge, the signal is clipped in the shadows or highlights. When a scene’s range is simply too wide for the sensor, you have to decide which end matters more. Highlight recovery is generally harder than shadow recovery: overexposed areas lose information permanently, while underexposed shadows can often be brightened in post-production with acceptable results, at the cost of some added noise.
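The histogram check can be approximated in code by counting pixels that sit at the extremes of the encodable range. A minimal sketch using NumPy on a synthetic 8-bit frame (the function name and thresholds are illustrative):

```python
import numpy as np

def clipping_report(frame: np.ndarray, black: int = 0, white: int = 255):
    """Fraction of pixels clipped to pure black or pure white
    in an 8-bit frame (values pinned to the edges of the range)."""
    total = frame.size
    shadows = np.count_nonzero(frame <= black) / total
    highlights = np.count_nonzero(frame >= white) / total
    return shadows, highlights

# A synthetic frame: mostly mid-gray, with a strip of blown highlights
frame = np.full((100, 100), 128, dtype=np.uint8)
frame[:10, :] = 255  # simulate a blown-out sky along the top

shadows, highlights = clipping_report(frame)
print(f"shadow clip: {shadows:.1%}, highlight clip: {highlights:.1%}")
```

In a real workflow you would run this per channel, since a single clipped channel already loses color information even when the other two survive.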
Camera Sensors and Real-World Numbers
Modern cameras vary widely in how much dynamic range their sensors deliver. Consumer mirrorless cameras typically land between 10 and 12 stops. Lab tests from CineD’s 2025 camera benchmarks illustrate the spread: Canon’s R1 measured 11.3 stops, while the C400 came in at 10.5 stops in RAW. Some cameras offer a dedicated high dynamic range mode that pushes the number higher. Panasonic’s Dynamic Range Boost feature, for instance, jumps to roughly 13 stops, though it comes with trade-offs like increased rolling shutter.
High-end cinema cameras from manufacturers like ARRI have historically led the pack, with sensors reaching 14 stops or more. The general trend is that larger, more expensive sensors capture a wider range, but the gap has been closing as sensor technology improves across all price points.
Log Profiles: Squeezing More Range Into the File
A camera sensor might capture 12 or 13 stops of light, but standard video encoding can’t store all of that information efficiently. That’s where logarithmic (log) recording profiles come in. Instead of encoding brightness on a linear scale, where each step represents the same increase in light, a log profile uses a curved scale that compresses highlights and lifts shadows. This lets the file hold a wider range of brightness values within the same data budget.
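The idea can be demonstrated with a generic log curve. This is a simplified illustration, not any vendor’s actual S-Log, C-Log, or V-Log formula:

```python
import math

def log_encode(x: float, base: float = 1000.0) -> float:
    """Toy log curve: maps scene-linear light in [0, 1] to a code
    value in [0, 1], compressing highlights and lifting shadows."""
    return math.log1p(x * (base - 1)) / math.log(base)

# Compare how much of the code range a one-stop step gets
# in the shadows vs. in the highlights:
for lo, hi in [(0.01, 0.02), (0.5, 1.0)]:
    linear_step = hi - lo
    log_step = log_encode(hi) - log_encode(lo)
    print(f"{lo}->{hi}: linear step {linear_step:.3f}, log step {log_step:.3f}")
```

Under the log curve, a one-stop step in the deep shadows receives nearly as much of the code range as a one-stop step in the highlights, whereas linear encoding spends almost all of its codes on the bright end. That is the whole trick: code values are allocated more evenly across the stops the sensor captured.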
Log footage looks flat and washed out straight from the camera. That’s by design. The contrast and color are meant to be shaped later in editing through a process called color grading. By preserving more detail in both the bright and dark extremes during recording, log gives you flexibility to make creative choices in post-production rather than baking them in at the moment of capture. Most professional and many prosumer cameras now offer at least one log profile, like Sony’s S-Log, Canon’s C-Log, or Panasonic’s V-Log.
Bit Depth and Banding
Dynamic range and bit depth are related but distinct. Dynamic range describes the total span from dark to bright. Bit depth determines how many individual brightness levels exist within that span. An 8-bit signal divides the range into 256 levels per color channel, producing about 16.7 million possible colors. A 10-bit signal offers 1,024 levels per channel, for over a billion possible colors.
When you stretch a limited number of brightness levels across a wide dynamic range, you get visible stepping between tones, known as banding. This is most obvious in smooth gradients like skies or softly lit walls. Shooting in 10-bit gives you four times the tonal steps of 8-bit, which means smoother gradients and more room to push footage in color grading without introducing artifacts. If you’re recording in a log profile to maximize dynamic range, 10-bit recording is nearly essential to preserve that advantage through to the final image.
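The effect can be simulated by quantizing the same smooth gradient at 8 and 10 bits. A sketch (real cameras add dithering and noise that partially mask banding):

```python
import numpy as np

def quantize(ramp: np.ndarray, bits: int) -> np.ndarray:
    """Quantize a [0, 1] gradient down to 2**bits discrete levels."""
    levels = 2 ** bits
    return np.round(ramp * (levels - 1)) / (levels - 1)

# A smooth horizontal gradient, like a clear sky
ramp = np.linspace(0.0, 1.0, 4096)

for bits in (8, 10):
    q = quantize(ramp, bits)
    distinct = len(np.unique(q))
    print(f"{bits}-bit: {distinct} distinct tones across the gradient")
```

The 8-bit version collapses the gradient into 256 tones and the 10-bit version keeps 1,024, which is exactly the four-fold difference in tonal steps described above.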
SDR vs. HDR on Displays
On the display side, dynamic range determines how lifelike an image looks. Standard Dynamic Range (SDR) has been the default for decades. SDR screens typically peak at 100 to 300 nits of brightness, with a relatively narrow contrast range. High Dynamic Range (HDR) pushes that ceiling dramatically, with capable displays reaching 1,000 nits and the HDR specification allowing up to 10,000 nits. The result is brighter highlights, deeper blacks, and more visible detail in both extremes.
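HDR10 and Dolby Vision both encode brightness with the PQ (Perceptual Quantizer) transfer function standardized as SMPTE ST 2084, which maps absolute luminance up to 10,000 nits into a 0-to-1 signal. A sketch of the encoding side, using the constants from the specification:

```python
def pq_encode(nits: float) -> float:
    """SMPTE ST 2084 (PQ) encode: absolute luminance in nits
    (0..10,000) -> nonlinear signal value in 0..1."""
    # Constants as defined in ST 2084
    m1 = 2610 / 16384        # 0.1593017578125
    m2 = 2523 / 4096 * 128   # 78.84375
    c1 = 3424 / 4096         # 0.8359375
    c2 = 2413 / 4096 * 32    # 18.8515625
    c3 = 2392 / 4096 * 32    # 18.6875
    y = max(nits, 0.0) / 10000.0
    y_m1 = y ** m1
    return ((c1 + c2 * y_m1) / (1 + c3 * y_m1)) ** m2

for nits in (100, 1000, 10000):
    print(f"{nits:>5} nits -> signal {pq_encode(nits):.3f}")
```

Notice how nonlinearly the signal range is spent: SDR-level brightness of 100 nits already lands around the middle of the signal, leaving roughly half the code range for the highlight region between 100 and 10,000 nits.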
Display technology matters here. OLED screens can turn individual pixels completely off, producing true black and effectively infinite contrast. Mini-LED displays use thousands of small backlighting zones and can reach higher peak brightness, sometimes exceeding 1,000 nits, making them strong performers for HDR. Each technology has strengths: OLED wins on black levels and contrast, while Mini-LED wins on raw brightness.
HDR Formats Explained
Not all HDR content is created equal. Several competing formats define how brightness information is encoded and communicated to your display.
- HDR10 is the baseline HDR format, supported by virtually every HDR-capable display. It uses 10-bit color and static metadata, meaning one set of brightness instructions applies to an entire movie or show. It’s royalty-free and widely adopted.
- HDR10+ adds dynamic metadata, so brightness information can change scene by scene. Developed by Samsung and others, it’s also royalty-free, which has helped its adoption.
- Dolby Vision also uses dynamic metadata and supports up to 12-bit color depth. It gives filmmakers the finest control over how each scene looks on different displays, but manufacturers pay Dolby for licensing.
- HLG (Hybrid Log-Gamma) was designed by the BBC and NHK for broadcast television. It’s compatible with both SDR and HDR displays without requiring metadata, making it practical for live TV, but it can’t match the precision of the other formats.
In practice, the difference between HDR10 and Dolby Vision is most noticeable in scenes with extreme contrast, like a campfire at night or sunlight streaming through a window into a dark room. Dynamic metadata lets the display optimize each scene individually rather than applying a single compromise to the whole film.
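HLG’s SDR compatibility, mentioned above, comes from its hybrid transfer curve: the lower half behaves like a conventional gamma (a square root), while the upper half switches to a log segment. The reference OETF from ITU-R BT.2100, sketched:

```python
import math

def hlg_encode(e: float) -> float:
    """HLG OETF (ITU-R BT.2100): scene-linear light in 0..1
    -> nonlinear signal in 0..1."""
    # Constants as defined in BT.2100
    a = 0.17883277
    b = 0.28466892
    c = 0.55991073
    if e <= 1 / 12:
        return math.sqrt(3 * e)          # gamma-like lower segment
    return a * math.log(12 * e - b) + c  # log upper segment

print(hlg_encode(1 / 12))            # 0.5 -> the two segments meet here
print(round(hlg_encode(1.0), 3))     # ~1.0 at peak scene light
```

Because the gamma-like lower half resembles what SDR displays already expect, an HLG signal degrades gracefully on an SDR screen instead of looking flat or washed out, which is what makes it workable for a single broadcast feed.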
Why It Matters for Your Work
If you’re shooting video, dynamic range dictates how much flexibility you have in challenging lighting. A camera with 13 stops lets you expose for a bright window and still recover detail in the shadowed interior. One with 10 stops forces you to light the room artificially or accept lost detail somewhere. Choosing a log profile and 10-bit recording maximizes what your sensor captures, giving you the most to work with in editing.
If you’re choosing a display, dynamic range determines whether HDR content actually looks like HDR. A screen that technically accepts an HDR signal but only hits 400 nits won’t deliver the full impact. Look for displays that reach at least 600 to 1,000 nits for a meaningful HDR experience, with true local dimming or OLED pixel-level control to maintain deep blacks alongside those bright highlights.

