How can Dolby Vision and the like provide more saturation and brightness than SDR? Why can’t SDR just tell the monitor to go to its maximum brightness or use the most saturated colors it can produce?


In all the “demos” they show a slightly desaturated/dull/dark image as the SDR version and the opposite as the Dolby Vision (or whatever) version.

While I understand that DV can provide more resolution (e.g. finer/more gradations between shades or colors), I don’t understand what stops SDR from simply asking for maximum pure red (as in 255:0:0 in RGB). I understand that the equivalent in DV would be something like 1023:0:0, but if both are mapped to the maximum red a monitor can produce, I don’t get how DV can be brighter or more saturated.
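To make the premise concrete: in SDR, 255 is a *relative* value (“this display’s white”), while HDR10/Dolby Vision signal luminance with the SMPTE ST 2084 “PQ” curve, where each 10-bit code names an *absolute* brightness. A minimal Python sketch, assuming a simplified gamma-2.4, 100-nit model for SDR (real displays vary; the PQ constants are the real ST 2084 ones):

```python
# Sketch: why "maximum code value" means different things in SDR and HDR.
# Assumes SDR is shown with a simple 2.4 gamma at a 100-nit reference white
# (a BT.1886-like simplification); PQ maps codes to absolute nits.

def sdr_nits(code8, peak_nits=100.0, gamma=2.4):
    """8-bit SDR code -> luminance, RELATIVE to the display's peak."""
    return peak_nits * (code8 / 255.0) ** gamma

def pq_nits(code10):
    """10-bit PQ code -> ABSOLUTE luminance (SMPTE ST 2084 EOTF)."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    e = (code10 / 1023.0) ** (1 / m2)
    return 10000.0 * (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)

print(sdr_nits(255))   # 100.0   -> "whatever this display calls white"
print(pq_nits(1023))   # 10000.0 -> "exactly 10,000 nits", display permitting
print(pq_nits(593))    # ~201    -> near HDR "diffuse white" (~203 nits, BT.2408)
```

So SDR’s 255 only says “go as bright as your white,” and the content was graded assuming that white sits around 100 nits; a PQ code carries an absolute intent, so a 1,000-nit highlight can sit far above paper-white instead of being the same “maximum” as everything else.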


7 Answers

Anonymous

Two things here:

1. The TV’s ability to actually produce very bright highlights and inky blacks *at the same time*. Traditional backlit LCD/LED screens are very bad at this because the solid backlight is always on and “bleeds” through the screen in the dark areas. This means that if the brights get brighter, the darks also get brighter (see the first sketch after this list). Newer OLED and other HDR-capable screens are much better at handling this.
2. The processing side of things. Almost everything aside from professional video/photo work is done in “8-bit color,” i.e. three separate numbers from 0-255 sent for each pixel’s red, green, and blue channels. This content *already exists* and wouldn’t look correct if you tried to map that 0-255 range onto an HDR 0-2,000-nit (or whatever) range. You’d get things like a white piece of drywall in a movie suddenly shining like the sun (see the second sketch after this list). Therefore you actually need processing and software to handle the different video formats and differentiate between SDR and HDR content, while also allowing some user-level control over it.
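To put rough numbers on point 1 (illustrative figures, not measurements of any particular panel): with a single global backlight, the black level is tied to the peak level through the panel’s native contrast ratio, so pushing highlights brighter drags the blacks up with them.

```python
# Point 1, with assumed numbers: a conventional LCD's liquid-crystal layer
# can only block so much of a global backlight, so roughly
#   black level = peak level / native contrast ratio.

def lcd_black_level(peak_nits, native_contrast=3000):
    """Black level of a globally backlit LCD at a given peak brightness."""
    return peak_nits / native_contrast

for peak in (300, 600, 1000):
    print(f"peak {peak:>4} nits -> black {lcd_black_level(peak):.2f} nits")
# peak  300 nits -> black 0.10 nits
# peak  600 nits -> black 0.20 nits
# peak 1000 nits -> black 0.33 nits
#
# Cranking the backlight for brighter highlights raises the blacks too.
# An OLED pixel that is simply off emits ~0 nits regardless of peak,
# which is why per-pixel (or locally dimmed) displays handle HDR better.
```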
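And a sketch of point 2, using the same simplified 100-nit, gamma-2.4 SDR model as above and an assumed 1,000-nit HDR display: if a player naively rescaled SDR’s 0-255 onto the display’s full range, everything that was merely “white” in the source would land at the panel’s peak.

```python
# Point 2: why SDR content needs real processing, not a linear stretch.
# Numbers are illustrative: SDR graded for ~100-nit white, HDR panel
# peaking at 1000 nits.

SDR_WHITE = 100.0   # nits the SDR grade assumed for code 255
HDR_PEAK = 1000.0   # nits the HDR display can reach

def naive_stretch(code8):
    """Wrong: map SDR 0-255 straight onto the display's full HDR range."""
    return HDR_PEAK * (code8 / 255.0) ** 2.4

def sensible_map(code8):
    """Better: keep SDR white at (roughly) its intended brightness."""
    return SDR_WHITE * (code8 / 255.0) ** 2.4

drywall = 240  # a bright white wall in an SDR movie
print(naive_stretch(drywall))   # ~865 nits: the wall blazes like a lamp
print(sensible_map(drywall))    # ~86 nits: looks like... a white wall
```

Real players do more than this (tone-mapping curves, per-scene metadata), but the asymmetry is the point: HDR formats carry enough information to place diffuse white and highlights separately, while a bare 0-255 signal does not.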
