How can Dolby Vision and the like provide more saturation and more brightness vs SDR? Why can’t SDR just tell the monitor to go to its maximum brightness or use the most saturated colors it can produce?

In all the “demos” they show a slightly desaturated / dull / dark image as the SDR side and the opposite as Dolby Vision (or whatever).

While I understand that DV can provide more resolution (e.g. finer / more gradations between shades or colors), I don’t understand what stops SDR from simply asking for maximum pure red (as in 255:0:0 in RGB). I understand that the equivalent in DV would be something like 1023:0:0, but if both are mapped to the maximum red a monitor can produce, I don’t get how DV can be brighter or more saturated.
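A quick sketch of that premise in plain Python (the values are just normalized code values for illustration, not anything from a real video pipeline). It shows that bit depth alone only changes the step size between shades, not the endpoints, since both maximum code values normalize to the same signal level:

```python
# Both 8-bit SDR and 10-bit HDR code values normalize to the same 0.0-1.0 range.
MAX_8BIT = 255    # maximum 8-bit code value (SDR)
MAX_10BIT = 1023  # maximum 10-bit code value (HDR)

print(255 / MAX_8BIT)    # 1.0 -> "maximum red" in 8-bit
print(1023 / MAX_10BIT)  # 1.0 -> "maximum red" in 10-bit: same endpoint

# What the extra bits buy is a finer step between adjacent shades:
print(1 / MAX_8BIT)      # ~0.00392 per step
print(1 / MAX_10BIT)     # ~0.00098 per step
```

So the question stands: if the endpoints are the same, where does the extra saturation and brightness come from? The answers below address that.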

7 Answers

Anonymous

This will be slightly oversimplified for ELI5 purposes.

This involves not just SDR vs HDR but also the question of color spaces. When we talk about SDR for TV, we usually assume a particular color space, like BT.601, which was used for SD TV, or BT.709, which is used for HD. Computer images usually use sRGB or Adobe RGB. These color spaces map an RGB value to the actual wavelengths of light you should get; in other words, when a pixel is R=255, what exact shade of red does that refer to? HDR uses yet another color space called Rec. 2100, whose primaries are much wider apart, so a larger range of colors can be represented. That’s why HDR can show a “deeper red” than SDR can. Here’s a comparison:

https://en.wikipedia.org/wiki/Rec._2100

Edit:
The link I previously included was the wrong spec, but it has a better chart comparing the other color spaces:
https://en.wikipedia.org/wiki/Rec._2020
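To make the “same RGB value, different physical color” point concrete, here is a rough sketch in Python. The 3x3 matrices are the standard linear-RGB-to-CIE-XYZ matrices for sRGB/BT.709 primaries and BT.2020 primaries (which Rec. 2100 adopts), both with a D65 white point:

```python
# Convert "maximum red" (linear RGB = 1,0,0) to CIE xy chromaticity
# under two different sets of primaries. Same code value, different color.

# Standard linear RGB -> XYZ matrices (D65 white point).
SRGB_TO_XYZ = [
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
]
BT2020_TO_XYZ = [
    [0.6370, 0.1446, 0.1689],
    [0.2627, 0.6780, 0.0593],
    [0.0000, 0.0281, 1.0610],
]

def to_xy(matrix, rgb):
    """Map linear RGB through the matrix to XYZ, then to xy chromaticity."""
    X, Y, Z = (sum(row[i] * rgb[i] for i in range(3)) for row in matrix)
    total = X + Y + Z
    return X / total, Y / total

max_red = (1.0, 0.0, 0.0)
print("sRGB/BT.709 red:", to_xy(SRGB_TO_XYZ, max_red))   # ~(0.640, 0.330)
print("BT.2020 red:    ", to_xy(BT2020_TO_XYZ, max_red))  # ~(0.708, 0.292)
```

BT.2020’s red primary sits noticeably further out on the CIE diagram, which is why a display that can actually reproduce it shows a more saturated red for the “same” maximum code value.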
