How can Dolby Vision and the like provide more saturation and more brightness vs SDR? Why can’t SDR just tell the monitor to go to its maximum brightness or use the saturated colors that it can?


In all “demos” they show a slightly desaturated / dull / dark image as the SDR and the opposite as Dolby Vision (or whatever).

While I understand that DV can provide more resolution (e.g. finer / more grades between shades or colors), I don’t understand what stops SDR from simply asking for maximum pure red (as in 255:0:0 in RGB). I understand that the equivalent in DV would be something like 1023:0:0, but if both are mapped to the maximum red a monitor can produce, I don’t get how DV can be brighter or more saturated.
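(A sketch of the point in the question, not from the original post: the 8-bit and 10-bit "maximum" codes normalize to the same signal value, 1.0. What differs is the step size between adjacent codes and, as the answers below explain, what the signaled standard says 1.0 means on the display.)

```python
# Sketch: compare 8-bit SDR and 10-bit HDR code values as normalized signals.
def normalize(code, bit_depth):
    """Map an integer code value to a 0.0-1.0 signal level."""
    return code / (2 ** bit_depth - 1)

sdr_max = normalize(255, 8)     # 1.0 -- both maxima are the same fraction
hdr_max = normalize(1023, 10)   # 1.0
sdr_step = normalize(1, 8)      # coarser steps between adjacent shades
hdr_step = normalize(1, 10)     # roughly 4x finer steps
```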


7 Answers

Anonymous

> While I understand that DV can provide more resolution (e.g. finer / more grades between shades or colors), I don’t understand what stops SDR from simply asking for maximum pure red (as in 255:0:0 in RGB).

Because if the video file is SDR, then the monitor sees that and knows the color space is Rec.709. When the monitor sees an HDR video file, it switches modes to Rec.2020. It also knows that the luminance encoding changes as well: pure white in SDR is mastered around 100 nits, whereas HDR's PQ encoding can signal up to 10,000 nits.
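To make the luminance point concrete, here is a sketch (my illustration, not from the answer) comparing how the same full-scale signal maps to light output under SDR's BT.1886-style gamma versus HDR's PQ transfer function (SMPTE ST 2084). The constants are the published ST 2084 values; the 100-nit SDR peak is the conventional reference, not a hard limit of real displays.

```python
# Sketch: the same 0-1 signal means very different luminance in SDR vs PQ HDR.

def sdr_luminance(signal, peak_nits=100.0):
    """BT.1886-style gamma 2.4, scaled to a ~100 nit reference white."""
    return peak_nits * signal ** 2.4

def pq_luminance(signal):
    """PQ EOTF (SMPTE ST 2084): maps a 0-1 signal to absolute nits, up to 10,000."""
    m1 = 2610 / 16384
    m2 = 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2 = 2413 / 4096 * 32
    c3 = 2392 / 4096 * 32
    p = signal ** (1 / m2)
    return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)
```

A full-scale SDR signal comes out at 100 nits, while a full-scale PQ signal comes out at 10,000 nits; a half-scale PQ signal already sits around 92 nits, because PQ spends most of its code range on the bright end.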

You could theoretically take a 1080p SDR video and alter the data to be HDR, but it would take a lot of effort to keep it from looking terribly inaccurate.
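Part of why such a conversion is tricky: "maximum red" in Rec.709 is not the same color as "maximum red" in Rec.2020. A sketch (my illustration, using the rounded linear-light conversion matrix from ITU-R BT.2087) shows that the Rec.709 red primary lands well inside the wider Rec.2020 gamut:

```python
# Sketch: express the Rec.709 red primary in Rec.2020 coordinates.
REC709_TO_REC2020 = [
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
]

def convert(rgb, matrix):
    """Multiply a linear-light RGB triple by a 3x3 conversion matrix."""
    return [sum(m * c for m, c in zip(row, rgb)) for row in matrix]

pure_709_red = convert([1.0, 0.0, 0.0], REC709_TO_REC2020)
# pure_709_red is roughly [0.63, 0.07, 0.02] -- not [1, 0, 0], i.e. not the
# most saturated red a Rec.2020 display can show.
```

This is why an SDR 255:0:0 and an HDR 1023:0:0 are genuinely different colors even on the same panel: the standard the signal is tagged with tells the display which physical color the maximum code stands for.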
