In all “demos” they show a slightly desaturated / dull / dark image as the SDR and the opposite as Dolby Vision (or whatever).
While I understand that DV can provide more resolution (e.g. finer / more grades between shades or colors), I don’t understand what stops SDR from simply asking for maximum pure red (as in 255:0:0 in RGB). I understand that the equivalent in DV would be something like 1023:0:0, but if both are mapped to the maximum red a monitor can produce, I don’t get how DV can be brighter or more saturated.
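To spell out the arithmetic behind my confusion, here’s a tiny sketch (a simplification that assumes a plain linear mapping to one panel’s peak; I know real displays and transfer functions don’t work that way, and the 500-nit figure is made up for illustration):

```python
# Hypothetical panel peak brightness, purely for illustration (not a spec value).
PANEL_PEAK_NITS = 500

sdr_max_red = 255 / 255    # "maximum red" as an 8-bit SDR code, normalized
dv_max_red = 1023 / 1023   # "maximum red" as a 10-bit Dolby Vision code, normalized

# Both normalize to 1.0, so on the same panel both would request the same
# brightest red the hardware can produce.
print(sdr_max_red * PANEL_PEAK_NITS)  # 500.0
print(dv_max_red * PANEL_PEAK_NITS)   # 500.0
```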
It doesn’t. Not necessarily.
You need all the pieces. Brightness and contrast constraints are only part of the requirements for HDR / Dolby Vision certification.
You could have a regular SDR monitor that produces colors just as bright and dark as HDR, of course. Earlier OLED sets would probably fit the bill there brightness-wise even though they don’t support all the other specifications.
For me the appeal of HDR was always in the color detail – finer steps between colors. I watch a lot of sci-fi, and the terrible color banding in space shots has always bothered me. I was so excited to switch to an HDR TV. Sadly, convenience won out in my household, and my switch to primarily consuming streaming video happened at about the same time. So I’ll be forever plagued by video compression that negates much of the benefit of the improved physical device.
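To put rough numbers on “finer steps”, here’s a small sketch of why banding shows up in smooth, dark areas like space shots (linear coding is assumed for simplicity; real video uses nonlinear transfer functions, but the idea that fewer steps means wider bands is the same):

```python
import numpy as np

# Quantize the same subtle dark gradient at 8 and 10 bits and count how many
# distinct output levels survive across a 1920-pixel-wide row.
gradient = np.linspace(0.02, 0.07, 1920)  # gentle dark-to-slightly-less-dark ramp

for bits in (8, 10):
    levels = 2**bits - 1
    distinct = len(np.unique(np.round(gradient * levels)))
    print(f"{bits}-bit: {distinct} distinct steps "
          f"(roughly {1920 // distinct} pixels per visible band)")
```

At 8 bits only around a dozen distinct codes span the whole gradient, so each step covers a wide stripe of pixels – that’s the banding you see. At 10 bits there are roughly four times as many steps, so each band is much narrower and harder to notice.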