You basically answered your own question.
The difference is that they're not only adding more steps (higher bit depth), but also widening the brightness range those steps cover. So (1,1,1) in an HDR standard can be much brighter than (1,1,1) in, say, sRGB.
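As a toy illustration (the peak brightness numbers here are assumptions: roughly 100 nits for a typical SDR display, and 10,000 nits for the PQ curve HDR10 uses):

```python
# Toy sketch: the same normalized white (1,1,1) maps to very different
# absolute brightness depending on which standard interprets it.
# Peak values are illustrative assumptions, not spec-exact figures.
white = (1.0, 1.0, 1.0)

peak_nits = {
    "sRGB (typical SDR display)": 100,
    "HDR10 (PQ curve maximum)": 10_000,
}

for standard, peak in peak_nits.items():
    brightness = max(white) * peak  # fully-on channels hit the display peak
    print(f"{standard}: (1,1,1) -> about {brightness:g} nits")
```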
Also, adding more in-between steps isn't strictly required for HDR per se; it's just that the typical 8 bits per channel (256 levels) is already stretched thin even in sRGB (you get banding because the steps are too coarse). With a higher dynamic range to express, it only gets worse, because the same number of steps has to span a much wider brightness range. So almost all HDR standards use at least 10 bits per channel (1024 levels) or more.
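A back-of-envelope sketch of why that is. The peak luminances below are illustrative assumptions, and real standards encode nonlinearly (gamma for sRGB, PQ for HDR10) rather than linearly, but the counting argument holds either way: the more nits the same number of code values must span, the coarser each step gets.

```python
# Back-of-envelope: how coarse are the brightness steps at each bit depth?
# Assumes a simple linear coding over an illustrative peak luminance;
# real standards use nonlinear curves to spend steps where vision is
# most sensitive, but more range still demands more steps.

def step_size_nits(bits: int, peak_nits: float) -> float:
    """Brightness difference between adjacent code values, in nits."""
    levels = 2 ** bits  # 8 bits -> 256 levels, 10 bits -> 1024 levels
    return peak_nits / (levels - 1)

for bits, peak in [(8, 100), (8, 1000), (10, 1000), (12, 10000)]:
    print(f"{bits:2}-bit over {peak:5} nits: "
          f"{step_size_nits(bits, peak):.3f} nits per step")
```

Stretching 8 bits over 1000 nits makes each step about ten times coarser than over 100 nits, which is exactly where the banding comes from; bumping to 10 bits brings the step size back down.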