I assume you mean video HDR, not the photography HDR effect created through editing.
HDR includes a few things:
**10-bit color**: Standard (SDR) video is 8-bit, meaning each of red/green/blue has 2^8 = 256 brightness levels, giving you nearly 17 million color combinations. 10-bit gives each channel 2^10 = 1024 levels, which allows for over 1 billion combinations. This means finer color accuracy and less visible banding in smooth gradients like skies.
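A quick sanity check of those numbers in Python (just arithmetic, nothing assumed beyond the bit depths above):

```python
# Number of distinct colors = (levels per channel) ** 3 channels (R, G, B)
for bits in (8, 10):
    levels = 2 ** bits    # 256 for 8-bit, 1024 for 10-bit
    colors = levels ** 3  # every R/G/B combination
    print(f"{bits}-bit: {levels} levels/channel, {colors:,} colors")

# 8-bit:  256 levels/channel,  16,777,216 colors  (~16.8 million)
# 10-bit: 1024 levels/channel, 1,073,741,824 colors (~1.07 billion)
```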
**Wider color gamut**: normal TV is Rec. 709, whereas HDR uses Rec. 2020 (though most televisions, and many cameras, only cover up to DCI-P3). [Here is the comparison](https://image.benq.com/is/image/benqco/color-gamut-8?$ResponsivePreset$&fmt=png-alpha); the biggest difference is in the greens.
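To make "wider gamut" concrete, here's a minimal sketch of how a Rec. 709 color maps into Rec. 2020 coordinates, using the commonly published linear-light conversion matrix (ITU-R BT.2087); it assumes gamma has already been removed from the values:

```python
# Rec. 709 -> Rec. 2020 conversion on linear (gamma-removed) RGB values.
# Matrix coefficients as published in ITU-R BT.2087.
M = [
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
]

def rec709_to_rec2020(rgb):
    """Map a linear Rec. 709 RGB triple onto Rec. 2020 primaries."""
    return tuple(sum(row[i] * rgb[i] for i in range(3)) for row in M)

# The purest green Rec. 709 can express lands well inside Rec. 2020 --
# the wider gamut has room beyond it, which is why the greens differ most.
print(rec709_to_rec2020((0.0, 1.0, 0.0)))  # (0.3293, 0.9195, 0.0880)
```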
**Higher nit count**: a nit is a measurement of brightness, like lumens (technically, one candela per square meter). Standard (SDR) content is mastered for a maximum of 100 nits. Dolby Vision supports up to 10,000 nits (though most televisions/projectors obviously cannot get that bright).
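For the curious: HDR10 and Dolby Vision encode those brightness levels with the PQ transfer function (SMPTE ST 2084), which maps the signal onto that 0 to 10,000 nit range. A minimal sketch of the PQ decode side, using the constants from the spec:

```python
# PQ (SMPTE ST 2084) EOTF: maps a normalized signal value (0.0-1.0)
# to absolute brightness in nits, topping out at 10,000.
m1 = 2610 / 16384          # constants defined by the ST 2084 spec
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_to_nits(signal: float) -> float:
    e = signal ** (1 / m2)
    return 10000 * (max(e - c1, 0) / (c2 - c3 * e)) ** (1 / m1)

print(round(pq_to_nits(0.5)))  # ~92 nits (roughly SDR territory)
print(round(pq_to_nits(1.0)))  # 10000 nits (the PQ ceiling)
```

Note how half the signal range only gets you to about 92 nits: PQ deliberately spends most of its code values on the dark-to-mid range where our eyes are most sensitive.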