There are a lot of technical answers here, because in the end it’s a technical issue, and the exact answer will depend on which HDR standard you’re referring to.
But ignoring those, your question almost answers itself: there isn’t one.
Let’s say you only had 1 bit, and assigned its two values to black and white, where you presume white to be 300 nits on common displays.
Now you add another bit, giving you four values. Suddenly you can have, say, 33% grey and 67% grey as well, increasing your color fidelity.
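To make that concrete, here’s a minimal sketch in Python (the 300-nit figure comes from above; the even spacing and the gamma-free mapping are my simplifications, not any real standard):

```python
# Toy 2-bit SDR encoding: four evenly spaced grey levels on a
# 300-nit display. Real standards apply a gamma curve; this doesn't.
SDR_PEAK_NITS = 300.0

def sdr_nits(code: int) -> float:
    """Interpret a 2-bit code (0..3) as 0%, 33%, 67%, 100% of SDR white."""
    return (code / 3) * SDR_PEAK_NITS

print([sdr_nits(c) for c in range(4)])  # [0.0, 100.0, 200.0, 300.0]
```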
But what if, instead, you kept the old black and white, and said the two new values should be interpreted as being ‘brighter than white’, to be displayed on displays that can go up to 600 nits?
You just invented an HDR standard. A crappy one with only 4 levels, and between drivers, GPUs, and displays, those 300-nit screens might still just display it as 4 levels of grey within their capabilities, or do tone mapping on the fly.
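Here’s the same toy ‘HDR’ interpretation as a sketch, including the naive on-the-fly tone mapping a 300-nit display might do (the 450-nit value for code 2 and the linear squeeze are my illustrative choices, not any real standard):

```python
# Toy 2-bit 'HDR' encoding: codes 0 and 1 keep their old SDR meanings,
# codes 2 and 3 mean 'brighter than white', up to 600 nits.
HDR_NITS = {0: 0.0, 1: 300.0, 2: 450.0, 3: 600.0}
ENCODED_PEAK = 600.0

def display_nits(code: int, display_peak: float) -> float:
    """Brightness a given display actually shows for a 2-bit 'HDR' code."""
    target = HDR_NITS[code]
    if display_peak >= ENCODED_PEAK:
        return target  # capable display: show exactly what was encoded
    # Naive tone mapping: linearly squeeze the 0-600 range into 0-peak.
    return target * (display_peak / ENCODED_PEAK)

print([display_nits(c, 600.0) for c in range(4)])  # [0.0, 300.0, 450.0, 600.0]
print([display_nits(c, 300.0) for c in range(4)])  # [0.0, 150.0, 225.0, 300.0]
```

On the 300-nit display the four codes collapse back into four grey levels within its capabilities, which is exactly the fallback described above.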