If it has nothing to do with brightness, how come 10-bit computer monitors aren't considered HDR?
This image is taken from an Nvidia slide.
Look at where it shows the HDR decoder. When we stream HDR content to our TV, or when the Xbox One S tries to send the signal, the source first wants to know whether the screen is actually okay to receive it. HDMI 2.0a (the port spec, not a cable) allows the source to send over additional information to check this (similar to how DRM works: it will block you if it doesn't like the screen).
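That check can be pictured as something like this. To be clear, this is purely illustrative -- the function name, dict shape, and field names are made up, not a real HDMI/EDID API -- but it's the gist of what the source is asking the display:

```python
# Hypothetical sketch of the handshake described above: before sending an
# HDR signal, the source inspects what the sink (TV/monitor) advertises.
# All field names here are illustrative, not a real EDID/HDMI API.

def sink_supports_hdr10(edid_caps: dict) -> bool:
    """Return True only if the display advertises everything HDR10 needs."""
    return (
        "ST2084" in edid_caps.get("eotfs", [])               # PQ transfer function
        and edid_caps.get("bit_depth", 8) >= 10              # 10-bit pipeline
        and edid_caps.get("colorimetry") == "BT.2020"        # wide-gamut primaries
        and edid_caps.get("accepts_static_metadata", False)  # ST 2086 block
    )

# A 10-bit monitor without the metadata path fails the last check:
monitor = {"eotfs": ["ST2084"], "bit_depth": 10,
           "colorimetry": "BT.2020", "accepts_static_metadata": False}
print(sink_supports_hdr10(monitor))  # the source refuses to send HDR
```

One "no" anywhere in that chain and the source falls back to SDR, which is exactly the behaviour being described here.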
Back to the requirements for HDR Media Profile:
1. EOTF: SMPTE ST 2084
2. Bit Depth: 10-bit
3. Color Primaries: ITU-R BT.2020
4. Metadata: SMPTE ST 2086, MaxFALL, MaxCLL
1. aka the Perceptual Quantizer (PQ). We need to be able to handle Rec. 2020 -- the wider colour gamut. This alone doesn't mean more colours on screen; it means the display can pick colours from within that space. PQ also allows content to encode luminance levels of up to 10,000 cd/m² -- not that it does, it's just that it can (nothing does, literally no TV hits this).
2. The 10-bit Monitor has this, so that's fine.
3. The 10-bit Monitor can have this also. No worries.
4. Now here's the real issue. These 10-bit monitors aren't going to know what to do with this metadata. Either the port isn't HDMI 2.0a, so it can't carry it, or the monitor simply wasn't built to verify that it can accept the metadata (it can't).
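For a rough idea of what that point-4 metadata actually contains, here's a sketch. The field names are mine, not from the spec, but the content is: SMPTE ST 2086 describes the mastering display, and MaxCLL/MaxFALL describe the content's peak and frame-average light levels.

```python
# Illustrative sketch of HDR10 static metadata (field names are assumptions).
from dataclasses import dataclass

@dataclass
class HDRStaticMetadata:
    # SMPTE ST 2086: mastering display colour volume
    primaries: tuple              # (R, G, B) chromaticity coordinates
    white_point: tuple            # (x, y)
    max_display_luminance: float  # cd/m^2
    min_display_luminance: float  # cd/m^2
    # Content light levels
    max_cll: int   # Maximum Content Light Level, cd/m^2
    max_fall: int  # Maximum Frame-Average Light Level, cd/m^2
```

A display that can't parse this block has nothing to map the signal's brightness range onto -- which is precisely what the port-level verification is checking for.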
And then there's the decoder problem. Nvidia's solution here is the Nvidia Shield, which has an HDR decoder built in. These monitors don't. That missing piece of software is part of why the Xbox One S won't send HDR content to them.
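And the PQ curve from point 1 isn't magic, by the way -- it's just a fixed formula. A minimal Python version, with the constants taken straight from ST 2084:

```python
# SMPTE ST 2084 (PQ) EOTF: maps a normalized code value in [0, 1]
# to absolute luminance in cd/m^2, topping out at 10,000.
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_eotf(signal: float) -> float:
    """Decode a PQ-encoded value (0..1) to luminance in cd/m^2."""
    p = signal ** (1 / m2)
    return 10000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

print(pq_eotf(1.0))  # the full code value decodes to 10,000 cd/m^2
```

That's the sense in which the content *can* carry 10,000 cd/m² -- the top of the curve is defined there, even though no display reaches it.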
Also, I hope you're aware that there are 10-bit monitors with higher contrast ratios than TVs. But that doesn't matter if they can't accept the signal -- and if they can't accept the signal, they aren't displaying anything.
Early 2017 is when they should start to.
When Netflix, the Xbox One S, the PS4 Pro etc. want to send this beautiful Deep Colour image to us, the receiver needs to be HDR Media Profile compliant. It's sort of like a screen needing the G-Sync board inside before you're able to turn on the feature at the OS level (or why I can't force my PC to push 144 Hz to my Surface Book's screen).
Every point you made refers to Ultra HD Premium, which is the standard that basically says: "your TV must do HDR, 4K, and not be shit".