So is the color banding from 8-bit HDR all that common, or noticeable? I mean, we still have 16.7 million colors to display with that, right?
If someone has experience with this please let us know. I can't stomach the thought of losing 4:4:4 for 10-bit color.
I really think you need to see it for yourself.
Try playing a game in 4:2:0 and you'll see that, at 4K, it doesn't really have a significant impact on quality from a normal viewing distance (4-5ft from a 55", for example).
The entire point of HDR (when used properly) is to take advantage of the wider color gamut and the greater brightness range.
Using Shadow Warrior 2 as an example, playing in 8-bit RGB mode looks AWFUL with HDR enabled in many scenes. The dramatic color gradients used in the sky, for instance, appear horribly posterized with very obvious banding. It just looks incorrect, since HDR was designed around 10 bits per channel.
What I'm saying is that 4:4:4 + 8-bit + HDR looks much worse, image quality wise, than 4:2:0 + 10-bit + HDR.
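The banding argument comes down to simple arithmetic: 8 bits gives you 256 steps per channel, 10 bits gives you 1024, and HDR stretches those steps across a much wider brightness range, so each 8-bit step becomes big enough to see. A minimal sketch of the step counts (pure Python, just for illustration):

```python
# Quantize a smooth 0..1 gradient at 8 and 10 bits per channel and
# count the distinct levels available. Fewer levels means coarser
# steps -- the posterization you see in 8-bit HDR gradients.
N = 100_000
gradient = [i / (N - 1) for i in range(N)]

for bits in (8, 10):
    levels = 2 ** bits
    steps = {round(v * (levels - 1)) for v in gradient}
    print(f"{bits}-bit: {len(steps)} distinct steps per channel")
    # 8-bit: 256 distinct steps; 10-bit: 1024 distinct steps
```

Four times the steps per channel means each gradient transition is a quarter the size, which is why 10-bit sky gradients look smooth where 8-bit ones band.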
So if I buy a 4K HDR TV, I don't have to worry about any of this?
The other setting is for people who don't have a 4K TV with HDR?
Yes, you will. All current TVs are limited by the HDMI 2.0 standard's bandwidth.
You can go 4K60 + 8 bits per channel + RGB on an HDMI 2.0 compliant display.
Or you can go 4K60 + 10 or 12 bits per channel + 4:2:0 in order to enjoy HDR properly.
No display can support 4K60 + 10-bit + RGB over HDMI right now. It's impossible due to HDMI 2.0's bandwidth limitations.
If you're concerned about this, I would wait and see what the next iteration of the standard holds and which TVs will utilize it.