I really think you need to see it for yourself.
Try playing a game in 4:2:0 and you'll see that, at 4K, it doesn't really have a significant impact on quality from a normal viewing distance (4-5ft from a 55", for example).
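If you want a feel for what 4:2:0 actually throws away, here's a rough Python sketch (purely illustrative, not any console's or TV's real pipeline, and it assumes BT.709-style luma weights): keep the luma channel at full resolution, average the two color-difference channels over 2x2 blocks, and rebuild the image. The brightness detail your eye resolves best is untouched; only the color resolution drops.

```python
# Rough sketch of 4:2:0 chroma subsampling (not any game's/TV's actual pipeline):
# full-resolution luma, chroma averaged over 2x2 blocks, then upsampled back.
# Assumes BT.709-style luma weights.
import numpy as np

def simulate_420(rgb):
    """rgb: float array (H, W, 3) in [0, 1], H and W even."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y  = 0.2126 * r + 0.7152 * g + 0.0722 * b    # luma, kept at full resolution
    cb = (b - y) / 1.8556                        # color-difference signals
    cr = (r - y) / 1.5748

    def down_up(c):
        # average each 2x2 block, then repeat it back to full size
        small = c.reshape(c.shape[0] // 2, 2, c.shape[1] // 2, 2).mean(axis=(1, 3))
        return small.repeat(2, axis=0).repeat(2, axis=1)

    cb, cr = down_up(cb), down_up(cr)
    # reconstruct RGB from full-res luma + quarter-res chroma
    r2 = y + 1.5748 * cr
    b2 = y + 1.8556 * cb
    g2 = (y - 0.2126 * r2 - 0.0722 * b2) / 0.7152
    return np.clip(np.stack([r2, g2, b2], axis=-1), 0, 1)

# only the color channels lose detail; luminance passes through untouched
test = np.random.rand(8, 8, 3)
print(np.abs(simulate_420(test) - test).max())
```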
The entire point of HDR (when used properly) is to take advantage of the wider brightness range and color gamut.
Using Shadow Warrior 2 as an example, playing in 8-bit RGB mode with HDR enabled looks AWFUL in many scenes. The dramatic color gradients in the sky, for instance, appear horribly posterized, with very obvious banding. It just looks wrong, because the game's HDR output was designed around 10 bits per channel.
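The banding comes straight from the step count. Here's a quick back-of-the-envelope sketch (just an illustration; a real HDR pipeline also applies the PQ transfer curve on top, but the core problem is how few code values 8 bits gives you across a wide gradient):

```python
# Quantize a smooth ramp to 8 and 10 bits and count how many distinct
# steps are left across a narrow slice of it (like a small patch of sky).
import numpy as np

ramp = np.linspace(0.0, 1.0, 3840)          # one smooth horizontal gradient

for bits in (8, 10):
    levels = 2 ** bits - 1
    quantized = np.round(ramp * levels) / levels
    # look at a 10%-wide slice of the gradient, roughly one region of sky
    window = quantized[(ramp > 0.40) & (ramp < 0.50)]
    print(f"{bits}-bit: {len(np.unique(window))} distinct shades in that slice")

# ~26 shades at 8 bits vs ~103 at 10 bits over the same span: each 8-bit
# step is four times wider, which is what reads as visible banding.
```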
What I'm saying is that 4:4:4 + 8-bit + HDR looks much worse, in terms of overall image quality, than 4:2:0 + 10-bit + HDR.
Yeah, you're right, I will have to see it for myself, and I appreciate your input on this. Now I just wonder when the HDMI 2.1 standard will actually arrive to allow 10-bit HDR with 4:4:4, even though current hardware like the PS4 Pro won't support it anyway. I'm leaning towards getting an HDMI 2.0a TV today and hoping it won't be outdated for a few years. But then again, maybe HDMI 2.1 will be a firmware upgrade, and we will all be safe buying a TV now...
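For what it's worth, here's the rough bandwidth math behind why 10-bit 4:4:4 at 4K60 has to wait for HDMI 2.1 (assuming the standard CTA-861 4K60 timing and the commonly quoted effective data rates; the exact figures depend on blanking and encoding overhead):

```python
# Back-of-the-envelope check of why 4K60 10-bit 4:4:4 needs HDMI 2.1.
# Assumes the standard CTA-861 4K60 timing (594 MHz pixel clock including
# blanking) and the usual effective payload rates: ~14.4 Gbps for HDMI 2.0
# (18 Gbps link, 8b/10b coding) and ~42.6 Gbps for HDMI 2.1.
PIXEL_CLOCK = 594e6          # pixels per second for 3840x2160 @ 60 Hz
HDMI_2_0 = 14.4e9            # effective payload bits per second
HDMI_2_1 = 42.6e9

def bits_per_pixel(bit_depth, chroma):
    # samples per pixel: 4:4:4 -> 3, 4:2:2 -> 2, 4:2:0 -> 1.5
    samples = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma]
    return bit_depth * samples

for depth, chroma in [(8, "4:4:4"), (10, "4:4:4"), (10, "4:2:0")]:
    rate = PIXEL_CLOCK * bits_per_pixel(depth, chroma)
    verdict = "fits HDMI 2.0" if rate <= HDMI_2_0 else "needs HDMI 2.1"
    print(f"{depth}-bit {chroma}: {rate / 1e9:.1f} Gbps -> {verdict}")

# 8-bit 4:4:4 ~14.3 Gbps (just squeezes in), 10-bit 4:4:4 ~17.8 Gbps
# (over the 2.0 ceiling), 10-bit 4:2:0 ~8.9 Gbps (plenty of headroom).
```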