
How important is 4:4:4 chroma to you?

MazeHaze

Banned
With the massive influx of TV threads due to the PS4 Pro, I thought it'd be nice to have a thread about this as well. It seems that if you're buying a TV, what you want is 4K 60Hz 4:4:4 HDR with low input lag.

The numbers from rtings.com show the KS8000 (I own this set) as having 37ms input lag at 4K60 4:4:4 with HDR, and I've seen it brought up a lot, so I figured I'd clear things up a bit. 37ms is the lag that particular display has in PC mode, which, as far as I'm aware, is the only mode that displays true 4:4:4. You can absolutely switch to game mode and get 4K60 with HDR at 22ms input lag, but the display will apparently downgrade the signal to 4:2:2.

I understand why 4:4:4 is important on a standard PC monitor, since you sit right up close to it, and perhaps even on very large screen TVs. My question is: how much does 4:4:4 truly matter on a TV that you sit 6+ feet away from?

In my experience using the KS8000 as a PC monitor from 5 or 6 feet (it's a 55-inch set), I can't tell the difference, and I'll take the lower input lag over 4:4:4 every time. If I put up a chroma subsampling test image, the last line of "the quick brown fox" looks VERY slightly blurrier, but that's it. It's honestly something I've never noticed outside of test images. I'd even argue that if you aren't using your TV as a PC monitor, it REALLY isn't a necessary feature.

Before anybody compares this to "the human eye can only see 5 fps" or any other nonsense: I can most certainly tell the difference between framerates and resolutions, no problem. However, I cannot see any difference between 4:4:4 and 4:2:2 chroma subsampling in regular content.

Does anybody else have experience with this? I would love some more input. A lot of people throw around 4:4:4 as being extremely important for a TV, but I think most of them have no idea what the term means for their actual viewing experience. I feel like this is part of what killed the hype for the Vizio P series: it can't display true 4:4:4, yet it is a fantastic television otherwise.
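
For a rough sanity check on the distance question, here's the back-of-the-napkin math (my own assumptions: a 55-inch 16:9 4K panel and the usual ~1 arcminute figure for 20/20 acuity; a sketch, not vision science):

Code:
import math

# Angular size of one 4:2:2 chroma sample at couch distance.
# Assumes a 55" 16:9 4K panel; ~1 arcminute is the usual 20/20 acuity figure.
DIAG_IN = 55
width_m = DIAG_IN * 0.0254 * 16 / math.hypot(16, 9)  # panel width in meters
luma_pitch_m = width_m / 3840                        # one pixel
chroma_pitch_m = 2 * luma_pitch_m                    # 4:2:2 halves horizontal chroma res

for dist_ft in (3, 5, 6, 8, 10):
    dist_m = dist_ft * 0.3048
    arcmin = math.degrees(math.atan2(chroma_pitch_m, dist_m)) * 60
    print(f"{dist_ft:2d} ft: one 4:2:2 chroma sample spans {arcmin:.2f} arcmin")

At 5-6 feet a chroma sample works out to roughly 1.2-1.4 arcminutes, right around the resolution limit, and it drops below 1 by about 8 feet, which lines up with the "very slightly blurrier" result above.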
 

iTehDroiD

Neo Member
As far as I understand, 4K 60Hz 4:4:4 HDR is not possible using the current HDMI standard. Since HDR10 uses 10-bit color, HDMI 2.0 can only transmit a 4K 60Hz 10-bit signal with chroma subsampling enabled.
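
The arithmetic behind that, for anyone curious (a sketch assuming the standard CTA 4400x2250 total timing for 4K60 and HDMI's 10 bits on the wire per 8 bits of data; not an official capability table):

Code:
# Sanity-check the HDMI 2.0 limit (18 Gbit/s TMDS, 8b/10b encoding).
# Assumes the standard CTA-861 total timing for 3840x2160 @ 60Hz: 4400x2250.
H_TOTAL, V_TOTAL, FPS = 4400, 2250, 60
TMDS_LIMIT_GBPS = 18.0

def tmds_gbps(bits_per_component, chroma):
    # Bits per pixel on the wire: 4:4:4 carries 3 full components per pixel,
    # 4:2:2 packs into the same-size container as 8-bit 4:4:4 (up to 12-bit),
    # and 4:2:0 halves the effective pixel rate.
    if chroma == "4:4:4":
        bpp = 3 * bits_per_component
    elif chroma == "4:2:2":
        bpp = 3 * 8  # fixed container regardless of 8/10/12-bit depth
    else:  # "4:2:0"
        bpp = 3 * bits_per_component / 2
    return H_TOTAL * V_TOTAL * FPS * bpp * 10 / 8 / 1e9  # 8b/10b overhead

for bits, chroma in [(8, "4:4:4"), (10, "4:4:4"), (10, "4:2:2"), (10, "4:2:0")]:
    rate = tmds_gbps(bits, chroma)
    verdict = "fits" if rate <= TMDS_LIMIT_GBPS else "exceeds the limit"
    print(f"4K60 {bits}-bit {chroma}: {rate:5.2f} Gbit/s -> {verdict}")

If those timing assumptions hold, 4K60 on HDMI 2.0 tops out at 8-bit for 4:4:4/RGB (17.82 of 18 Gbit/s), while 10-bit needs 4:2:2 or 4:2:0 - which matches the signal options people actually see.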
 

shockdude

Member
Imo 4:4:4 is nice but not essential. It's disappointing to see certain colors turn blurry at the edges, but it's not as critical as other factors like input lag and contrast.
 

MazeHaze

Banned
Yep no 4:4:4 with HDR.

Technically, this is false. You can output an 8-bit HDR signal, like the Xbox One S does, for instance. Also, I can display Shadow Warrior 2 at 4K60 4:4:4 with HDR and it still looks fantastic. 8-bit does introduce some slight color banding in the sky, but I see that at 10-bit 4:2:2 as well.
 
As far as I understand, 4K 60Hz 4:4:4 HDR is not possible using the current HDMI standard. Since HDR10 uses 10-bit color, HDMI 2.0 can only transmit a 4K 60Hz 10-bit signal with chroma subsampling enabled.
I've seen this said, but it doesn't make sense. The HDMI 2.0 spec has enough bandwidth to do 10-bit 4K60 at 4:4:4. And there are reviews out there of TVs that talk about their use as monitors.

In the end it shouldn't matter all that much, though. 4:4:4 is most necessary on quite thin elements like text. It's thus important for PC use in general, but not for most games specifically. Power lines, bits of foliage, and particles may have a bit of extra shimmer with 4:2:0, but only when very tiny (and AA will help). That's why UHD Blu-ray was content to go with 4:2:0.
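
If anyone wants to see the thin-element effect rather than argue about it, here's a toy round trip in Python/numpy (my own quick sketch with BT.601-style constants and a box-filter 4:2:0, not any TV's actual pipeline):

Code:
import numpy as np

# Toy 4:2:0 round trip: luma kept at full res, chroma averaged over 2x2 blocks.
def rgb_to_ycbcr(rgb):
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b
    return y, (b - y) * 0.564, (r - y) * 0.713

def ycbcr_to_rgb(y, cb, cr):
    r = y + 1.403 * cr
    b = y + 1.773 * cb
    g = (y - 0.299 * r - 0.114 * b) / 0.587
    return np.clip(np.stack([r, g, b], axis=-1), 0, 1)

# Worst case for chroma: 1-pixel red strokes on a blue background.
img = np.zeros((8, 8, 3))
img[..., 2] = 1.0        # blue background
img[:, ::2, 2] = 0.0
img[:, ::2, 0] = 1.0     # every other column is a pure red "stroke"

y, cb, cr = rgb_to_ycbcr(img)
for c in (cb, cr):  # subsample chroma 2x2 (4:2:0), then nearest-neighbor upsample
    blocks = c.reshape(4, 2, 4, 2).mean(axis=(1, 3))
    c[:] = blocks.repeat(2, axis=0).repeat(2, axis=1)

out = ycbcr_to_rgb(y, cb, cr)
print("max per-channel error:", np.abs(out - img).max())  # ~0.6 of full scale

The 1-pixel strokes come back as muddy purple, which is exactly the fringing in the text comparison images; widen the strokes to 2 pixels aligned to the 2x2 grid and the error collapses to about zero.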
 

dr_rus

Member
Loss of chroma detail is most apparent on small elements with high contrast, like any kind of text, which is why it's immediately noticeable on the PC desktop, in a browser, or anywhere else with text. With video it's almost unnoticeable. The same is kinda true for games - kinda, because some games do have text in them, and the general loss of color detail leads to artifacts which may in turn increase aliasing.

Some pictures:

[attached comparison images: image.axd, Roboto_Comparison_400x.png, Hair_Comparison_400x.png, Colorcomp.jpg, pi0Yt.png]


As far as I understand, 4K 60Hz 4:4:4 HDR is not possible using the current HDMI standard. Since HDR10 uses 10-bit color, HDMI 2.0 can only transmit a 4K 60Hz 10-bit signal with chroma subsampling enabled.

I've been reading up on this, and it seems to be HDMI cable related - mostly their length, as the people who report being able to use RGB or YCbCr 4:4:4 with 4K HDR @ 60fps have very short HDMI cables.
 

MazeHaze

Banned
I've seen this said, but it doesn't make sense. The HDMI 2.0 spec has enough bandwidth to do 10-bit 4K60 at 4:4:4. And there are reviews out there of TVs that talk about their use as monitors.

In the end it shouldn't matter all that much, though. 4:4:4 is most necessary on quite thin elements like text. It's thus important for PC use in general, but not for most games specifically. Power lines, bits of foliage, and particles may have a bit of extra shimmer with 4:2:0, but only when very tiny (and AA will help). That's why UHD Blu-ray was content to go with 4:2:0.

So 4:2:2 as a middle ground is more than acceptable?

I feel like 4:4:4 is this secret thing people seem to absolutely need, even though it's not a huge deal for consoles, or for PC users sitting far away.

I also don't think you're right about HDMI bandwidth. My Nvidia card will not let me do 4K60 4:4:4 10-bit all at once; I have to lower one of the settings. Once I'm at 4K60, 10-bit isn't even an option in the Nvidia control panel.
 

MazeHaze

Banned
Loss of chroma detail is most apparent on small elements with high contrast, like any kind of text, which is why it's immediately noticeable on the PC desktop, in a browser, or anywhere else with text. With video it's almost unnoticeable. The same is kinda true for games - kinda, because some games do have text in them, and the general loss of color detail leads to artifacts which may in turn increase aliasing.

Some pictures:

[attached comparison images: image.axd, Roboto_Comparison_400x.png, Hair_Comparison_400x.png, Colorcomp.jpg, pi0Yt.png]




I've been reading up on this, and it seems to be HDMI cable related - mostly their length, as the people who report being able to use RGB or YCbCr 4:4:4 with 4K HDR @ 60fps have very short HDMI cables.

Right, this all makes sense, but I don't think it really matters on a TV. I'm viewing this thread on a PC hooked up to my 4K TV, and switching between 4:2:2 and 4:4:4 yields no apparent difference from 5 feet on a 55-inch.
 

MCN

Banned
Oh god, is this another one of those number things that's apparently suddenly super-important and requires me to buy another new TV?
 

MazeHaze

Banned
Oh god, is this another one of those number things that's apparently suddenly super-important and requires me to buy another new TV?

No, it's one of those same numbers, but one that I see nobody bring up even though it's apparently super important. I see this brought up in PS4 Pro threads, even though I don't think you even get a 4:4:4 signal from a console.
 

MCN

Banned
No, it's one of those same numbers, but one that I see nobody bring up even though it's apparently super important. I see this brought up in PS4 Pro threads, even though I don't think you even get a 4:4:4 signal from a console.

Can we just go back to using bits? I understood those.
 

adamsapple

Or is it just one of Phil's balls in my throat?
I have no idea what that is. I'll be happy enough to get a 4K signal with the Pro, period.
 

MazeHaze

Banned
4:2:2 gets you lower input lag? Is it noticeable?

On my TV (Samsung KS8000) you can still leave your GPU setting at 4:4:4; the TV just only displays a true 4:4:4 image in PC mode, from what I'm told, which has a lag of 37ms. If you put the TV into game mode, you don't have to change GPU settings; it apparently just downconverts the signal to 4:2:2. And yes, 37ms to 22ms is noticeable to me; 4:4:4 to 4:2:2 is not.

444 is much clearer

it's a pretty big difference even on a web page like this (or anything with text) whether I'm 12 ft or 5 ft away

But why? How big is your TV? How far do you sit from it? Can you tell the difference between 4:2:2 and 4:4:4 outside of test images?
 

jett

D-Member
On my TV (Samsung KS8000) you can still leave your GPU setting at 4:4:4; the TV just only displays a true 4:4:4 image in PC mode, from what I'm told, which has a lag of 37ms. If you put the TV into game mode, you don't have to change GPU settings; it apparently just downconverts the signal to 4:2:2. And yes, 37ms to 22ms is noticeable to me; 4:4:4 to 4:2:2 is not.

I guess it's on a per-TV basis. My TV (Sony W655) supports 4:4:4 in game mode. I have it connected to my PC with my card set to 12-bit color, too, and I don't notice any input lag.
 

x3sphere

Member
I use 4:4:4 on my C6 OLED; the difference between 4:4:4 and 4:2:2 is very apparent, mostly on text. I sit about 5 ft away.
 

MaLDo

Member
What games use that font size at 4K?

So, no games will use red color? Ok.

That picture is a test to show how 4:2:2 affects colors and details. It's text because it's easier to show that you can't read a thing. The same problems will be present without text.
 

MazeHaze

Banned
Ah, I was searching for this image, you beat me to it.

For regular usage it's not really important - say, watching a TV show or the news - but when reading is required, it can really mess things up.

OK, but this is my TV at 4:2:2. Don't mind the weird red and blue colors at the bottom; they look like normal red and blue in real life.


[photo: TmRno4v.jpg]
 

MaLDo

Member
OK, but this is my TV at 4:2:2. Don't mind the weird red and blue colors at the bottom; they look like normal red and blue in real life.


[photo: TmRno4v.jpg]


Your TV is scaling and blurring the zoomed picture, so part of the problem is mitigated. For native-resolution pictures with fine detail, the problem will appear.

I mean, you can barely see the separation between the letters even with that massive zoom.

Please do the same test, but with the picture at 1:1 pixels (no zoom).
 

MazeHaze

Banned
Your TV is scaling and blurring the picture, so part of the problem is mitigated. For native-resolution pictures with fine detail, the problem will appear.

This is at 2160p. It just looks weird because of the camera exposure or something; in person, the blue text on red looks exactly blue, and the red text on blue looks exactly red.
 

shockdude

Member
OK, but this is my TV at 4:2:2. Don't mind the weird red and blue colors at the bottom; they look like normal red and blue in real life.


[photo: TmRno4v.jpg]

Your TV is scaling and blurring the picture, so part of the problem is mitigated. For native-resolution pictures with fine detail, the problem will appear.
That's not TV scaling, that's Windows scaling. Look at the size of those taskbar icons.
Right click the desktop, open Display Settings, change the scaling to 100%, reboot, and try again. You should be able to see the difference then.
 

Flandy

Member
Am I correct in assuming that I'm unable to get 4:4:4 at 4K 60Hz 10-bit with HDR? It would be limited to 4:2:2?
What would be the better option for my TV (Samsung KS8000) if I sit a few feet away from it?
4K 60Hz 10-bit HDR 4:2:2, or 4K 60Hz 8-bit HDR 4:4:4?
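
On the 8-bit half of that trade-off, here's a rough look at HDR banding using the PQ curve HDR10 is built on (a sketch assuming full-range code values and a ~200-nit sky tone; real displays and games also dither, which hides some of this):

Code:
# Step size in nits between adjacent code values under the ST 2084 (PQ) EOTF.
def pq_eotf(v):  # code value v in [0, 1] -> luminance in nits
    m1, m2 = 2610 / 16384, 2523 / 32
    c1, c2, c3 = 3424 / 4096, 2413 / 128, 2392 / 128
    p = v ** (1 / m2)
    return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

for bits in (8, 10):
    n = 2 ** bits
    # find the code value closest to a mid-bright ~200-nit sky tone
    v = min(range(n - 1), key=lambda i: abs(pq_eotf(i / (n - 1)) - 200))
    step = pq_eotf((v + 1) / (n - 1)) - pq_eotf(v / (n - 1))
    print(f"{bits}-bit: ~{step:.1f} nits per code step around 200 nits")

Roughly 4x coarser steps at 8-bit (~7.5 vs ~1.9 nits), which is the sky banding mentioned earlier - visible on smooth gradients, usually masked by dithering and grain.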
 

MaLDo

Member
This is at 2160p. It just looks weird because of the camera exposure or something; in person, the blue text on red looks exactly blue, and the red text on blue looks exactly red.


The picture is obviously zoomed, c'mon. I can count four pixels where there should be one.
 
So, no games will use red color? Ok.

That picture is a test to show how 4:2:2 affects colors and details. It's text because it's easier to show that you can't read a thing. The same problems will be present without text.

Of course it looks like shit. I'm not disagreeing with that.

What I'm saying is that the test doesn't reflect real world use, unless you use Windows at a low DPI setting, or a game with a UI that doesn't scale text with the resolution.
 

MaLDo

Member
That's not TV scaling, that's Windows scaling. Look at the size of those taskbar icons.
Right click the desktop, open Display Settings, change the scaling to 100%, reboot, and try again. You should be able to see the difference then.

Correct. Obviously, what I wanted to say was not

Your TV is scaling and blurring the picture and ..

but

Your TV is showing a scaled and blurred picture and ...

sorry :p
 

MazeHaze

Banned
imgur keeps fucking up when I try to upload for some reason, but...

When I set Windows scaling to 100 percent, 4:2:2 does clearly look worse than 4:4:4 on the last two lines, though not completely unreadable like the examples posted previously. From my couch I can't tell the difference, though, mostly because the image takes up about an eighth of my screen. I also can't think of any time I would set Windows scaling to 100 percent on a TV viewed from the couch.

My original question still stands: is this a make-or-break feature for a TV?
 