
How important is 4:4:4 chroma to you?

mrklaw

MrArseFace
So 4:4:4 isn't a major issue that people should hold out on buying a 4K HDR TV for, hoping for new models to have it?
 
So 4:4:4 isn't a major issue that people should hold out on buying a 4K HDR TV for, hoping for new models to have it?

Do you want to use it as a monitor for anything other than gaming and media? If not, then go for it now. If so, weigh your options and decide if it's worth waiting. With a cheap enough set you can always sell and upgrade later, too.
 

dark10x

Digital Foundry pixel pusher
RTINGS really dropped the ball with their input lag table, as I feel the labeling is HIGHLY misleading since they do not mention that it is tested using 8 bits per channel.

It's easy to bring up situations in which 4:2:0 looks bad, but due to the insanely high pixel density of 4K TVs combined with normal viewing distances, it's not really a huge issue.

It only really becomes distracting if you're using a 4K display as a PC monitor for very fine, pixel-width text. In that case, just go for 4K60 RGB at 8-bit. That's all you need for PC usage.

It was a huge issue on 1080p displays due to the lower pixel count but, at 4K, the loss in color resolution is much less noticeable while viewing content.
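
To make the fuzzy-text effect concrete, here is a rough numpy sketch of what 4:2:0 does (assuming full-range BT.709 conversion and a crude nearest-neighbour chroma upscale; real TVs and GPUs use filtered resampling and limited-range signals, so treat it as an illustration only):

# Rough simulation of 4:2:0 chroma subsampling on an RGB image (values 0..1).
import numpy as np

def rgb_to_ycbcr(rgb):
    # BT.709 full-range RGB -> Y, Cb, Cr planes
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    return y, (b - y) / 1.8556, (r - y) / 1.5748

def ycbcr_to_rgb(y, cb, cr):
    r = y + 1.5748 * cr
    b = y + 1.8556 * cb
    g = (y - 0.2126 * r - 0.0722 * b) / 0.7152
    return np.clip(np.stack([r, g, b], axis=-1), 0.0, 1.0)

def simulate_420(rgb):
    # Luma stays at full resolution; chroma gets one sample per 2x2 block,
    # then is upscaled back with nearest-neighbour (the crude part).
    y, cb, cr = rgb_to_ycbcr(rgb)

    def upscale(c):
        c = np.repeat(np.repeat(c[::2, ::2], 2, axis=0), 2, axis=1)
        return c[:y.shape[0], :y.shape[1]]

    return ycbcr_to_rgb(y, upscale(cb), upscale(cr))

# One-pixel-wide red stroke on black: the colour edge gets smeared,
# while a purely grey edge (luma only) would survive untouched.
img = np.zeros((8, 8, 3))
img[3, 2:6, 0] = 1.0
print(np.abs(simulate_420(img) - img).max())   # > 0: chroma detail was lost

In normal video the smeared chroma hides behind the full-resolution luma, which is the point above; it's on single-pixel coloured text (PC desktop use) that the fringing becomes obvious.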

Technically that's false. You can output an 8-bit HDR signal, like the Xbox One S does, for instance. Also, I can display Shadow Warrior 2 at 4K 60 4:4:4 with HDR and it still looks fantastic. I guess 8-bit introduces some slight color banding in the sky, but I see that at 10-bit 4:2:2 as well.
Shadow Warrior 2 only functions properly when using 12-bit 4:2:0 mode. Every single other option will result in color banding. If you use 10-bit 4:2:2, you will see color banding. That is the fault of the game itself.

Also, if you run 4:2:0 in SW2 outside of 12-bit mode in HDR, you'll see rainbow artefacts on text which completely disappear when the game is displaying properly.

8-bit HDR is worthless going forward since it defeats the entire PURPOSE of it.
 

III-V

Member
RTINGS really dropped the ball with their input lag table, as I feel the labeling is HIGHLY misleading since they do not mention that it is tested using 8 bits per channel.

8-bit HDR is worthless going forward since it defeats the entire PURPOSE of it.

Is it accurate, as some have said, that the HDMI 2.0 cable (18 Gbit/s) is not enough for 4K 60 fps 10-bit RGB w/ HDR?

Does the PS4 Pro attempt to output this, or does the console output 10-bit YUV 4:2:0 when HDR is enabled?
 

Xhaner5

Neo Member
HDR doesn't actually mean anything; the industry did it again. It's just a marketing moniker.

Now, I don't think 8-bit can be considered HDR. 10-bit can be, but I've seen information suggesting that Rec. 2020 might need 12-bit to be fully utilized. None of the techies I asked could give me a straight answer; they had plenty to say, but no answer.

All my research shows that the relationship between bit depth and gamut is actually very loose; there is really no correlation. A TV can support a 12-bit signal and still have a poor gamut.

There isn't actually a standard that anybody respects at all. The Rec. 2020 gamut is a recommendation, but no TV in existence can fully cover it; the last I've seen, only multi-thousand-dollar TVs approach about 75% of the Rec. 2020 gamut.

So even if most of your TVs have an HDR sticker on them, that may only mean they can accept at least a 10-bit signal. They support the signal in the sense that you get a working image rather than an error, but that doesn't mean they can display many more colors; some TVs are very deceptive about how much they can actually display.

So just keep in mind: HDR is not a standard. It's a label not to be taken that seriously, anyone can put anything behind it, and 8-bit HDR is definitely fake HDR.

Gamut also really differs from screen to screen, and the way this industry works, most TVs and monitors are horribly calibrated, if at all; the production lines are built for speed, not quality. Unless you have a premium screen and a written seal of quality from the manufacturer for that specific serial number confirming it was fully calibrated, you're probably not looking at real colors. It's just like the wild nature of the web, where stuff breaks all the time.
 

dark10x

Digital Foundry pixel pusher
Is it accurate, as some have said, that the HDMI 2.0 cable (18 Gbit/s) is not enough for 4K 60 fps 10-bit RGB w/ HDR?

Does the PS4 Pro attempt to output this, or does the console output 10-bit YUV 4:2:0 when HDR is enabled?
Yes. HDMI 2.0 is not enough for 4K60 10-bit RGB. PS4 Pro offers an RGB mode and a 420 mode for this reason.
 

chrislowe

Member
OK, found this on my Samsung UHD: "UHD 50P/60P 4:4:4 and 4:2:2 signals. Please remove your HDMI cable before using this."

No idea what this is about. Should I enable it?

It does not say that I should put the HDMI cable back in again ;)
 

dark10x

Digital Foundry pixel pusher
OK, found this on my Samsung UHD: "UHD 50P/60P 4:4:4 and 4:2:2 signals. Please remove your HDMI cable before using this."

No idea what this is about. Should I enable it?

It does not say that I should put the HDMI cable back in again ;)
Yes. It's on my OLED as well. You have to enable support for those modes for each input in my case.

If your actual cables don't support high bandwidth, enabling this can cause corruption.
 

mrklaw

MrArseFace
Do you want to use it as a monitor for anything other than gaming and media? If not, then go for it now. If so, weigh your options and decide if it's worth waiting. With a cheap enough set you can always sell and upgrade later, too.

No, just gaming and TV/movies. And the prices are so much lower than previous TVs I've bought that I can upgrade again in 2-3 years' time if necessary.
 

mrklaw

MrArseFace
Yes. It's on my OLED as well. You have to enable support for those modes for each input in my case.

If your actual cables don't support high bandwidth, enabling this can cause corruption.

Which specific OLED model do you have? Do you have a video online, or a post here/on DF with a review/impressions of it? I'm curious about the balance between the great blacks/infinite contrast/no local dimming limitations, versus the slightly slower input lag, above-black issues, and lower brightness for HDR (especially when watching in a normally lit room).
 

III-V

Member
HDR doesn't actually mean anything; the industry did it again. It's just a marketing moniker.

Now, I don't think 8-bit can be considered HDR. 10-bit can be, but I've seen information suggesting that Rec. 2020 might need 12-bit to be fully utilized. None of the techies I asked could give me a straight answer; they had plenty to say, but no answer.

All my research shows that the relationship between bit depth and gamut is actually very loose; there is really no correlation. A TV can support a 12-bit signal and still have a poor gamut.

There isn't actually a standard that anybody respects at all. The Rec. 2020 gamut is a recommendation, but no TV in existence can fully cover it; the last I've seen, only multi-thousand-dollar TVs approach about 75% of the Rec. 2020 gamut.

So even if most of your TVs have an HDR sticker on them, that may only mean they can accept at least a 10-bit signal. They support the signal in the sense that you get a working image rather than an error, but that doesn't mean they can display many more colors; some TVs are very deceptive about how much they can actually display.

So just keep in mind: HDR is not a standard. It's a label not to be taken that seriously, anyone can put anything behind it, and 8-bit HDR is definitely fake HDR.

Gamut also really differs from screen to screen, and the way this industry works, most TVs and monitors are horribly calibrated, if at all; the production lines are built for speed, not quality. Unless you have a premium screen and a written seal of quality from the manufacturer for that specific serial number confirming it was fully calibrated, you're probably not looking at real colors. It's just like the wild nature of the web, where stuff breaks all the time.

Ultra HD Premium is a standard, as are Dolby Vision and HDR10. Now, a manufacturer may brand almost anything as HDR if it supports some part of these standards.

To be branded as Ultra HD Premium, a set must cover more than 90% of DCI-P3, with support for Rec. 2020 signals, 10-bit processing, and 1,000 nits peak brightness. Those are about the best you can get today.

In a few years we should start seeing displays that support the full wide color gamut and 4,000 nits peak brightness with 12-bit color. Yes, you will need 12-bit for full Rec. 2020; it's important at high luminance levels.

Until then, if you want a standard, go with Ultra HD Premium or Dolby Vision.

Regarding color calibration, many of these new sets are quite accurate out of the box, with a delta E under 3-5 after setting the proper color temperature and gamma. Greyscale is also pretty good. It's not like years past, where you might be off by more than 15. We can't really calibrate HDR yet, either.
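
For anyone wondering what those delta E figures measure: it's just a distance between the colour a set was asked to show and the colour it actually shows. Here is a minimal sketch using CIE76, the simplest variant; calibration reports usually quote fancier formulas like CIEDE2000, and the "measured" values below are made up purely for illustration:

import math

def delta_e_76(lab1, lab2):
    # CIE76: Euclidean distance between two (L*, a*, b*) colours
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(lab1, lab2)))

target   = (53.2, 80.1, 67.2)   # approximate L*a*b* of sRGB pure red
measured = (54.0, 77.5, 65.0)   # hypothetical measurement off a TV
print(round(delta_e_76(target, measured), 2))   # ~3.5, i.e. in the 3-5 range mentioned above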

Of course, you can have a higher bit depth without supporting a wider color gamut; even the PS3 supported Deep Color.

Here is a white paper that does a nice job explaining bit depth and its importance in relation to HDR.
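
On the bit depth point, a quick way to see why HDR pushes past 8 bits is to look at how big one quantization step of the PQ curve (SMPTE ST 2084, the transfer function HDR10 uses) is at high brightness. A small sketch using the constants from the public ST 2084 definition; the exact step sizes are approximate and only meant to show the trend:

# Size of one code-value step of the PQ (ST 2084) curve near 1000 nits,
# at different bit depths. Bigger steps at high luminance = visible banding.
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(nits):                       # luminance -> 0..1 signal level
    y = (nits / 10000.0) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

def pq_decode(signal):                     # 0..1 signal level -> luminance
    p = signal ** (1 / m2)
    return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

for bits in (8, 10, 12):
    levels = 2 ** bits - 1
    code = round(pq_encode(1000.0) * levels)   # code value closest to 1000 nits
    step = pq_decode((code + 1) / levels) - pq_decode(code / levels)
    print(f"{bits}-bit: one step near 1000 nits is roughly {step:.1f} nits")

The 8-bit steps come out about four times coarser than the 10-bit ones, which is the earlier point about 8-bit HDR defeating the purpose; 12-bit shrinks them further still, which is why it keeps coming up for full Rec. 2020 and very bright material.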
 
Yes. HDMI 2.0 is not enough for 4K60 10-bit RGB. PS4 Pro offers an RGB mode and a 420 mode for this reason.
Do you know why this is? I posted last page that the math indicates HDMI 2.0 should have the bandwidth to do it. It seems they engineered the spec to be capable, but then restricted it from its full capacity.
 

max-pain

Member
Do you know why this is? I posted last page that the math indicates HDMI 2.0 should have the bandwidth to do it. It seems they engineered the spec to be capable, but then restricted it from its full capacity.

18 Gbps is with encoding overhead included. Without it, the usable rate is 14.4 Gbps, and you need 14.93 Gbps for 3840x2160 at 60 Hz in 10-bit RGB.
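
In other words, as a back-of-the-envelope check of those numbers (counting active pixels only and ignoring blanking, audio and HDCP overhead):

raw_link = 18.0                      # Gbps, HDMI 2.0 maximum signalling rate
usable = raw_link * 8 / 10           # 8b/10b encoding overhead -> 14.4 Gbps

width, height, fps = 3840, 2160, 60
bits_per_pixel = 3 * 10              # RGB / 4:4:4 at 10 bits per channel
needed = width * height * fps * bits_per_pixel / 1e9

print(f"usable: {usable:.2f} Gbps, needed: {needed:.2f} Gbps")
# usable: 14.40 Gbps, needed: 14.93 Gbps -> 4K60 10-bit RGB doesn't fit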
 

Dimmuxx

The Amiga Brotherhood
The PS4 Pro will most likely output 4K @ 60 Hz 10-bit 4:2:0 when in HDR mode. It probably supports RGB or 4:4:4 if you don't use HDR. If you have a PSVR passthrough connected, it will only output 8-bit 4:2:0.
 

televator

Member
How do TVs handle the transparency channel? The question just struck me right now. Games technically have a 4th component, e.g. 4:4:4:4 or 4:2:2:4.

Anyway, IMO, bit depth is more important than 4:2:2:4 vs. 4:4:4:4.
 
The fact that this issue exists from a standards perspective is extremely perplexing. How did they allow this limitation to be built into the standard?
 
I understand why 4:4:4 is important on a standard PC monitor, as you're right up close to it, and perhaps even on very large screen TVs.

I use a ~40" TV that doesn't have 4:4:4 support. I sit too close to it, in the same position as I did with my 27" monitor. I have great vision, like long walks on the beach, and like having the back of my neck stroked... I honestly have no complaints about fuzzy text or anything of the sort. I was well aware of what 4:4:4 was before buying this TV and second-guessed my decision every step of the way until I set it up.
 

Dunkley

Member
I didn't know this was a thing (although I'd heard of chroma before, I didn't know it was something like this), but this explains a ton about why some text looked really fucked up when I hooked up my PC to my TV.
 

max-pain

Member
I'm not sure I follow. The 14.93 Gbps number is for full 10-bit data. Are you saying 10-bit color also requires another 20% bandwidth on top of that? What for?

No, it has nothing to do with it being 10-bit color. The data has to be encoded on one side (the HDMI output) and decoded on the other (the HDMI input), and 8b/10b encoding is used for that. 8b/10b has nothing to do with 10-bit (or 8-bit, etc.) color; it's just a data encoding method (Serial ATA, for example, uses the same technique).
 
No, it has nothing to do with it being 10-bit color. The data has to be encoded on one side (the HDMI output) and decoded on the other (the HDMI input), and 8b/10b encoding is used for that. 8b/10b has nothing to do with 10-bit (or 8-bit, etc.) color; it's just a data encoding method (Serial ATA, for example, uses the same technique).
Ah, okay. That makes sense, thanks for the explanation.
 

III-V

Member
The fact that this issue exists from a standards perspective is extremely perplexing. How did they allow this limitation to be built into the standard?

I agree that it is a bit self-defeating, but generally speaking a standard is put in place to define limitations and minimum compliance. It's clear they weren't overly concerned with future-proofing.

Typically a 'reference' goes above and beyond the standard.

If I had to guess, I would think it likely comes down to $.

Another equally perplexing standard is Dolby Vision; although I do applaud it because it is trying to future-proof itself, no current sets can fully comply with it.
 

MazeHaze

Banned
From what I saw on RTINGS, 4:4:4 does not add input lag on FHD TVs.
It does if only one mode supports it, and that mode has a higher input lag than game mode.

Also, here is the 4:2:2 image unscaled on my TV. It looks way sharper in 4:4:4; it's also reallllly tiny lol.

[image: UXF0W8h.jpg]

Also, I never see anything like this in games or browsing, only in this test image.
 

televator

Member
4:4:4 in and of itself should not add lag. In fact, it should theoretically reduce it, because it circumvents your TV's need to upsample subsampled 4:2:2 or 4:2:0 chroma back to RGB. Essentially, it cuts out some processing.

Any lag that is detected should be due to some other processing inherent to the picture mode of the TV.
 

MazeHaze

Banned
4:4:4 in and of itself should not add lag. In fact, it should theoretically reduce it, because it circumvents your TV's need to upsample subsampled 4:2:2 or 4:2:0 chroma back to RGB. Essentially, it cuts out some processing.

Any lag that is detected should be due to some other processing inherent to the picture mode of the TV.
It isn't about 4:4:4 adding lag; it's that only PC mode displays 4:4:4, and that mode has a higher lag than game mode.
 
Yeah, I've seen that matrix and I believe it. I just don't understand why it's the case. That same HDMI 2.0 spec sets max bandwidth at 18 Gbps. For 10-bit color in 4:4:4 (i.e. full data for every subpixel), that's 30 bits per pixel. At 60Hz, you need 1800 bits per pixel per second. A 4K frame is 8,294,400 pixels. So 4:4:4 10-bit color at 4K60 should require 14.93 Gbps, right? This leaves 3.07 Gbps headroom in the HDMI 2.0 spec. I couldn't locate data on HDMI bandwidth needs for audio, but I can't imagine 3 Gbps is somehow insufficient, even for many-channel hi-fidelity sound.

So why is the spec limited to lower video settings? The cable seems to support the bandwidth to go higher.

There are many variables, audio being one as you said, and you're also forgetting HDCP 2.2; who knows how much is reserved for the encryption? Another variable is HDR metadata and Dolby Vision. I guess with all of those, goodbye bandwidth?
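
For reference, here is the same active-pixels-only arithmetic run across the formats discussed in the thread, against the ~14.4 Gbps left after 8b/10b encoding. It ignores blanking, audio, HDCP and metadata, and HDMI also packs 4:2:2 differently on the wire, so treat it as a rough guide rather than the spec's own accounting:

USABLE = 18.0 * 8 / 10                                    # ~14.4 Gbps after 8b/10b
SAMPLES = {"4:4:4/RGB": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}  # samples per pixel

def gbps(bit_depth, fmt, w=3840, h=2160, fps=60):
    # Raw video payload for 4K60 at the given bit depth and chroma format
    return w * h * fps * bit_depth * SAMPLES[fmt] / 1e9

for depth, fmt in [(8, "4:4:4/RGB"), (10, "4:4:4/RGB"),
                   (10, "4:2:2"), (10, "4:2:0"), (12, "4:2:0")]:
    need = gbps(depth, fmt)
    verdict = "fits" if need <= USABLE else "does not fit"
    print(f"4K60 {depth:>2}-bit {fmt:<9} needs {need:5.2f} Gbps -> {verdict}")

Which lines up with what the thread already worked out: 10-bit RGB/4:4:4 is the one combination that doesn't fit, while the PS4 Pro's 4:2:0 HDR mode and Shadow Warrior 2's 12-bit 4:2:0 mode both do.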
 