
What are the pros and cons of higher colour bit depth and RGB colour space?

NinjaBoiX

Member
I’ve got my XSX and PS5 hooked up to my LG C2 42”, and I was just playing about in the settings and noticed the XSX is set to 8-bit by default.

I think I understand the benefits of a higher bit depth, more colours and less banding, but are there any downsides? I’ve put it on 10-bit but honestly I can’t really see much difference, so will a higher bit depth have any adverse effects?

Should I just put it on 12-bit for the best results?

What about the colour space option? PC RGB over Standard, I assume?
 

Hoddi

Member
I'd just keep it at 8-bit. AFAIK, this is for SDR only and requires explicit application support for 10/12-bit SDR. Nvidia has a good writeup on it here if you want a better understanding of the feature, but application support on Xbox is slim.

Important note: for this feature to work, the whole display path, starting from the application's display rendering/output, through the Windows OS desktop composition (DWM), to the GPU output, must support and be configured for 10-bit (or more) processing. If any link in this chain doesn't support 10-bit (for example, most Windows applications and SDR games display in 8-bit), you won't see any benefit.
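
To put rough numbers on what the extra bits actually buy (a back-of-the-envelope sketch in Python, nothing Xbox-specific): each extra bit doubles the shades per channel, so the steps in a smooth gradient get correspondingly finer, which is where the reduced banding comes from.

```python
# Shades per channel, total colours, and the size of one quantisation step
# on a 0.0-1.0 brightness ramp for each bit depth mentioned in the thread.
for bits in (8, 10, 12):
    levels = 2 ** bits               # shades per channel
    total = levels ** 3              # combinations across R, G and B
    step = 1.0 / (levels - 1)        # size of one step on the ramp
    print(f"{bits:2d}-bit: {levels:5d} shades/channel, "
          f"{total:,} colours, step = {step:.5f}")

#  8-bit:   256 shades/channel, 16,777,216 colours, step = 0.00392
# 10-bit:  1024 shades/channel, 1,073,741,824 colours, step = 0.00098
# 12-bit:  4096 shades/channel, 68,719,476,736 colours, step = 0.00024
```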

RGB vs Standard also only applies to SDR, but most TVs will suffer from black crush when set to RGB, as it's mainly intended for monitors. Your LG TV should have an option for 'Black Level High/Low', and those need to be matched with RGB/Standard if you want to avoid black crush. This also applies to OLEDs, as it's about signaling rather than how good the black levels are.

You can still try Low/RGB and see if you like it. There's a calibration page in Settings where you can test it out. But you'll likely see some black crush in the grey scale.
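
If it helps to see the black crush as numbers, here's a tiny hypothetical sketch (the 16-235 'limited' and 0-255 'full' figures are the standard video levels; the function name is just for illustration): with Black Level Low the TV assumes a limited-range signal and stretches 16-235 out to the full panel range, so a full-range RGB signal has all of its near-black shades pushed below black and clipped.

```python
def tv_expand_limited(v):
    """What a TV set to 'Black Level Low' (expecting limited range) does to an input value."""
    out = (v - 16) * 255 / (235 - 16)          # stretch 16-235 out to 0-255
    return max(0, min(255, round(out)))        # clip to what the panel can show

# Near-black shadow detail in a full-range (PC RGB) signal:
shadow_detail = [0, 4, 8, 12, 16, 20, 32]
print([tv_expand_limited(v) for v in shadow_detail])
# -> [0, 0, 0, 0, 0, 5, 19]: the first five shades collapse into the same black
```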
 

Pimpbaa

Member
If you just mean the setting for the dashboard, it doesn’t change much (the dashboard is SDR by default); maybe it affects non-gaming apps, I dunno. You definitely don’t want to be limiting HDR games to 8-bit color, however.
 

Dice

Pokémon Parentage Conspiracy Theorist
10-bit will only really show an improvement, in a subtle way, in lighting effects if they make the game for it, which is probably only happening if it's an HDR Dolby Vision game. There are only a few of those, and in my experience Dolby Vision adds input lag that isn't worth it. The reason I say it's only going to be lightly noticeable with lighting is that making a game for this feature sort of needlessly increases memory burden and only serves a very small percentage of gamers, so most devs aren't making assets that really make use of it. That said, you aren't really going to hurt yourself by setting it higher. It just won't be utilized most of the time.

If you use the PC RGB color space on a TV, even an OLED, it's going to look bland. The contrast and color grading are going to be really smooth, but it's going to lack that pop you generally want, even after you change the settings for blacks. It has to do with the color space the assets were designed against. An analogy: if you made a pattern to work in centimeters and then someone applied it in inches, it's going to be stretched out. PC RGB works by a different logic, with saturation of colors into greys, so instructions made without those greys in mind are going to get washed out when you apply them to that system. A flipside of this would be me playing Elder Scrolls Online on PC. I don't like the sort of saturation they decided on, so I like to run it in the Adobe color space for extra pop and warmth.
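
One concrete way the picture can end up looking flat is the same Full/Limited range mismatch as above, just running in the other direction; a small sketch under that assumption:

```python
# A TV set to 'Black Level High' treats the input as full range (0-255) and
# shows it as-is, but a limited-range (Standard) signal only ever uses 16-235.
# Black then sits at a dark grey and white never reaches full brightness, so
# the whole image loses contrast and looks bland.

signal_black, signal_white = 16, 235     # limited-range video levels
panel_full_scale = 255                   # what the TV treats as 100%

print(f"black shown at {signal_black / panel_full_scale:.0%} of full brightness")   # ~6%
print(f"white shown at {signal_white / panel_full_scale:.0%} of full brightness")   # ~92%
```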

Here is a good video for Xbox:
 

Pimpbaa

Member
10-bit will only really show an improvement, in a subtle way, in lighting effects if they make the game for it, which is probably only happening if it's an HDR Dolby Vision game. There are only a few of those, and in my experience Dolby Vision adds input lag that isn't worth it. The reason I say it's only going to be lightly noticeable with lighting is that making a game for this feature sort of needlessly increases memory burden and only serves a very small percentage of gamers, so most devs aren't making assets that really make use of it. That said, you aren't really going to hurt yourself by setting it higher. It just won't be utilized most of the time.

You don’t need Dolby Vision (HDR10 is just fine for games) to get a game that makes good use of HDR (and 10-bit). Also, the lack of color banding with 10-bit is noticeable in almost every game that uses HDR (except for those few games that just use an 8-bit color space within the 10-bit container). 10-bit has been utilized in almost every game with HDR since the PS4 Pro, which was most games back then and almost every game now. It's easier to name new games that don't have HDR and 10-bit color.
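
On that parenthetical, here's a small hypothetical sketch of why an 8-bit image carried in a 10-bit signal still bands: re-encoding doesn't add any shades, it just spaces the same 256 values four codes apart.

```python
ramp = [i / 1023 for i in range(1024)]    # a smooth 0..1 gradient

# Quantise it natively to 10-bit, versus quantising to 8-bit first and then
# padding those values out into the 10-bit container.
native_10bit   = sorted({round(v * 1023) for v in ramp})
via_8bit_first = sorted({round(v * 255) * 4 for v in ramp})

print(len(native_10bit), "distinct steps natively")           # 1024
print(len(via_8bit_first), "distinct steps via 8-bit first")  # 256 -- the same bands as plain 8-bit
```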
 

Dice

Pokémon Parentage Conspiracy Theorist
You don’t need Dolby Vision (HDR10 is just fine for games) to get a game that makes good use of HDR (and 10-bit). Also, the lack of color banding with 10-bit is noticeable in almost every game that uses HDR (except for those few games that just use an 8-bit color space within the 10-bit container). 10-bit has been utilized in almost every game with HDR since the PS4 Pro, which was most games back then and almost every game now. It's easier to name new games that don't have HDR and 10-bit color.
I don't really notice less banding. Rather, usually what I see is the SDR has a narrower dynamic range in order to prevent banding, then when it's HDR mode they expand the dynamic range so it has more detail in lights and darks and looks richer in the color gradient. SDR looks more game-y and garish while HDR looks more natural, but they do pretty well to hide banding. I see banding most often in shadow maps that were stretched way too far, which happens in HDR games as well.
 

Pimpbaa

Member
I don't really notice less banding. Rather, usually what I see is the SDR has a narrower dynamic range in order to prevent banding, then when it's HDR mode they expand the dynamic range so it has more detail in lights and darks and looks richer in the color gradient. SDR looks more game-y and garish while HDR looks more natural, but they do pretty well to hide banding. I see banding most often in shadow maps that were stretched way too far, which happens in HDR games as well.

I notice banding in SDR games in anything with a dynamic time of day, particularly evening and night. Even in games with a static time of day, any night scenes (like the last Yakuza game) have significant banding in the sky. Also, in games that use only 8-bit color in HDR mode, I definitely notice banding everywhere. The game where I noticed this most was Cyberpunk 2077 with film grain turned off (film grain can somewhat hide color banding). I agree with most of what you said, however.
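
The film grain point is essentially dithering: a little noise added before the image is quantised breaks the clean steps of a gradient into fine speckle the eye averages out. A toy sketch with a deliberately coarse quantiser so the effect is obvious:

```python
import random
random.seed(0)

LEVELS = 32                                    # coarse on purpose, like a badly banded sky
ramp = [i / 999 for i in range(1000)]          # smooth 0..1 gradient, 1000 "pixels"

plain    = [round(v * (LEVELS - 1)) for v in ramp]
dithered = [round(v * (LEVELS - 1) + random.uniform(-0.5, 0.5)) for v in ramp]

def transitions(values):
    """Count the hard value changes along the quantised gradient."""
    return sum(a != b for a, b in zip(values, values[1:]))

print("hard steps, plain:   ", transitions(plain))      # 31 wide, clearly visible bands
print("hard steps, dithered:", transitions(dithered))   # hundreds of tiny ones that read as grain
```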
 
8-bit sucks, and it's about time games and Windows standardized 10-bit as the new default format. I'm sick of the awful sunset gradient banding in the sky.
 