BluRayHiDef
I'm currently using a TCL 55R635 as a 4K monitor; it's a 55" 4K TV with a 120Hz panel. However, it accepts a 120Hz signal only at 1440p and below (with or without HDR enabled); at 4K, it's limited to 60Hz. According to its official specifications, the TV supports Variable Refresh Rate (VRR) and Auto Low Latency Mode (ALLM). I don't know how to verify when VRR is active, but I do know when ALLM is active, because when I start a game on my Xbox One X, it displays a notification that it has activated the TV's game mode.
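For context, here's a rough sketch in Python of why 1440p at 120Hz and 4K at 60Hz both work while 4K at 120Hz doesn't, assuming the TV's inputs are HDMI 2.0 with its 600 MHz TMDS clock ceiling (an assumption on my part; the spec sheet doesn't say). The blanking totals are approximations based on CTA-861 and CVT reduced-blanking timings.

# Approximate pixel clocks for the modes in question.
# Timing totals (active + blanking) are approximate: 4K uses the
# standard CTA-861 timing; 1440p120 assumes reduced blanking.
MODES = {
    "3840x2160 @ 60Hz":  (4400, 2250, 60),   # CTA-861: 594 MHz
    "2560x1440 @ 120Hz": (2720, 1525, 120),  # CVT-RB (approx.): ~498 MHz
    "3840x2160 @ 120Hz": (4400, 2250, 120),  # ~1188 MHz
}

HDMI20_MAX_PIXEL_CLOCK_MHZ = 600  # TMDS character-rate limit for HDMI 2.0

for name, (h_total, v_total, hz) in MODES.items():
    clock_mhz = h_total * v_total * hz / 1e6
    verdict = "fits" if clock_mhz <= HDMI20_MAX_PIXEL_CLOCK_MHZ else "exceeds"
    print(f"{name}: ~{clock_mhz:.0f} MHz ({verdict} the HDMI 2.0 limit)")

Both 1440p120 (~498 MHz) and 4K60 (594 MHz) squeeze under the 600 MHz ceiling, while 4K120 (~1188 MHz) would need HDMI 2.1.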
Anyhow, I'd like to know what I'm missing out on in terms of image quality due to the TV's limited combinations of color format, color depth, and refresh rate, which I will explain below.
When using Windows 10, the default resolution, refresh rate, and color settings in Nvidia Control Panel are as follows:
Resolution: 4k x 2k, 3840 x 2160 (native)
Refresh rate: 60Hz
Desktop Color Depth: 32-bit
Output Color Depth: 8 bits per color
Output Color Format: RGB
Output Dynamic Range: Full
If I want to set the output color depth to 10 bits per color or 12 bits per color, I have to set the output color format to either YCbCr 4:2:2 or YCbCr 4:2:0, which sets the dynamic range to Limited.
Setting the output color format to YCbCr 4:4:4 limits the color depth to 8 bits per color and the dynamic range to Limited.
However, if I lower the refresh rate to 30Hz or lower, I can set the output color depth to 10 bits per color or 12 bits per color in combination with the YCbCr 4:4:4 color format or with the RGB color format (Limited or Full).
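To make sense of which combinations work, here's a back-of-the-envelope data-rate calculation, again assuming HDMI 2.0 ports (18 Gbps on the wire, roughly 14.4 Gbps of payload after 8b/10b encoding) and a simplified bits-per-pixel model for chroma subsampling (real HDMI carries 4:2:2 in a fixed 12-bit container, so treat these numbers as approximations):

# Approximate payload data rates at 4K. Simplified model:
# bits per pixel = samples per pixel * bit depth, where RGB/4:4:4
# carries 3 samples per pixel, 4:2:2 carries 2, and 4:2:0 carries 1.5.
PIXEL_CLOCK_HZ = {"4K @ 60Hz": 594e6, "4K @ 30Hz": 297e6}  # incl. blanking
SAMPLES_PER_PIXEL = {"RGB/4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}
HDMI20_PAYLOAD_GBPS = 18 * 8 / 10  # 18 Gbps raw minus 8b/10b overhead = 14.4

for mode, clock in PIXEL_CLOCK_HZ.items():
    for fmt, samples in SAMPLES_PER_PIXEL.items():
        for depth in (8, 10, 12):
            gbps = clock * samples * depth / 1e9
            fits = "OK" if gbps <= HDMI20_PAYLOAD_GBPS else "too much"
            print(f"{mode} {fmt} {depth}-bit: {gbps:5.1f} Gbps ({fits})")

The pattern matches what I see: at 4K60 only 8-bit RGB/4:4:4 fits under ~14.4 Gbps, 10-bit and 12-bit need chroma subsampling, and dropping to 30Hz halves the data rate and makes room for 10-bit and 12-bit RGB.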
When I set the resolution to 2560 x 1440, the refresh rate is locked to 120Hz and the default color settings that I listed above are loaded.
Also, whenever I force Windows 10 to output HDR, my TV's HDR mode is activated but the default color settings listed above are maintained, which is puzzling because the default output color depth is 8 bits per color even though HDR is 10 bits per color (or 12 bits per color for Dolby Vision). What's the explanation for this?
I'm personally content with this display despite these limitations; however, I am curious. So, what am I missing out on? And what is the maximum bandwidth that my TV's HDMI ports can convey, based on this information?
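Here's my own rough attempt at bounding that bandwidth from the observations above, using the same simplified model (the labels are just the cases I described). The heaviest mode the TV accepts gives a lower bound, and the lightest mode it rejects gives an upper bound:

# Wire rate = payload * 10/8, since TMDS uses 8b/10b encoding.
def wire_gbps(pixel_clock_hz, samples_per_pixel, bit_depth):
    return pixel_clock_hz * samples_per_pixel * bit_depth * 10 / 8 / 1e9

heaviest_accepted = wire_gbps(594e6, 2.0, 12)  # 4K60 YCbCr 4:2:2, 12-bit
lightest_rejected = wire_gbps(594e6, 3.0, 10)  # 4K60 RGB, 10-bit

print(f"ports carry at least ~{heaviest_accepted:.1f} Gbps")  # ~17.8
print(f"ports carry less than ~{lightest_rejected:.1f} Gbps") # ~22.3

That bracket (at least ~17.8 Gbps, less than ~22.3 Gbps) is consistent with HDMI 2.0's 18 Gbps maximum, so my guess is these are HDMI 2.0 ports.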