Inspired by that video and all of the reasoned disagreement in its comments, I decided to do some experimenting of my own last night using the One X. It turns out that my receiver also displays the format of incoming signals in greater detail than the TV does (which is a shame, btw), so I experimented with the various Xbox settings and wanted to share my findings.
First, some groundwork. From what I understand, normal SDR games (i.e. not HDR) are rendered in 8-bit RGB format, while movies and other video sources are usually stored and broadcast in 8-bit YCC format. The number of bits basically translates to the number of colors that are available. With YCC you will also see notation for how much color data is transmitted to the TV (aka chroma subsampling), ranging from 4:4:4 (best) down to 4:2:0.
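To put rough numbers on both of those ideas, here's a quick Python sketch (my own arithmetic, nothing Xbox-specific): it counts the colors each bit depth can represent, and works out how much chroma data each subsampling mode keeps.

```python
# 8-bit vs 10-bit: how many colors are available per pixel.
def colors(bits_per_channel):
    # 3 channels (R/G/B or Y/Cb/Cr), each with 2^bits levels
    return (2 ** bits_per_channel) ** 3

print(f"8-bit:  {colors(8):,} colors")   # 16,777,216
print(f"10-bit: {colors(10):,} colors")  # 1,073,741,824

# Chroma subsampling: fraction of color (Cb/Cr) samples kept.
# In J:a:b notation over a 2-row, J-pixel-wide block: 'a' chroma
# samples in row 1, 'b' in row 2, vs J luma samples per row.
def chroma_fraction(j, a, b):
    return (a + b) / (2 * j)

for mode in [(4, 4, 4), (4, 2, 2), (4, 2, 0)]:
    frac = chroma_fraction(*mode)
    print(f"{mode[0]}:{mode[1]}:{mode[2]} keeps {frac:.0%} of chroma samples")
```

So 4:2:0 throws away three quarters of the color samples, but since our eyes are far more sensitive to brightness than to color, that mostly goes unnoticed in video.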
HDR content, both for games and movies (UHD Blu-ray), is all 10-bit YCC format. Due to HDMI bandwidth limitations, these sources are 10-bit 4:2:0. By default, the Xbox will automatically switch from 8-bit RGB to 10-bit YCC 4:2:0 when an HDR video or game is playing. Now for what the confusing Xbox settings do:
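If you want to see why the bandwidth math forces this, here's a back-of-the-envelope check (my own numbers, based on the standard 4K60 signal timing and HDMI 2.0's roughly 14.4 Gbps of usable video bandwidth after encoding overhead):

```python
# HDMI 2.0 tops out at 18 Gbps TMDS; after 8b/10b encoding that
# leaves roughly 14.4 Gbps for actual video data.
EFFECTIVE_GBPS = 14.4

# 4K60 signal timing includes blanking intervals: 4400 x 2250 total
# pixels per frame at 60 Hz (the standard CTA-861 timing, 594 MHz).
PIXEL_CLOCK = 4400 * 2250 * 60

def data_rate_gbps(bits_per_channel, chroma_fraction):
    # bits per pixel = one full-resolution channel plus two chroma
    # channels scaled by how many samples the subsampling mode keeps
    bpp = bits_per_channel * (1 + 2 * chroma_fraction)
    return PIXEL_CLOCK * bpp / 1e9

for name, bits, frac in [("8-bit RGB (4:4:4)", 8, 1.0),
                         ("10-bit 4:4:4",      10, 1.0),
                         ("10-bit 4:2:2",      10, 0.5),
                         ("10-bit 4:2:0",      10, 0.25)]:
    rate = data_rate_gbps(bits, frac)
    verdict = "fits" if rate <= EFFECTIVE_GBPS else "does NOT fit"
    print(f"{name}: {rate:.2f} Gbps -> {verdict} in HDMI 2.0")
```

10-bit 4:4:4 is the only mode that blows the budget at 4K60, which is why full HDR has to give up chroma resolution somewhere.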
Color Space: This ONLY affects normal SDR output. If you change this to 10-bit, it has zero effect on HDR games (which are always 10-bit by default). It simply converts your normal 8-bit RGB game signal to 10-bit YCC 4:2:0 before sending it to the display. This will not add any additional colors, and could in fact lose some color detail, since you are not feeding the TV the default RGB signal. It's an extra conversion step that is totally unnecessary for most users. So why is it even there? Well, some TVs have trouble with mode switches. You've probably noticed that when you go from SDR to HDR and vice versa, the TV blanks for a second as it adjusts to the new format. If a TV struggles with that transition, changing this setting to 10-bit means it receives a 10-bit YCC 4:2:0 signal all the time and never has to switch. The KS8000 doesn't suffer from this issue, so I believe the correct setting for the KS8000 is 8-bit (24 bits per pixel).
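To illustrate the kind of detail that extra conversion step can cost, here's a toy round trip in Python. It uses simplified full-range BT.709 math at 8 bits (a real TV signal uses limited-range scaling, and the Xbox converts to 10-bit, so treat this as an illustration of rounding loss, not an exact model), and it doesn't even account for the 4:2:0 subsampling, which discards chroma detail on top of this:

```python
# Simplified full-range BT.709 conversion (illustrative only).
def rgb_to_ycc(r, g, b):
    y  =  0.2126 * r + 0.7152 * g + 0.0722 * b
    cb = (b - y) / 1.8556 + 128
    cr = (r - y) / 1.5748 + 128
    return round(y), round(cb), round(cr)  # quantize to 8-bit integers

def ycc_to_rgb(y, cb, cr):
    r = y + 1.5748 * (cr - 128)
    b = y + 1.8556 * (cb - 128)
    g = (y - 0.2126 * r - 0.0722 * b) / 0.7152
    return round(r), round(g), round(b)

# Round-trip every pure-red level and count how many come back changed.
changed = sum(1 for v in range(256)
              if ycc_to_rgb(*rgb_to_ycc(v, 0, 0)) != (v, 0, 0))
print(f"{changed} of 256 red levels altered by the round trip")
```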
Allow YCC 4:2:2 checkbox: As I stated earlier, current HDR material is encoded as 10-bit YCC 4:2:0. Your Xbox won't magically add more color data, so why is there a 4:2:2 option? Again, this setting is for lesser TVs that have problems accepting a true 10-bit YCC 4:2:0 HDR signal. According to my tests, what actually happens when you check this box is that YCC output from the Xbox, including in HDR titles, uses 8-bit 4:2:2 instead of 10-bit 4:2:0. This represents a loss of colors and is unwanted (see the comparison below). The KS8000 does not need this help, so the checkbox should be cleared. (EDIT - It's also possible this could help if you have HDMI cables or other devices in the chain that are not performing adequately.) BTW, if you want to replicate my tests, be aware that the output format will not change while HDR software is running. You'll need to quit completely out of any game using the Xbox quick menu and restart it before changes take effect.
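For the curious, here's how the two modes compare on paper (again my own arithmetic; the Xbox UI doesn't spell any of this out):

```python
# What each HDR-capable mode actually carries per pixel.
def describe(bits, chroma_fraction, label):
    bpp = bits * (1 + 2 * chroma_fraction)
    levels = 2 ** bits
    print(f"{label}: {bpp:.0f} bits/pixel, {levels} levels per channel")

describe(10, 0.25, "10-bit YCC 4:2:0 (default HDR)")   # 15 bpp, 1024 levels
describe(8,  0.5,  "8-bit YCC 4:2:2 (checkbox on)")    # 16 bpp, 256 levels
```

So 8-bit 4:2:2 actually pushes slightly more raw data, but it trades tonal precision for chroma resolution. HDR material is graded for 1024 levels per channel, so squashing it down to 256 is where the visible loss (e.g. banding in gradients) comes from.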
TL;DR: I now believe the correct Xbox S/X settings for the KS8000 and related sets are Color Space: 24 bits per pixel (8-bit), with the Allow YCC 4:2:2 option UNchecked.