That's ... thanks. What I actually want to know is whether I can expect SC to run properly on upcoming HDR monitors, or if it's something the devs will need to address.
I figured you meant something else, but I wasn't sure.
I consider myself a techy guy, but I hadn't heard of these HDR monitors before.
The thing that makes them special seems to be 10 bits per color channel instead of 8 (the usual render target format for that is R10G10B10A2, which still fits in 32 bits per pixel by cutting the alpha channel down to 2 bits, versus the standard 8/8/8/8 layout).
The format itself has actually been part of DirectX for a while; the newer part is getting a 10-bit signal all the way out to the display, which AMD announced support for with their newest cards, and I couldn't find any comment from nVidia.
(render targets are textures that the GPU renders into, rather than reads from)
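To make that concrete, here's a minimal D3D11 sketch of what requesting such a target looks like. This is just an illustration under assumptions: the ID3D11Device* called device is presumed to already exist, and none of this is from the actual SC/CryEngine code.

```cpp
#include <d3d11.h>

extern ID3D11Device* device; // assumed to exist; not part of the original post

void CreateTenBitTarget()
{
    D3D11_TEXTURE2D_DESC desc = {};
    desc.Width            = 1920;
    desc.Height           = 1080;
    desc.MipLevels        = 1;
    desc.ArraySize        = 1;
    // 10/10/10/2 instead of the usual DXGI_FORMAT_R8G8B8A8_UNORM;
    // both are 32 bits per pixel, the alpha channel just shrinks.
    desc.Format           = DXGI_FORMAT_R10G10B10A2_UNORM;
    desc.SampleDesc.Count = 1;
    desc.Usage            = D3D11_USAGE_DEFAULT;
    desc.BindFlags        = D3D11_BIND_RENDER_TARGET | D3D11_BIND_SHADER_RESOURCE;

    ID3D11Texture2D* tex = nullptr;
    if (SUCCEEDED(device->CreateTexture2D(&desc, nullptr, &tex)))
        tex->Release(); // just a demo; a real engine would keep this around
}
```

The only real difference from an ordinary 8-bit target is the DXGI_FORMAT you ask for.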
Honestly I can't see this being anything but a flop in the gaming world, but who knows.
The thing is, these render target formats aren't just freely available for anyone to invent: they're defined by DirectX/OpenGL, and GPUs have to support the standard ones.
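Which is also why an engine can't just assume the format is usable; in D3D11 you'd ask the driver first. Again a rough sketch, with the same assumed device as above:

```cpp
#include <d3d11.h>

extern ID3D11Device* device; // same assumption as before

bool SupportsTenBitRenderTargets()
{
    UINT support = 0;
    return SUCCEEDED(device->CheckFormatSupport(DXGI_FORMAT_R10G10B10A2_UNORM, &support))
        && (support & D3D11_FORMAT_SUPPORT_RENDER_TARGET) != 0;
}
```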
For games this would mean changing all the relevant parts of the code to support the wider format, and possibly adjusting things that weren't precise enough for 10-bit output.
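A trivial (made-up) example of the kind of precision assumption that breaks: any code that quantizes or dithers in 1/255 steps has 8-bit baked into it and would waste the extra levels a 10-bit target offers. The function names here are purely hypothetical.

```cpp
#include <cmath>

float quantize8(float v)  { return std::round(v * 255.0f)  / 255.0f;  } // 256 levels
float quantize10(float v) { return std::round(v * 1023.0f) / 1023.0f; } // 1024 levels
```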
It sounds doable in terms of actual work, but I don't think many studios will even consider it while the market is this small and the extra code branches make everything more convoluted.
If I had to guess, it's not a high priority for CI / Crytek right now.