Oh come on, stop being foolish. Even so, I don't think they messed it up.
Limiting HDR to an SDR range is a "choice" instead of just shipping it as SDR, which in the end would look better on a person's display??? Again, stop being foolish. Just like movies that limit HDR to 200 nits are making an intentional choice.
It isn't JUST elevating the black levels though; this is what you seem to be missing. It's scaling an SDR bracket inside an HDR container, meaning there's no real improvement going on; it's just messing up the image by shifting the range. You keep talking about creator's intent when SDR in these games IS the better option, as it provides the exact same range, just in the proper container for the TV/monitor to handle. Just like having specifically elevated black levels is an intentional choice, etc.
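To put some numbers on what "scaling an SDR bracket in an HDR container" means: a rough sketch (illustrative figures only, not any game's actual pipeline) showing that linearly re-scaling an SDR-range signal changes absolute levels but not the contrast ratio, which is why it adds no real dynamic range:

```python
# Illustrative sketch: scaling a whole SDR-range grade up inside an
# HDR container moves the levels around but the contrast ratio
# (peak over black floor) stays exactly the same.

def contrast_ratio(black_nits, peak_nits):
    """Static contrast ratio: peak luminance divided by black floor."""
    return peak_nits / black_nits

# Hypothetical SDR-range grade: 0.05-nit blacks, 200-nit peak.
sdr_black, sdr_peak = 0.05, 200.0

gain = 2.0  # crank the slider to reach a 400-nit peak...
scaled_black = sdr_black * gain  # ...and the blacks rise to 0.1 nits
scaled_peak = sdr_peak * gain

print(contrast_ratio(sdr_black, sdr_peak))       # original ratio
print(contrast_ratio(scaled_black, scaled_peak)) # identical ratio, no new range
```

Raising the peak drags the black floor up by the same factor; a real HDR grade would extend both ends instead.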
And this is just you babbling, making excuses for CDPR. At a baseline MINIMUM, HDR needs to exceed the 200-nit range of SDR. Shifting around a 200-nit range doesn't magically make it HDR; it just cripples other areas instead. In the case of Cyberpunk, raising the levels to achieve a higher nit count drags up the black levels, and lowering it drags down the highs. A properly adjusted game can achieve a high peak range AND good black levels below 50 nits; that's the whole damn point. And really, you should stop saying things like "looks worse than SDR" when you have people in this very thread (besides me) saying the exact opposite based on testing it out for themselves. Use your eyes and judge; stop relying on youtubers to tell you what you should think.

The game goes from 0.025 nits all the way to >1000 if your display can handle it. It's also in WCG. That's as HDR as possible. If you want to restrict HDR to only arbitrary levels like 0.00 nits, then why stop there? Why not also require 4000 nits? In which case only the Dolby Pulsar counts as a "true HDR" display. Or hell, why not 10,000 nits? After all, real-life HDR measures in the 100,000s of nits, so why stop at a "lowly" 10,000? In which case everyone would be equally (un)happy, as there's no true HDR anywhere except real life. No movie, no game, no display. All fake HDR. Oh no.
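For context on the nit figures being thrown around: HDR10 signals are encoded with the PQ transfer function (SMPTE ST 2084), whose container addresses 0 to 10,000 nits. A quick sketch of the PQ inverse EOTF (constants are from the spec; the specific nit values below are just examples from this thread) shows how much of the code range different peaks actually occupy:

```python
# PQ (SMPTE ST 2084) inverse EOTF: absolute nits -> normalized signal [0, 1].
# The container spans 0..10,000 nits, so a grade capped near 200 nits
# only uses part of the available code range.

def pq_encode(nits):
    """Map absolute luminance in nits to a normalized PQ code value."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1 = 3424 / 4096
    c2, c3 = 2413 / 4096 * 32, 2392 / 4096 * 32
    y = (nits / 10000.0) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

for nits in (0.025, 200, 1000, 10000):
    print(f"{nits:>8} nits -> PQ {pq_encode(nits):.4f}")
```

10,000 nits encodes to exactly 1.0, and roughly the lower half of the curve is already spent by 100 nits, which is why the spec reserves the top half for genuine highlight range.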
Same goes for WCG: it's pointless to advertise that your product is graded in WCG if it doesn't exceed Rec.709 at ANY point!
Fact is, a lot of studios don't have the proper equipment to test HDR in-house. They don't have grading monitors, and they don't have the proper tools to test ranges and output. Instead they're doing it blind, assuming it will work in most cases until it's pointed out. The bare minimum a studio will do is shove SDR into an HDR container and make that adjustable, but that's not exactly taking advantage of what's being provided. It'd be like saying your game renders in 4K when only a 1280x720 window in the middle of the screen is visible and everything around it is a black bar. Sure, your TV is getting a 4K signal, but you're not going to call that true 4K, are you?