
Why do too many devs still implement trash HDR?

Stafford

Member
It's still being treated as an afterthought by too many developers nowadays, and not just small ones either. Correct me if I'm wrong, but didn't AC Origins and Odyssey both have great HDR?

Then came Valhalla and they totally fucked it up: colors became drab, and areas that were pitch black (no light sources) in SDR were this light grey and looked horrible. Don't get me started on how nighttime looked. The same can be said about Mirage. I tried the trial yesterday and the HDR is bullshit in it. It's pretty damn similar to Valhalla.

How do they get it right in Origins and Odyssey and then totally mess it up and never fix it for Valhalla and Mirage? If you don't know how to do it, I'd rather you not put HDR in at all; it saves me the hassle of having to go into the system settings and turn HDR10 off solely for one fucking game (on Xbox, that is).

It's a shame because I have an awesome TV for HDR, a Sony A95K, which I absolutely love except for issues that a TV of this price simply should not have, but that's a different story. It just sucks that proper HDR for every damn game is still not a given nowadays.
 
I only came here to say that Assassin's Creed: Odyssey had beautiful water until they updated it to HDR, and now it looks like shit. And you can't turn it off either (unless you cut HDR off on your Xbox and TV, but I'm not even sure that works).
 

Bojji

Member
Holy fuck, some of those motherfuckers are really incompetent or don't give a fuck. MANY games have raised black levels (why? Do they master HDR on TVs without even local dimming?) and there are examples like you mentioned where HDR looks dull and worse than SDR.

RTX HDR shows that ML can do a better job than many devs...

I'm playing Gears 5 in co-op with my friend right now (PC) and the game has phenomenal HDR; it looks absolutely mind-blowing. There are other games like that - FFVII Remake, GTA 5 (on consoles), Doom Eternal, Yakuza TMWEHN or Ori - but most games have an average HDR implementation.
 
Are you folks using dynamic tone mapping?

I had an LG C9. Valhalla was the game that brought to my attention how badly dynamic tone mapping ruins the image. Mid-tones like the grey sky are completely wrecked. You absolutely need to enable HGIG and set the in-game HDR setting appropriately (I believe it was 700-800 nits on my C9). This allows proper tone mapping and everything will look good again.

This could also be the cause of the complaints about the water in Odyssey.

PS: I just played Mirage not long ago and the HDR was good on my Sony A95K as well, although I am on a PS5. Use HGIG on an LG TV, and Gradation tone mapping on a Sony.
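Roughly what's going on, as a toy sketch (my own simplified Python, not anything from the game or LG's firmware): with HGIG the TV shows what the game sends 1:1, so the in-game peak slider is what decides how highlights get rolled off. Set it to your panel's real peak and midtones pass through untouched; leave it maxed out and the TV's dynamic tone mapping has to squeeze the whole range itself, dragging midtones like that grey sky around with it. The knee-and-shoulder curve below is an assumption for illustration, not the engine's actual tone mapper.

def rolloff_to_peak(scene_nits: float, peak_nits: float, knee: float = 0.75) -> float:
    """Map scene luminance to display luminance, rolling off above the knee."""
    knee_nits = knee * peak_nits
    if scene_nits <= knee_nits:
        return scene_nits                      # midtones pass through untouched
    # Smoothly compress everything above the knee into the remaining headroom
    excess = scene_nits - knee_nits
    headroom = peak_nits - knee_nits
    return knee_nits + headroom * (excess / (excess + headroom))

# With the in-game peak set to ~750 nits (roughly the 700-800 range I used on
# the C9), a 300-nit sky stays at 300 nits and only the bright stuff gets
# compressed toward the top.
for nits in (100, 300, 750, 1500, 4000):
    print(nits, "->", round(rolloff_to_peak(nits, peak_nits=750), 1))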
 

Killer8

Member
Some of it is likely an artistic choice. People went on and on about the Resident Evil 2 remake having raised black levels, but it was intended. If you try to 'fix' it via ReShade, then other parts of the game end up with completely crushed black levels.

Other controversial games like Starfield were intended to have a lo-fi, retro sci-fi movie look. One such movie, 2001: A Space Odyssey, got a 4K UHD transfer a few years ago and had people on Blu-ray forums complaining similarly about raised blacks instead of inky blacks. However, it was clear that the new transfer was closer to how the film originally looked.

I do prefer more conservative black levels, and if it bothers people enough they can just use the black level adjustment found on many OLED panels.
 

Gamezone

Gold Member
A good solution to this is to just turn off in-game HDR and turn on Nvidia's RTX HDR or Windows HDR on Windows 11. This usually looks better than the HDR that comes with most games.
 

Bojji

Member
Some of it is likely an artistic choice. People went on and on about the Resident Evil 2 remake having raised black levels, but it was intended. If you try to 'fix' it via ReShade, then other parts of the game end up with completely crushed black levels.

Other controversial games like Starfield were intended to have a lo-fi, retro sci-fi movie look. One such movie, 2001: A Space Odyssey, got a 4K UHD transfer a few years ago and had people on Blu-ray forums complaining similarly about raised blacks instead of inky blacks. However, it was clear that the new transfer was closer to how the film originally looked.

I do prefer more conservative black levels, and if it bothers people enough they can just use the black level adjustment found on many OLED panels.

Fuck their creative vision if it looks like shit. Just like the OP mentioned, Origins and Odyssey had much better HDR than the last two games.

All games should have the ability to display black at 0 nits; if they don't, they have broken HDR to me. There should be options to adjust everything to your liking/TV, but of course many developers are allergic to that.
 

King Dazzar

Member
AC Origins and Odyssey actually have less-than-great HDR implementations too. It's just that you can get around it and have them looking really good regardless by using adjustments. But at default they have some contrast issues and raised black levels. Dropping the brightness slider and then changing the paper white and peak sliders can help a lot.

You're going to hate me saying this, but AC Valhalla was largely improved for me by getting a display capable of higher luminance. It was the game I played for hundreds of hours when transitioning from OLED over to LCD. On OLED the APL capability wasn't strong enough and it often looked a touch too dull unless it was really sunny or I enabled DTM - but then nights were too bright. Most of that was improved by going to a high-end LCD. It's actually pretty good HDR in Valhalla, with shifting clouds changing it all the time. That said, Valhalla and many other games have poorly implemented sliders. For example, Valhalla's slider goes up to 4,000 nits, but in reality it only hits about 2,000 nits maxed out. The way forward is to always use system-level calibration and/or have accurate in-game sliders covering paper white, black level and peak 10% window.
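For anyone wondering what those three sliders actually do to the picture, here's a rough sketch (my own simplification in Python, not Ubisoft's shader - the names and the simple scale/floor/clip are assumptions): paper white scales the whole SDR-range image, the black level sets the floor, and the peak caps the highlights. A floor above zero is exactly the raised-black look, and a peak slider that claims 4,000 nits while the game tops out around 2,000 just means the top of that range is never reached.

def apply_hdr_sliders(scene: float, paper_white: float = 200.0,
                      black_floor: float = 0.0, peak: float = 800.0) -> float:
    """scene is scene-referred linear, where 1.0 = SDR diffuse white."""
    nits = scene * paper_white      # paper-white slider: exposure of the SDR range
    nits = max(nits, black_floor)   # black-level slider: a non-zero floor = raised blacks
    return min(nits, peak)          # peak slider: hard clip (real games roll off instead)

print(apply_hdr_sliders(0.0, black_floor=0.3))   # 0.3 nits: grey "pitch black" caves
print(apply_hdr_sliders(1.0))                    # 200 nits: a white UI element
print(apply_hdr_sliders(10.0, peak=800.0))       # 800 nits: clipped sun highlight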

Sony's first-party studios seem to be amongst the best at the moment. And on the Xbox side, Forza Horizon and Gears are top tier with their HDR too.
 
I still can't tell if it's devs or me screwing up HDR on PC, but I can't seem to get quality results with my LG C3.

Permutations are:
Windows HDR on/off plus in-game HDR on/off plus in-game brightness settings plus TV brightness settings.

So maybe there's a sweet spot in all of that, but as someone who's familiar with settings tweaks, it's a crapshoot game-to-game whether you can make it work as intended, or whether HDR is broken from the get-go and settings tweaking is a wild goose chase.

PC HDR needs to be standardized.
 

Bojji

Member
I still can't tell if it's devs or me screwing up HDR on PC, but I can't seem to get quality results with my LG C3.

Permutations are:
Windows HDR on/off plus in-game HDR on/off plus in-game brightness settings plus TV brightness settings.

So maybe there's a sweet spot in all of that, but as someone who's familiar with settings tweaks, it's a crapshoot game-to-game whether you can make it work as intended, or whether HDR is broken from the get-go and settings tweaking is a wild goose chase.

PC HDR needs to be standardized.

It can be messy, but overall you need HDR on in the Windows settings for most games. Using the Windows HDR Calibration tool can help (you set maximum and minimum brightness and some games use it; it's Windows 11 exclusive).

Overall PC HDR is OK; it's just that switching it on and off (I recommend "HDR tray") is annoying and should be automated.

For the TV, use HGIG and these settings (still relevant even for the C3):



In games you want maximum/peak brightness at 800 nits, paper white at ~250 nits, and minimum brightness at 0.0.
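If you're curious why that minimum brightness of 0.0 matters so much on an OLED, here's a quick sketch using the standard SMPTE ST 2084 (PQ) curve that HDR10 uses to encode nits into the signal (the constants are the published PQ values; the three settings are just the numbers above): 0 nits encodes to essentially signal 0, i.e. a pixel that's actually off, while any raised floor encodes to a non-zero signal the panel will faithfully show as grey.

m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    """Luminance in cd/m^2 -> non-linear PQ signal in [0, 1]."""
    y = max(nits, 0.0) / 10000.0
    return ((c1 + c2 * y ** m1) / (1 + c3 * y ** m1)) ** m2

for label, nits in [("minimum", 0.0), ("paper white", 250), ("peak", 800)]:
    print(f"{label:>12}: {nits:>6} nits -> PQ signal {pq_encode(nits):.3f}")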
 

Stafford

Member
I only came here to say that Assassin's Creed: Odyssey had beautiful water until they updated it to HDR, and now it looks like shit. And you can't turn it off either (unless you cut HDR off on your Xbox and TV, but I'm not even sure that works).

Wait, so even if I were to disable HDR10 in the Xbox settings, it might still not look great? Origins and Odyssey are both games that ended up in the backlog and I have yet to really start with them. But you'd think playing the game in SDR would mean it looks fine again.

Are you folks using dynamic tone mapping?

I had an LG C9. Valhalla was the game that brought to my attention how badly dynamic tone mapping ruins the image. Mid-tones like the grey sky are completely wrecked. You absolutely need to enable HGIG and set the in-game HDR setting appropriately (I believe it was 700-800 nits on my C9). This allows proper tone mapping and everything will look good again.

This could also be the cause of the complaints about the water in Odyssey.

PS: I just played Mirage not long ago and the HDR was good on my Sony A95K as well, although I am on a PS5. Use HGIG on an LG TV, and Gradation tone mapping on a Sony.

I've tried AC Valhalla with three different TVs since the game came out: an LG C9, a Samsung S95B, and now a Sony A95K. But caves and other interiors that are perfectly dark in SDR were always horribly grey in HDR, no matter which settings I used, and I have spent hours on settings, both experimenting by myself and following tips from several AV forums.

Auto HDR on BC games on Xbox is actually what has been surprising me a lot. I really like how many games look with it. Not every single one of them, but the majority I've tried for sure. Nowadays I just want to game, and if the HDR is shit I'll try a few things and then turn it off, because it's not worth wasting so much time on.

Some of it is likely an artistic choice. People went on and on about the Resident Evil 2 remake having raised black levels, but it was intended. If you try to 'fix' it via ReShade, then other parts of the game end up with completely crushed black levels.

Other controversial games like Starfield were intended to have a lo-fi, retro sci-fi movie look. One such movie, 2001: A Space Odyssey, got a 4K UHD transfer a few years ago and had people on Blu-ray forums complaining similarly about raised blacks instead of inky blacks. However, it was clear that the new transfer was closer to how the film originally looked.

I do prefer more conservative black levels, and if it bothers people enough they can just use the black level adjustment found on many OLED panels.

Yep. They absolutely went for the NASA-punk look for Starfield, but holy shit if the interiors like caves didn't look bad, man. I believe they "fixed" the HDR since then, or at least have given options to make it look better, but sheesh. It's always good to have options.

It just sucks when you've bought a new game, or you're starting one for the first time, and the splash screens already show raised black levels. I already know at that point there is going to be some bullshit!
 

proandrad

Member
Tech issues in gaming are usually due to one of three things: 1) Neglecting what is deemed least important to meet a deadline; 2) Inexperienced developers with a lack of knowledge; or 3) Someone who doesn't care and just does the minimum to get paid.
 

analog_future

Resident Crybaby
With the advent of Auto HDR and RTX HDR, I feel like fewer and fewer developers are going to invest the time and money into implementing proper native HDR and will instead rely more and more on these system-level solutions.

Which honestly isn’t the worst. Auto HDR and RTX HDR are both quite good.

I'm surprised Sony hasn't implemented some type of system-level HDR on their platforms, but I feel like it's coming sooner or later.
 

Shin-Ra

Junior Member
My recent HDR jank experience:

The Stellar Blade demo didn't enable HDR automatically, and once it was manually enabled, the menu/UI art got noticeably brighter and washed-out looking.
 

SHA

Member
It's still being treated as an afterthought by too many developers nowadays, and not just small ones either. Correct me if I'm wrong, but didn't AC Origins and Odyssey both have great HDR?

Then came Valhalla and they totally fucked it up: colors became drab, and areas that were pitch black (no light sources) in SDR were this light grey and looked horrible. Don't get me started on how nighttime looked. The same can be said about Mirage. I tried the trial yesterday and the HDR is bullshit in it. It's pretty damn similar to Valhalla.

How do they get it right in Origins and Odyssey and then totally mess it up and never fix it for Valhalla and Mirage? If you don't know how to do it, I'd rather you not put HDR in at all; it saves me the hassle of having to go into the system settings and turn HDR10 off solely for one fucking game (on Xbox, that is).

It's a shame because I have an awesome TV for HDR, a Sony A95K, which I absolutely love except for issues that a TV of this price simply should not have, but that's a different story. It just sucks that proper HDR for every damn game is still not a given nowadays.
HDR colors aren't vibrant; that's a different type of experience, especially in games, because they tend to look better with more vibrant colors.
 

Stafford

Member
HDR colors aren't vibrant; that's a different type of experience, especially in games, because they tend to look better with more vibrant colors.

Unless it's a game like Doom Eternal where the colors are fantastic. And other games as well. But take RDR2: it becomes way less colorful, but in turn has a more realistic look.
 

JohnnyFootball

GerAlt-Right. Ciriously.
I think many of you just need to go to the eye doctor. HDR looks great in games. My LG C1 and my Alienware 32-inch 4K QD-OLED look fucking fantastic. My Samsung G85 ultrawide QD-OLED, on the other hand, doesn't look quite right. I had to adjust the NVIDIA desktop settings to get it to look right.
 
It's still being treated as an afterthought by too many developers nowadays, and not just small ones either. Correct me if I'm wrong, but didn't AC Origins and Odyssey both have great HDR?

Then came Valhalla and they totally fucked it up: colors became drab, and areas that were pitch black (no light sources) in SDR were this light grey and looked horrible. Don't get me started on how nighttime looked. The same can be said about Mirage. I tried the trial yesterday and the HDR is bullshit in it. It's pretty damn similar to Valhalla.

How do they get it right in Origins and Odyssey and then totally mess it up and never fix it for Valhalla and Mirage? If you don't know how to do it, I'd rather you not put HDR in at all; it saves me the hassle of having to go into the system settings and turn HDR10 off solely for one fucking game (on Xbox, that is).

It's a shame because I have an awesome TV for HDR, a Sony A95K, which I absolutely love except for issues that a TV of this price simply should not have, but that's a different story. It just sucks that proper HDR for every damn game is still not a given nowadays.

Amen to this! Assassin's Creed Origins had one of the very best HDR implementations. A little better than Odyssey even. It grinds my gears to no end that they fucked it up in Valhalla and now Mirage.

The worst HDR debacle, though, and the one that pisses me off the most, is HITMAN 3! Fuck IO Interactive! Hitman 1 and 2 have excellent HDR, then Hitman 3 comes out using the SAME engine and the HDR is total garbage. These games are even all part of the same goddamn launcher. You'd think this dev would have just the smallest bit of pride and want all three games, especially the most RECENT GAME in the series, to look good, right? Nope, the fuckers never bothered to fix it!

That's the type of shit that really grinds my gears. No consistency, incompetence, but much worse is that they KNOW THEY FUCKED UP BUT JUST DON'T CARE.
 

Stooky

Member
The problem is there isn't an industry standard for HDR mastering. Most movies are mastered to 1,000 nits, some to 5,000. For games it can be anywhere from 1,000 to 10,000. There is no TV on the consumer market that can show 10,000 nits, yet we have those games.
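To make that concrete, here's a toy sketch (my own Python, not any TV's actual algorithm - the two strategies and the numbers are illustrative assumptions) of what a display is forced to do with content mastered way above its peak. Hard clipping keeps midtones intact but turns every bright highlight into the same flat white, while scaling the whole range down keeps highlight detail but dims everything. Every TV picks its own compromise in between, which is one reason the same game can look so different from panel to panel.

def clip(nits: float, display_peak: float) -> float:
    # Hard clip: everything above the panel's peak becomes the same flat white.
    return min(nits, display_peak)

def compress(nits: float, mastering_peak: float, display_peak: float) -> float:
    # Scale the whole range down: highlights keep detail, but midtones dim too.
    return nits * (display_peak / mastering_peak)

mastering_peak, display_peak = 10_000, 800   # a "10,000-nit" game on an 800-nit OLED
for scene in (100, 800, 2_000, 10_000):
    print(f"{scene:>6} nits mastered -> clip {clip(scene, display_peak):5.0f}"
          f" | compress {compress(scene, mastering_peak, display_peak):5.0f}")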
 

K' Dash

Member
Does HDR have a Standard now?

Last time I checked everyone was doing whatever the fuck they wanted in the TV/Monitor space.

That’s one of the main reasons it sucks.

I gave up on it.
 

ReBurn

Gold Member
Devs probably don't emphasize it because the average person doesn't care that much about the HDR implementation. It would be nice if devs cared that your OLED panel has a higher peak brightness than the sun and blacks darker than the void of space. But in the end most people don't want the best HDR implementation; they prioritize other things like the size of the panel or the refresh rate. If cheap-to-midrange panels had decent, consistent HDR capabilities and HDR were a primary driver of content sales, devs would probably care more.
 