
If you're not playing your XSX/PS5 on a proper HDR set, you are not doing the games justice

mitchman

Gold Member
Thanks but if I turn off HDR in my PS5 settings does the game still render in HDR and then downscale it or is it turned off completely?
Games supporting HDR will typically render internally in HDR and then tone map to SDR. 10-bit internal processing is not uncommon, and the extra overhead is close to insignificant.
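For anyone curious what that looks like in practice, here's a minimal sketch of the HDR-to-SDR tone-mapping step, using the simple Reinhard operator as a stand-in. Real games use their own fancier curves, and `paper_white_nits` is just an illustrative parameter here, but the shape of the pipeline is the same:

```python
# Minimal sketch of an HDR -> SDR tone-mapping step, using the classic
# Reinhard operator as a stand-in. Real engines use custom curves, but
# the idea is the same: compress a wide luminance range into 0..1.

def reinhard_tonemap(luminance_nits, paper_white_nits=100.0):
    """Map an HDR luminance value (in nits) to a 0..1 SDR value."""
    l = luminance_nits / paper_white_nits  # scale relative to SDR white
    return l / (1.0 + l)                   # compress highlights smoothly

# Highlights far above SDR white get compressed instead of clipped:
for nits in (50, 100, 1000, 4000):
    print(f"{nits:>5} nits -> {reinhard_tonemap(nits):.3f}")
```

Note how 1000 and 4000 nits still map to distinct values below 1.0 instead of both clipping to white, which is why games render in HDR internally even when outputting SDR.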
 

Ashtyr

Neo Member
Proper HDR = OLED
That is simply false, or rather it is not entirely true.
Good HDR needs both a good black level and a good brightness level.

Saying that you only get that on OLED is not true.

Basically, to have good HDR you need a good TV, whether LED or OLED.

In my case, I prefer the HDR of TVs with many nits because of the impact they produce, but that is personal taste.
 

aries_71

Junior Member
It’s not that simple. If your gaming is done close to the screen, let’s say a 27-32 inch screen, in a studio or room with artificial lighting, a bright OLED with 1000 nits will probably damage your eyes.
 
To make a long story short: I am playing my PS5 on a Quantum LED 4K TV set with over 1000 nits brightness. Colors pop and the contrast is amazing. I took my library of games over to my buddy's house, and he is playing on a 4-year-old 1080p Samsung. Holy shit, the difference is night and day. He doesn't have HDR, and games often look flat and bland in comparison. It's the equivalent of playing an Xbox 360 on a 480p flatscreen. Food for thought!
It could very well just be the difference in panel age and type. For example, an OLED TV without HDR will look better than an LCD with HDR because of its higher native contrast.
 

Raphael

Member
I don't normally like to talk about it but if you twist my arm*, it's a Sony ZD9/Z9D. It's an LCD HDR sex festival.

I have the 65", which has ~646 zones, and the 75" has ~858 zones. This crazy monster here, the 100", supposedly has ~1536 zones according to display specifications, which I think makes sense: if you divide the screen area of each of them by its zone count, you get roughly the same zone size on every set, ~2.8 sq inches:
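That back-of-the-envelope math does check out. A quick sketch (zone counts as quoted above; 16:9 panel geometry assumed, and `zone_size_sq_in` is just my name for the calculation):

```python
# Quick sanity check of the ~2.8 sq in per zone claim: 16:9 screen
# area divided by the quoted dimming-zone counts for each ZD9 size.
import math

def zone_size_sq_in(diagonal_in, zones):
    """Area of one dimming zone on a 16:9 panel, in square inches."""
    width = diagonal_in * 16 / math.hypot(16, 9)
    height = diagonal_in * 9 / math.hypot(16, 9)
    return width * height / zones

for diag, zones in ((65, 646), (75, 858), (100, 1536)):
    print(f'{diag}": {zone_size_sq_in(diag, zones):.2f} sq in per zone')
```

All three sizes come out within a few hundredths of 2.8 sq inches per zone, so the spec-sheet numbers are at least internally consistent.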



Here is Vincent talking about the 65" ZD9 vs a C7:


I got Vincent to come and calibrate it, and he ended up staying twice as long because I wouldn't stop asking him questions :messenger_tears_of_joy: He's a very nice man. The comparison above is what people are talking about when they say you get "better HDR pop" from some LCDs, but it's really limited to a very small number of sets, like the following Sonys: XE94, ZD9 and ZG9. The Panasonic DX902 and Samsung Q9FN (2018) are close, but they have some issues and fewer dimming zones. I'd put the Sony XE93, XG95 and XF90 as amazing high-end sets with amazing peak brightness, but they don't have the zone chops to compete with the sets I mentioned previously, so there will be a lot more blooming.

* This is a joke; I literally can't shut up about it, and I've owned it for nearly three (3) years now.
You got the man to calibrate your set? Wow. How much did that cost.
 

Ashtyr

Neo Member
I don't normally like to talk about it but if you twist my arm*, it's a Sony ZD9/Z9D. It's an LCD HDR sex festival.

I have the 65", which has ~646 zones, and the 75" has ~858 zones. This crazy monster here, the 100", supposedly has ~1536 zones according to display specifications, which I think makes sense: if you divide the screen area of each of them by its zone count, you get roughly the same zone size on every set, ~2.8 sq inches:



Here is Vincent talking about the 65" ZD9 vs a C7:


I got Vincent to come and calibrate it, and he ended up staying twice as long because I wouldn't stop asking him questions :messenger_tears_of_joy: He's a very nice man. The comparison above is what people are talking about when they say you get "better HDR pop" from some LCDs, but it's really limited to a very small number of sets, like the following Sonys: XE94, ZD9 and ZG9. The Panasonic DX902 and Samsung Q9FN (2018) are close, but they have some issues and fewer dimming zones. I'd put the Sony XE93, XG95 and XF90 as amazing high-end sets with amazing peak brightness, but they don't have the zone chops to compete with the sets I mentioned previously, so there will be a lot more blooming.

* This is a joke; I literally can't shut up about it, and I've owned it for nearly three (3) years now.


Nowadays the Mini-LED Samsungs, and I think the Philips, are about 800 zones at 65", 1000 at 75" and 1320 at 85".

The QN900A is 1620 at 65", 1920 at 75" and 2340 at 85".

But people still think they look like their old LED, which they will compare with their new OLED, of course.

I have been able to enjoy both a high-end LED and an OLED, and for HDR I prefer an LED with many nits.

In SDR the OLED is unbeatable, it is perfect!!! But I haven't cared about SDR for a long time.
 
That is simply false, or rather it is not entirely true.
Good HDR needs both a good black level and a good brightness level.
Wrong. Good HDR needs a good black level and good peak brightness in close proximity. A bright star against a dark sky, etc.

LCD - even with many dimming zones - cannot do a good black level and good peak brightness close together. There is blooming, even on the most expensive set. A star field looks like complete shit on every LCD.

This is a simple fact.
 

Forth

Neo Member
Games supporting HDR will typically render internally in HDR and then tone map to SDR. 10-bit internal processing is not uncommon, and the extra overhead is close to insignificant.
Thanks for this. I thought my PS5 was running slightly hotter with HDR turned on for RDR2 and other titles than it did with HDR turned off at the system-level menu. But I'm obviously imagining it.
 

Cyberpunkd

Member
To make a long story short: I am playing my PS5 on a Quantum LED 4K TV set with over 1000 nits brightness. Colors pop and the contrast is amazing. I took my library of games over to my buddy's house, and he is playing on a 4-year-old 1080p Samsung. Holy shit, the difference is night and day. He doesn't have HDR, and games often look flat and bland in comparison. It's the equivalent of playing an Xbox 360 on a 480p flatscreen. Food for thought!
And yet you are most likely playing the sound from 10W TV speakers - food for thought!
 

STARSBarry

Gold Member
I know this thread was necro'd for a specific question, but it has quickly turned into the standard "OMFG HAVE YOU SEEN OLED YOU PEASANTS",

so allow me to resurrect a classic joke from beyond the grave.

*cough*

"How do you know if someone has an OLED?

Don't worry it will be the first thing they mention"
 

K' Dash

Member
HDR is not even a standard and needs per-game calibration, which is stupid.

Of course everyone trying to justify their stupid $2000 set is going to say they met God through their TV.

120Hz and VRR is where it's at boys, HDR has always been a scam.
 
I know this thread was necro'd for a specific question, but it has quickly turned into the standard "OMFG HAVE YOU SEEN OLED YOU PEASANTS",

so allow me to resurrect a classic joke from beyond the grave.

*cough*

"How do you know if someone has an OLED?

Don't worry it will be the first thing they mention"
For real, some of us don't feel like spending $2000 on a TV.
 

JayK47

Member
This is the reason I do not have a PS5 or Series S/X, besides lack of stock. All of my monitors/TVs are 1k or 2k, and I doubt I could see the difference from my PS4 and Xbox One. Not only would I have to track down a console, I would need a new monitor/TV to play it on. I considered getting a 4K monitor, but then the video card needs to work much harder to display at native resolution, and many games would not support native anyway. I had to go through this shit with the Xbox 360: I had to buy new TVs just to play damn games. I am not wanting to do that again.
 

RafterXL

Member
This thread is illuminating. This place is like a gamer nursing home. This thread is filled with the same kind of people you see in every thread about fps, HDR, 4K, basically anything tech related: a bunch of "get off my lawn" Luddites who espouse their terrible, backwards preferences and opinions as facts.

Get with the times, grandpa. OP is right. I mean, you can enjoy gaming on any piece of shit system, or tv, but playing them on the best hardware, with the best features, is a whole different ballgame and the fact that so many here like to fight tooth and nail against these advancements is baffling.
 

Shtef

Member
I agree, OP. I have an older model Sony TV, the XF90, that supports 120Hz but without HDR. One time I wanted to try Titanfall 2 at 120Hz, so I switched off 4K and HDR and was surprised how the image looked: it was all washed out. I switched back to 4K/60/HDR and it looked amazing.
 

Reizo Ryuu

Member
and was surprised how the image looked: it was all washed out.
Then you probably need to calibrate your screen, or you enabled game mode; there's no reason for a sony tv, in standard preset, with proper calibration, to look anywhere near "washed out".
 

Shtef

Member
Then you probably need to calibrate your screen, or you enabled game mode; there's no reason for a sony tv, in standard preset, with proper calibration, to look anywhere near "washed out".
I get it, my point was that when compared with HDR, SDR looks washed out.
 

Esppiral

Member
I got a Sony Bravia XH90 and either the TV is a POS or my eyes are burnt out, because I don't see that game-changing HDR.....
 
To make a long story short: I am playing my PS5 on a Quantum LED 4K TV set with over 1000 nits brightness. Colors pop and the contrast is amazing. I took my library of games over to my buddy's house, and he is playing on a 4-year-old 1080p Samsung. Holy shit, the difference is night and day. He doesn't have HDR, and games often look flat and bland in comparison. It's the equivalent of playing an Xbox 360 on a 480p flatscreen. Food for thought!
DE27100777770209299700
Send me the money, thanks
 

Kuranghi

Gold Member
Nowadays the Mini-LED Samsungs, and I think the Philips, are about 800 zones at 65", 1000 at 75" and 1320 at 85".

The QN900A is 1620 at 65", 1920 at 75" and 2340 at 85".

But people still think they look like their old LED, which they will compare with their new OLED, of course.

I have been able to enjoy both a high-end LED and an OLED, and for HDR I prefer an LED with many nits.

In SDR the OLED is unbeatable, it is perfect!!! But I haven't cared about SDR for a long time.

The reason for that is they keep trying to make their flagship 4K and 8K sets thinner with better viewing angles to compete with OLED, but it means the native contrast ratio is trash. Forget about the "with local dimming" value, because that's meaningless with Samsung; they cheat the test patterns.


So you end up with more blooming/clouding because of the panel, no matter how many zones they have. The Q90A's native contrast is trash compared to even a Sony XF90 from 2018/2019, which was a mid-to-high-end set. The QN900A's native ratio is embarrassing considering how much it costs, worse than the infamous Sony ZF9 and still not as bright:


I love HDR on my ZD9, still blows me... away:

Funny Face GIF
 

Kuranghi

Gold Member
I got a Sony Bravia XH 90 and either the TV is a piece of pos or my eyes are burnt out because I don't see that game changing HDR.....

I'm afraid it actually IS a disappointing POS*, you are right. Not enough dimming zones, a downgraded image processor and crap light output.

Sell it and buy a second-hand XF90/XG95 and you'll see proper HDR. It's mostly because the XH90 is ~650 nits, the XF90 is ~950 and the XG95 is ~1200.

* Compared to 2017-2019 Sony LCD sets.
 

Kuranghi

Gold Member
You got the man to calibrate your set? Wow. How much did that cost.

£300; it was a multi-TV deal with two friends, so it was £900 instead of £1050 for calibration of a Sony ZD9, a Panasonic GZ2000 and an LG C9. Prob more expensive now though.

That included his flight from Manchester and a hotel. He was a great guy just to chat with, and I'm super happy with the calibration; if there are ever gamma problems, it's 100% of the time because the game is targeting 2.0/2.2 or it's just fucked generally.

The 65" ZD9 was £4000 and the 65" GZ2000 was £3000, so pretty worth it imo. It was a massive proportion of the price the 65" C9 owner paid for his set, but A) colour accuracy is a bit subpar on it, so it was well worth it for him, and B) we got him too excited about the whole thing so we could save more money lol.
 

Kuranghi

Gold Member
I get it, my point was that when compared with HDR, SDR looks washed out.

I agree with the other guy; if anything, HDR should have more detail visible in the shadows/less crushing, and so look "washed out" compared to SDR to people's eyes. The overall depth of the image will be way higher in HDR, though.

Like Jedi: Fallen Order, for instance: people said it looked washed out, but it was just showing shadow detail SDR couldn't.
 

Billbofet

Member
In my experience, HDR games are so inconsistent and time-consuming to calibrate that I have to convince myself it looks better. 9 times out of 10, it blooms waaaaaaay too much. Also, if I dial it in on the display settings vs. in-game, it's just that much more messed up for all the other games. I turned HDR and Auto-HDR off on my Series unit and have not looked back.
There is a benefit there, but for me, it's not worth the headache.
 
The reason for that is they keep trying to make their flagship 4K and 8K sets thinner with better viewing angles to compete with OLED, but it means the native contrast ratio is trash. Forget about the "with local dimming" value, because that's meaningless with Samsung; they cheat the test patterns.


So you end up with more blooming/clouding because of the panel, no matter how many zones they have. The Q90A's native contrast is trash compared to even a Sony XF90 from 2018/2019, which was a mid-to-high-end set. The QN900A's native ratio is embarrassing considering how much it costs, worse than the infamous Sony ZF9 and still not as bright:


I love HDR on my ZD9, still blows me... away:
Death to X-Wide Angle! xD

Makes sense on an 85-inch Z9G, but a 55-inch 950H? :messenger_dizzy:

Samsung is even worse than Sony in this way; they have the optical layer AND do funky stuff with the pixel output to help with the angle, at the expense of everything else.
 

acm2000

Member
I'm afraid it actually IS a disappointing POS*, you are right. Not enough dimming zones, a downgraded image processor and crap light output.

Sell it and buy a second-hand XF90/XG95 and you'll see proper HDR. It's mostly because the XH90 is ~650 nits, the XF90 is ~950 and the XG95 is ~1200.

* Compared to 2017-2019 Sony LCD sets.

I have the XF9005; I specifically went with the previous year's model because the screen was actually better than the current model at the time (the XH), and I wanted the juicy 1000-nit HDR.
 

Kuranghi

Gold Member
In my experience, HDR games are so inconsistent and time-consuming to calibrate that I have to convince myself it looks better. 9 times out of 10, it blooms waaaaaaay too much. Also, if I dial it in on the display settings vs. in-game, it's just that much more messed up for all the other games. I turned HDR and Auto-HDR off on my Series unit and have not looked back.
There is a benefit there, but for me, it's not worth the headache.

Totally agree that its a massive headache and in a sorry state, but can I ask what exact model you have?
 
If the majority of your gaming sessions take place in a dark or dimly lit room, or if you play less than 6 ft away from your TV, then HDR is probably not for you. For anyone else, as long as you don't buy the cheapest HDR set on Black Friday, it should be a nice upgrade to your visual experience.
 
Any Mini-LED or OLED set should qualify.
Mini-LED has been around for less than 2 years, and most brands, like Samsung, only released their Mini-LED tech this year. There are definitely good sets out there without this technology. If you're going the LED route, then the most important things are the number of dimming zones (higher = better), the quality of the dimming/backlight technology (varies from brand to brand), and the max nits the TV can emit in HDR (at least 600 for decent HDR, 1000+ for premium).
 

Billbofet

Member
Totally agree that its a massive headache and in a sorry state, but can I ask what exact model you have?
I have a BenQ 4K projector for most gaming, and also a 55" flat panel in the living room - not sure of the model - it has Dolby Vision. I know projectors are never going to be a reference point for HDR, so before I get beat down by the OLED overlords: I get it. My point is that HDR on movies is and has been spectacular; it seems significantly more consistent than in games. I spent a ton of time dialing it in for Assassin's Creed Origins a while back, and it was truly spectacular with HDR - a big step up from SDR. That said, it was a lot of work and did not translate to other HDR games. I have limited time, and I would rather just game and not have to calibrate for most games. Just my preference.
 

Kuranghi

Gold Member
I have a BenQ 4K projector for most gaming, and also a 55" flat panel in the living room - not sure of the model - it has Dolby Vision. I know projectors are never going to be a reference point for HDR, so before I get beat down by the OLED overlords: I get it. My point is that HDR on movies is and has been spectacular; it seems significantly more consistent than in games. I spent a ton of time dialing it in for Assassin's Creed Origins a while back, and it was truly spectacular with HDR - a big step up from SDR. That said, it was a lot of work and did not translate to other HDR games. I have limited time, and I would rather just game and not have to calibrate for most games. Just my preference.

It's just that if it's a lower-end set or an LG LCD, then HDR just looks awful on it in general.
 

Nikana

Go Go Neo Rangers!
What games even do HDR well? It's pretty hit or miss in my experience.
Doom Eternal
Ratchet and Clank
Ratchet and Clank: Rift Apart
Forza Horizon 4
Forza Horizon 5
Flight Sim
Ghost of Tsushima
Cyberpunk 2077 (PC)
Uncharted: The Lost Legacy
Call of Duty: Modern Warfare (might be the best Atmos experience)
Ori and the Will of the Wisps
Tetris Effect
Gears 5 (Atmos mix is strong)
Shadow of the Tomb Raider (Atmos is top tier as well)
Death Stranding
Assassin's Creed Odyssey
Metro Exodus
Sea of Thieves
 

Kataploom

Member
I don't have a current-gen console, but I do have a PC with Windows 11 and an LG ThinQ HDTV with HDR, and it looks cool... I don't care about doing games "justice"; I'll just use them, milk them and trash them as soon as I'm done with them 😎
 

Kuranghi

Gold Member
Doom Eternal
Ratchet and Clank
Ratchet and Clank: Rift Apart
Forza Horizon 4
Forza Horizon 5
Flight Sim
Ghost of Tsushima
Cyberpunk 2077 (PC)
Uncharted: The Lost Legacy
Call of Duty: Modern Warfare (might be the best Atmos experience)
Ori and the Will of the Wisps
Tetris Effect
Gears 5 (Atmos mix is strong)
Shadow of the Tomb Raider (Atmos is top tier as well)
Death Stranding
Assassin's Creed Odyssey
Metro Exodus
Sea of Thieves

Agreed on everything except the bolded. I think it was malfunctioning for me on PC, because it was complete dogshit at default settings, and my display is calibrated so it would usually look great. Moving sliders couldn't fix it. I took these tonemapped screenshots to give a rough idea of what it looked like for me:


Vs SDR it had less shadow detail/crushed shadows, blown-out highlights with banding, and colours looked really oversaturated. Most people seem to say it's amazing, so I tend to think it's a weird issue on my end, but I could never fix it after hours of tweaking; might reinstall and see if it's different now.
 

Billbofet

Member
Doom Eternal
Ratchet and Clank
Ratchet and Clank: Rift Apart
Forza Horizon 4
Forza Horizon 5
Flight Sim
Ghost of Tsushima
Cyberpunk 2077 (PC)
Uncharted: The Lost Legacy
Call of Duty: Modern Warfare (might be the best Atmos experience)
Ori and the Will of the Wisps
Tetris Effect
Gears 5 (Atmos mix is strong)
Shadow of the Tomb Raider (Atmos is top tier as well)
Death Stranding
Assassin's Creed Odyssey
Metro Exodus
Sea of Thieves
I agree with your list on the games on it that I have played. I think my favorite so far has been Uncharted Lost Legacy. Gears, Forza's and Ori as well.
 

Nikana

Go Go Neo Rangers!
Agreed on everything except the bolded. I think it was malfunctioning for me on PC, because it was complete dogshit at default settings, and my display is calibrated so it would usually look great. Moving sliders couldn't fix it. I took these tonemapped screenshots to give a rough idea of what it looked like for me:


Vs SDR it had less shadow detail/crushed shadows, blown-out highlights with banding, and colours looked really oversaturated. Most people seem to say it's amazing, so I tend to think it's a weird issue on my end, but I could never fix it after hours of tweaking; might reinstall and see if it's different now.
Saturation was a huge issue at launch, as the HDR settings/options weren't precise enough. It's not an easy game to calibrate, but they released an update with better controls over it. You still have to know how to calibrate an HDR image to get it correct, as the sliders give you no guidance.
 

Kuranghi

Gold Member
Saturation was a huge issue at launch, as the HDR settings/options weren't precise enough. It's not an easy game to calibrate, but they released an update with better controls over it. You still have to know how to calibrate an HDR image to get it correct, as the sliders give you no guidance.

Yeah, I saw that stuff, but even ignoring the saturation it just looked really wrong, like when you just know an image isn't outputting correctly. I had the same issue in Guardians of the Galaxy recently, actually, so I'm going to try fixing it the same way as I did there, which was forcing a non-native res and then going back to native.
 

Ashtyr

Neo Member

There is no problem with star scenes on an LED, a good one.

And yes, I know perfectly well that on the Spears & Munsil starfield test an LED will have blooming, but that test is made for that; it is not representative of what will happen with real content, and I can demonstrate that without problems. In fact, even in star videos other than that one, they will render it perfectly.

So using that test as evidence of something is wrong.
 