
HDR gaming: how important is it actually, and what is it?

E6 owner here, high-five! And as you can attest, HDR is legit and amazing to behold when used correctly. It's one of those things that you can rightfully be dismissive of until you see it for yourself.

Yes indeed, my X940C opened my eyes to it quickly. I've been curious how OLEDs are handling it with their lower peak brightness. People dismiss this feature too easily, or think it's something they've had for years, lol. Once you actually watch some content on it you'll realize it's special, even more so when you go back to non-HDR versions of the same content.

I think one thing that will screw this whole thing up is companies like Sony and Samsung releasing edge-lit HDR panels. It completely misses the point, ruins the picture, and ruins people's impression of HDR. But manufacturers are doing it to get that HDR label thrown on all their sets.
 
The TV itself doesn't subsample; this is done when compressing the movie before streaming it or putting it on a Blu-ray. The TV should be able to handle a 4:4:4 video signal from a PC, for instance.
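For the curious, here's a rough sketch in Python of what 4:2:0 subsampling does to a frame. The 2x2 averaging is purely illustrative (real encoders use fancier filters): luma keeps full resolution while each chroma plane drops to half resolution in both axes, so only a quarter of the color samples survive.

```python
import numpy as np

def subsample_420(y, cb, cr):
    """Illustrative 4:2:0 chroma subsampling: keep full-resolution luma,
    average each 2x2 block of the chroma planes down to one sample."""
    h, w = cb.shape
    cb_sub = cb.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    cr_sub = cr.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
    return y, cb_sub, cr_sub  # 4:4:4 in, 4:2:0 out

# A 4x4 test frame: luma stays 4x4, each chroma plane shrinks to 2x2.
y = np.arange(16, dtype=np.float32).reshape(4, 4)
cb = np.full((4, 4), 0.5, dtype=np.float32)
cr = np.zeros((4, 4), dtype=np.float32)
_, cb_sub, cr_sub = subsample_420(y, cb, cr)
print(cb_sub.shape)  # (2, 2)
```

The reason this works so well for film is that the eye resolves brightness far more finely than color, so throwing away chroma samples is much less visible than throwing away luma.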

Most TVs today have a subsampled pixel format, so for example if you were to play a UHD Blu-ray disc that ran at full 4:4:4 chroma, most TVs would run it compressed unless your TV supports full chroma (4:4:4). Right now, though, most content is broadcast at 4:2:0 as it's easier to deploy.

I was just trying to establish whether a TV that is classified as 'HDR compliant' also has to have full chroma support.
 
Got a laugh at people thinking it was an option they could turn on and off in their old PC games. I remember the Half-Life HDR push.

Glad we're educating in here though.
 
Yes indeed, my x940c opened my eyes to it quickly, I've been curious how OLED's are handling it with the lower peak brightness. People dismiss this feature too easily, or think its something they've had for years, lol. Once you actually watch some content on it you'll realize its special, even more so when you go back to non-HDR versions of the same content.
Yeah, that's the unfortunate reality of technologies that can only be experienced in person. Same for G-Sync/FreeSync etc. I haven't experienced HDR myself yet, and unfortunately my bro's 4K set doesn't support it, but I'll probably have to check it out in store someday (I should've done the research when he asked me for help, but I just dismissed it ;_; Now the salt is real after preordering a 1080 with HDR support and enough power to deliver decent performance at 4K).

Hopefully we'll soon get high-end HDR gaming monitors, since both Polaris and Pascal support the required DisplayPort/HDMI specs.
 
Quote from that video.

"The jump from 1080p to 4K wasn't that noticeable until HDR rolled out."

It's that huge, apparently.

Yes, but you can have 1080p with HDR too; it's just that it's only built into 4K TVs, because reasons.

BTW, there's also wide color gamut, which adds a lot to the picture. There are TVs that support both HDR and WCG (Vizio P-series), and some that ONLY support HDR (Vizio M-series). It's massively confusing right now.
 
Most TVs today have a subsampled pixel format, so for example if you were to play a UHD Blu-ray disc that ran at full 4:4:4 chroma, most TVs would run it compressed unless your TV supports full chroma (4:4:4). Right now, though, most content is broadcast at 4:2:0 as it's easier to deploy.

I was just trying to establish whether a TV that is classified as 'HDR compliant' also has to have full chroma support.

No, it doesn't have to; most UHD discs are 4:2:0.
 
Yes indeed, my X940C opened my eyes to it quickly. I've been curious how OLEDs are handling it with their lower peak brightness. People dismiss this feature too easily, or think it's something they've had for years, lol. Once you actually watch some content on it you'll realize it's special, even more so when you go back to non-HDR versions of the same content.

I think one thing that will screw this whole thing up is companies like Sony and Samsung releasing edge-lit HDR panels. It completely misses the point, ruins the picture, and ruins people's impression of HDR. But manufacturers are doing it to get that HDR label thrown on all their sets.

Couldn't agree more. On my OLED it's quite bright; having true black as a baseline really enhances the panel's overall perceived brightness. Lower peak nits compared to LED, but what is there really pops. My wife covered her eyes when passing the TV the other night when a scene switched to outdoors during Man in the High Castle on Amazon, lol.
 


Uhh, many videophiles agree that HDR makes a big difference. Some think it's a bigger jump than 4K alone.
 
Yes indeed, my X940C opened my eyes to it quickly. I've been curious how OLEDs are handling it with their lower peak brightness.
The much deeper blacks (read: pure black) balance out the lower peak brightness on OLED displays.
I think one thing that will screw this whole thing up is companies like Sony and Samsung releasing edge-lit HDR panels. It completely misses the point, ruins the picture, and ruins people's impression of HDR. But manufacturers are doing it to get that HDR label thrown on all their sets.
Yup, there are quite a few edge-lit "HDR" displays out there. Sony, Panasonic, Philips, and Samsung all sell edge-lit TVs as HDR displays.

As you say, this will only drive people away from HDR due to the subpar experience it'll give on an edge-lit display.
 
The much deeper blacks (read: pure black) balance out the lower peak brightness on OLED displays.
Yup, there are quite a few edge-lit "HDR" displays out there. Sony, Panasonic, Philips, and Samsung all sell edge-lit TVs as HDR displays.

As you say, this will only drive people away from HDR due to the subpar experience it'll give on an edge-lit display.

It blows my mind that Vizio is one of the only companies making FALD sets. And they're dirt cheap compared to the edge-lit ones.
 
High dynamic range (HDR) creates a more realistic picture
HDR technology expands the contrast and color range of the pixels on your display to reveal a better, brighter, more colorful image. The picture quality is noticeably more natural.

:)
 
Which one of the top pictures looks more like what we see with our eyes?

Also, will HDR fuck with filmmakers' intent in movies? Will it distort what movies should look like, similar to what dynamic contrast options in TVs do?
No. Dynamic contrast is nothing more than guesswork done by your TV: it changes the contrast of the entire picture based on that guess, altering the colours, blacks and whites in the process and thus distorting the filmmaker's intent. Which is why dynamic contrast should always be disabled.

HDR actually enhances the vision of the filmmaker by giving the filmmaker more control over the final picture.
 
Isn't this just some bullshit buzzword that's pretty meaningless?

Also who is using this term besides MS?

Every major TV manufacturer...



I was really bummed my TV from November didn't come HDR-ready. It's an S-model Samsung, so it's a great TV, but I love the colors on HDR sets. Unfortunately I didn't want to drop the extra cheddar, since basically no HDR sets went on sale.
 
Good thread.

I have been wondering about this and which TV will actually be best for gaming in full 4K glory with HDR options.

I thought 1080p was great, and now the Xbox One S will support HDR.

Am I missing out?
 
Yes indeed, my X940C opened my eyes to it quickly. I've been curious how OLEDs are handling it with their lower peak brightness. People dismiss this feature too easily, or think it's something they've had for years, lol. Once you actually watch some content on it you'll realize it's special, even more so when you go back to non-HDR versions of the same content.

I think one thing that will screw this whole thing up is companies like Sony and Samsung releasing edge-lit HDR panels. It completely misses the point, ruins the picture, and ruins people's impression of HDR. But manufacturers are doing it to get that HDR label thrown on all their sets.

To be fair, customers are price conscious, and unfortunately local dimming zones and smart backlights are simply more expensive to make. I expect prices will come down, but we'll still see improvements at the top end too (the X940D shits on the X940C...).
 
I guess the source video needs to have HDR capabilities, and that a desktop, the internet, or pictures won't have HDR? Still confused...
Short answer: no, they won't.
Long answer: Currently the majority of the desktop, the internet, and images use sRGB, or "SDR". You can attach color profiles to images, and color-managed applications and browsers should show them correctly on your monitor whether it supports HDR or not. But right now, support for PQ (the "HDR gamma replacement") is virtually non-existent on PC.
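For anyone wondering what PQ actually is: it's the SMPTE ST 2084 transfer function that replaces the classic gamma curve, mapping absolute luminance up to 10,000 nits onto code values. A minimal sketch of the encoding direction in Python (the constants are the published spec values):

```python
# SMPTE ST 2084 (PQ) inverse EOTF: absolute nits -> signal value in [0, 1].
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_encode(nits: float) -> float:
    y = min(max(nits / 10000.0, 0.0), 1.0)  # normalize to the 10,000-nit ceiling
    yp = y ** M1
    return ((C1 + C2 * yp) / (1 + C3 * yp)) ** M2

# SDR-ish levels (up to ~100 nits) only occupy roughly the bottom half of the
# curve -- which is the point: PQ spends code values where the eye is most sensitive.
for nits in (0.1, 100, 1000, 10000):
    print(f"{nits:>7} nits -> {pq_encode(nits):.3f}")
```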

Also, will HDR fuck with filmmakers' intent in movies? Will it distort what movies should look like, similar to what dynamic contrast options in TVs do?
No, but the whole process from capture to display has to change to support HDR movies.
More on that topic here:

https://pro.sony.com/bbsccms/assets/files/cat/mondisp/articles/HDR_X300_explained.pdf

Q: Let’s start with the background of high dynamic range (HDR).
BILL BAGGELAAR: High dynamic range is not new. It is the way that most of us see the world every day. Our eyes are high dynamic range and wide color gamut sensitive equipment. On a bright sunny day, they can see cloud highlight details while still being able to see into the shadows. In dark environments, we can start to see detail in extremely low to practically no light. Some cameras are at least getting closer to being able to detect some of the same things on those sunny days. But they have also gotten better in the dark, not as good as our eyes, but certainly better as the sensor technology improves.

On the other hand, the television sets or other types of displays that we typically have used to watch content have a much lower dynamic range than our eyes or even the cameras that are being used to capture the content. So TV viewing, up to this point, has been what we now call Standard Dynamic Range (SDR), where we have maybe on the order of seven to nine stops of exposure depending on the display. With these new displays, we are now talking about Ultra-HD (4K/UHD) and High Dynamic Range (HDR) that could potentially go up to 20 stops. But more practically for the consumers, we're talking maybe 12 to 14 stops.

We are inherently contrast-sensitive beings, and so what this increased dynamic range does is give us an increased sense of sharpness, detail, clarity, color and saturation; all these things that we see in the real world, but we're not able to realize on today's consumer displays or even in theaters, for that matter. And since we are incredibly adaptable beings, when we're watching a movie or a TV show in a particularly dark or bright location, our eyes and brain adapt. We may see something right away where the contrast doesn't necessarily feel right, but we adapt pretty quickly to it and can watch it and not be distracted by it. But now with these higher dynamic range displays, we're able to really start to give people more picture details in order to provide a more immersive experience.

As I mentioned, getting better color saturation goes along with this whole wider color gamut piece that is part of HDR. So HDR is not just about contrast, or about resolution, or even about color; it's about being able to combine all three to represent images more accurately on consumer displays. This gives content creators an expanded canvas to represent things that are true to life, or they can even go hyper-real.

For finishing movies for theaters, we work in P3 color space, which is a wider color gamut than the TV standard, Rec. 709. Oftentimes there are very specific colors, particular purples, reds or translucent colors, that cannot be displayed in SDR/Rec. 709, so we have to do additional color correction to nicely squash it down into Rec. 709 for consumer displays. There are all sorts of saturated colors, whether blue, orange, purple or green, that we can now represent on consumer displays that we've never been able to represent before, and that provides something closer to the original artistic vision, representing what the DP and Director originally intended for the viewers to see.

Q: So this is basically getting us as close to reality as what’s technically possible.
BILL BAGGELAAR: Yes, within certain constraints. There is the reality of staring at the sun on a bright sunny day. It hurts your eyes and we obviously don’t want to get that real. We don’t want people to be hurt by the content. So, within certain limits, yes. An additional sense of reality is possible, but I don’t know that we necessarily need to focus just on the reality piece. I think the content creators can present the vision that they want the consumers to experience more accurately although that sometimes doesn’t necessarily mean real. It just means that it’s maybe more immersive or maybe going to the level of creating colors that you don’t see in the real world that can exist, but you don’t necessarily see normally, but we can represent them in a way that we’re not able to with today’s display technology.

Q: Is it the camera delivering more in order to get that result? Or is it more of a postproduction process, or perhaps both?
BILL BAGGELAAR: A bit of both. Certainly it is better to start with captured images that have more inherent information in them. So the cameras have to be able to capture a wider dynamic range and a wider color gamut in order to truly take advantage of that in post. It doesn't mean that you absolutely have to start with wider sources, but there are diminishing returns when starting from "narrower" sources. You're always going to have the potential to get better results when you capture with a camera that has higher resolution, wider dynamic range and wider color gamut than the intended display, which is typically what we do. We've got the Sony cameras that capture in S-Log 1, 2 or 3, and we've got S-Gamut, which is a much wider gamut than P3. And as Rec. 2020 comes along, S-Gamut is actually even wider than Rec. 2020. Film has always been very wide as well, and many of the other digital cameras have a much wider color gamut than the displays that we actually have today or even are being planned for the near future, so starting from a wider color gamut helps to make sure that you can represent those colors on the displays.
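To put the stop figures from the first answer in perspective: dynamic range in stops is just log2 of the peak-to-black luminance ratio, so each extra stop doubles that ratio. A quick back-of-the-envelope sketch in Python, using illustrative figures rather than measurements of any real display:

```python
import math

def stops(peak_nits: float, black_nits: float) -> float:
    """Dynamic range in photographic stops: each stop doubles the luminance."""
    return math.log2(peak_nits / black_nits)

# Illustrative numbers, not measurements of any particular set:
print(f"SDR LCD  (100 nits peak, 0.40 nit black):  {stops(100, 0.40):.1f} stops")
print(f"HDR FALD (1000 nits peak, 0.05 nit black): {stops(1000, 0.05):.1f} stops")
```

Those come out to roughly 8 and 14 stops, right in line with the seven-to-nine and 12-to-14 ranges quoted above.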
 
High dynamic range (HDR) creates a more realistic picture
HDR technology expands the contrast and color range of the pixels on your display to reveal a better, brighter, more colorful image. The picture quality is noticeably more natural.

:)


Yep
 
HDR is a way more noticeable improvement than 4K. Go watch Fury Road in HDR and you will be convinced.

Well, Mad Max is mastered at 2K. That's the main reason I think the format overall is a marginal upgrade over 1080p: only a fraction of movies today are filmed in 4K, and even when they are, the CGI isn't.
 
Yeah, full and complete HDR support is why I'm waiting to buy a 4K TV. Right now there are cheap 4K TVs, but I want one that will actually give me HDR (and thus be more future-proof), and those sets are still more expensive than I'm willing to pay. So I'm waiting for 4K HDR prices to come down. It'll totally be worth it, though.
 
HDR is hardware tech built into TVs. If you don't have the hardware in your TV you won't get the desired effect.

This is not to be confused with the HDR lighting in games that became a popular buzzword during the Half-Life 2 days.

This right here. I hate that they have the same acronym. They are different things.
 
It blows my mind that Vizio is one of the only companies making FALD sets. And they're dirt cheap compared to the edge-lit ones.

The Sony 940D is FALD, and Samsung has FALD sets too. The problem with FALD, at least in Samsung's eyes and to an extent Sony's, is the aesthetics of the display: it can't be as thin when it's direct-lit rather than edge-lit.

They also want to market FALD as a premium feature. Vizio and HDR may force their hands to make it standard, especially if Vizio decides to go international.
 
Of course you can't see the actual colour difference between the two, because the image isn't being displayed in HDR on your monitor/TV. Unless you have an HDR TV? Even then it wouldn't work.

I recently got a Samsung UHD 4K Blu-ray player and a KS7500 HDR 4KTV. Once I've got them all connected I'll come back and report if there is a difference.

Also, I was under the impression that the HDR in the Scorpio is just for 4K streaming/4K Blu-ray and not for games? Wouldn't a game have to be on a 4K Blu-ray disc, making it Scorpio-only and the software incompatible with other Xbox consoles, since it has to have the correct data built in (as not all 4K films have HDR)?

Yeah I'm aware, that's the joke...
 
Isn't HDR like 3D sound? Just stupid bullshit.
Close, but not close enough.

HDR is a new display standard, backed by every TV manufacturer, the UHD Alliance, Hollywood, Netflix, Amazon, Vudu, and the Blu-ray Disc Association, just to name a few.

It's the next big thing. I personally see HDR as something bigger than 4K. There are plenty of posts in this thread already that explain what HDR is and that it's not to be confused with the HDR in Half-Life 2, for example.
 
No no no. Unfortunately the Fury Road transfer is crap.

Yes yes yes. It is easier to notice color than resolution. I know people who couldn't notice the change from SD to HD, the type of people who still had their cable box hooked up with composite until I switched it for them. They immediately noticed HDR, even on a bad transfer.
 
HDR is legit and might be a major thing in gaming; at least both

Nvidia https://www.youtube.com/watch?v=tDUGWUWRRNU

and AMD https://www.youtube.com/watch?v=MnvctltAKLE

are integrating the output and render target support (my guess would be R10G10B10 will prevail) into their future graphics cards, although AMD is more outspoken about it.



The "HDR" "buzzword" that came with Half-life 2 (and subsequently with basically every game and game engine) is legit as well.

Every relevant engine out there is using HDR for their lighting calculations right now.

To many players it basically meant "more bloom and overcast", but even the implementation in Half-Life 2 was not just a marketing term.

Behind the scenes, the scene is rendered across a much wider brightness range than is displayed in the end. This is usually encoded down so it can be stored in a normal 32-bit render target.

What that means is that the rendering is done across a very wide brightness range, from very, very dark to very, very bright.

Afterwards, camera exposure is simulated (tone mapping and/or eye adaptation), and this brings the range down to a level that has fewer extremes and is suitable for display output.
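As a concrete sketch of that step, here's a simple exposure multiply followed by the classic Reinhard curve in Python. Real engines use fancier operators, and display gamma is omitted here for brevity:

```python
import numpy as np

def tonemap_reinhard(hdr: np.ndarray, exposure: float = 1.0) -> np.ndarray:
    """Map linear HDR radiance (0..inf) down to displayable 0..1 values
    using the classic Reinhard operator: x / (1 + x)."""
    x = hdr * exposure    # simulated camera exposure / eye adaptation
    return x / (1.0 + x)  # compresses highlights, keeps darks nearly linear

# A pixel 100x brighter than "white" still lands below 1.0 instead of clipping:
hdr_pixels = np.array([0.01, 0.5, 1.0, 10.0, 100.0])
print(tonemap_reinhard(hdr_pixels))  # ~[0.0099, 0.333, 0.5, 0.909, 0.990]
```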

What makes it similar to "bloom" is that the parts selected to bloom/haze/glow are chosen based on overexposure in the final image.

So if there are parts of our HDR scene that are much brighter than the brightest colors we can show in our final output, we simply blur them out to simulate camera/eye behaviour.

The difference back then was basically how the bloom threshold was selected.

HDR bloom is very much standard in every relevant engine today. It's just not advertised any more because everyone is doing it.
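For illustration, a minimal sketch of that threshold-and-blur pass on a linear HDR buffer. The uniform (box) filter here is just a stand-in for the Gaussian blur an engine would actually use:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def bloom(hdr: np.ndarray, threshold: float = 1.0, radius: int = 3) -> np.ndarray:
    """Extract everything brighter than the displayable maximum,
    blur it, and add it back on top -- the classic HDR bloom pass."""
    overexposed = np.maximum(hdr - threshold, 0.0)           # only the "too bright" parts
    halo = uniform_filter(overexposed, size=2 * radius + 1)  # stand-in for a Gaussian blur
    return hdr + halo

# A single hot pixel (say, the sun) now bleeds into its neighbours:
scene = np.zeros((9, 9))
scene[4, 4] = 50.0
print(bloom(scene)[4, 2:7].round(2))  # [ 1.  1. 51.  1.  1.]
```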



Either way, in games the HDR image still has to be stored in standard render target formats (normally 8 bits per color), which is a problem for precision.

Logarithmic encoding helps (some color differences are easier for human eyes to spot, so we don't need to encode them linearly), but in the end more precision, in the form of more bits per color, would be preferable.

HDR render targets, at least right now, look to incorporate 10 bits per color (1024 shades instead of 256), but this has to be supported by graphics cards first.

Which is what AMD is doing and what MS is advertising with Scorpio.
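The precision difference is easy to demonstrate: quantize a smooth gradient at 8 and at 10 bits and count the shades that survive. A quick sketch:

```python
import numpy as np

def quantize(signal: np.ndarray, bits: int) -> np.ndarray:
    """Round a 0..1 signal to the nearest representable code value."""
    levels = 2 ** bits - 1  # 255 steps at 8 bits, 1023 at 10 bits
    return np.round(signal * levels) / levels

gradient = np.linspace(0.0, 1.0, 100_000)  # a perfectly smooth ramp
for bits in (8, 10):
    shades = len(np.unique(quantize(gradient, bits)))
    print(f"{bits}-bit: {shades} distinct shades")  # 256 vs 1024
```

Four times as many shades across the same brightness range is what keeps smooth gradients (skies, fog, dark scenes) from turning into visible bands.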
 
I still haven't seen it, but I'm dying to, as I hear nothing but great things about it. Netflix even said that if you were viewing 4K material with HDR and your speed dropped, they'd go to 1080p with HDR rather than 4K without it.
 
HDR might not be all it's cracked up to be...

http://www.hdtvtest.co.uk/news/4k-vs-201604104279.htm

While I disagree with the article as a whole, it does raise a real issue. The problem is that it isn't so much an HDR issue as a general TV issue.

It's also why I generally hate when people say to Google calibration numbers, as room lighting varies greatly. That means the recommended settings from a site like RT can look fabulous or like total shit if your room lighting isn't similar.

Not to mention the difference in panels. Even beyond OLED vs. LED, you have the IPS vs. VA shit going on.

The other thing is, if you calibrate for a room with the lights out because you're going to watch a movie at night, you could end up needing totally different settings during the day or with the lights on.

Again, this is a current issue even with 1080p TVs that generally gets glossed over.

Also, I hate all of the buzz-worthy pics for HDR; as said in this thread, it's something you have to see in person to really get a feel for.

PS: I do think 4K resolution is undersold. Even with 2K content upscaled, I think it makes a huge fucking difference, even on 50" displays.
 
Well, Mad Max is mastered at 2K. That's the main reason I think the format overall is a marginal upgrade over 1080p: only a fraction of movies today are filmed in 4K, and even when they are, the CGI isn't.

Agreed... for most movies except the most recent, 4K is not a great advance, since that detail really doesn't exist (35mm is supposed to capture up to 5K, but in reality it really doesn't, due to lenses and so forth).

HDR, on the other hand, is a huge upgrade, and even non-native-4K movies should be released to make use of it.

High Definition Ready

Welcome to the club, Scorpio.

This is some sort of weird "Xbox 720p" thing, isn't it?
 
HDR will make it easier to tell how hard Microsoft fucked the gamma curves of their future consoles. Gotta make them colors pop! It's not like YOU know how to calibrate a TV properly, so MS is here to make it look like a poorly calibrated Vizio from 10 years ago!
 
Agreed... for most movies except the most recent, 4K is not a great advance, since that detail really doesn't exist (35mm is supposed to capture up to 5K, but in reality it really doesn't, due to lenses and so forth).

HDR, on the other hand, is a huge upgrade, and even non-native-4K movies should be released to make use of it.

I didn't realize it, but so far not a single animated movie has been rendered at 4K.

Guess it makes sense that 4K renders don't exist for most effects.

Still, the new color gamut will be awesome.
 