
Why is HDR a thing now?

Paz

Member
People really should see what The Revenant UHD looks like on a HDR set. It is amazing.

Forza Horizon 3 demo (no HDR) looks great, excited to see what difference the HDR in the full game will make.

This movie looks unbelievably good on UHD Blu-ray. I watched about 30 minutes on my brother's LG E6 OLED and it was sort of mind-blowing, like the first time I booted up my Dreamcast and saw what 3D graphics were meant to look like.
 

TheExodu5

Banned
As far as I understand it, HDR is essentially adding a third dimension to a 2 dimensional image: brightness.

Brightness used to be a constant factor...your entire screen was a certain brightness and that was it. Then with local dimming brightness became tied to the color being reproduced...blacks were given lower brightness settings and whites were given higher brightness settings.

With HDR, brightness and color are now independent. HDR is basically a brightness map over the entire image.

Is this interpretation correct?

10 bit is a whole other thing. Even if your monitor or TV had a 10-bit internal panel, it only accepted 8-bit color input, so it relied on image processing to convert the image up to 10 bits to reduce color banding.
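To put the banding point in numbers, here's a quick sketch in Python (the bit depths are the real ones; the gradient itself is just an illustration):

# Quantize the same smooth 0..1 ramp (think: a sky gradient) at 8 and 10 bits
# and count the distinct output levels. Fewer levels = wider visible bands.
def quantize(value, bits):
    levels = (1 << bits) - 1              # 255 for 8-bit, 1023 for 10-bit
    return round(value * levels) / levels

ramp = [i / 9999 for i in range(10000)]   # smooth gradient samples

print(len({quantize(v, 8) for v in ramp}))    # 256 possible shades
print(len({quantize(v, 10) for v in ramp}))   # 1024 possible shades, 4x finer steps

So a 10-bit signal doesn't show "more range" by itself, it just slices the same range into finer steps, which is what hides the banding.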
 

NewDust

Member
Honestly it should have been introduced 5-6 years ago. The PC is still 3 steps ahead of the competition.
¯\_(ツ)_/¯

You really don't know what you are talking about...

Edit: why can't you use backslashes on GAF... I'm not trying an SQL injection or something.
 

jmdajr

Member
This movie looks unbelievably good on UHD Blu-ray. I watched about 30 minutes on my brother's LG E6 OLED and it was sort of mind-blowing, like the first time I booted up my Dreamcast and saw what 3D graphics were meant to look like.

Man, I got a little choked up reading this.

What a day that was.
 
Ok....so I have a couple of questions.

Right now we have:
HDR10 (10-bit color) and Dolby Vision (12-bit color)

HDR10 is probably gonna end up being the standard right? I mean since there are licensing fees for Dolby Vision.

Also, how long before there is an HDR12?

I think both HDR10 and DV will coexist. HDR10 is nice, but I think DV is better due to the dynamic metadata. When HDR10 gets dynamic metadata it's going to be better than what it is now.
 

Aces&Eights

Member
I've been eyeing this TV as well. Looks to be the one

Ditto. I check it on Amazon every day. I'm letting Xmas pass then jumping on it in early Jan. I suspect it will be down to $1200ish. I've seen it in person at my local TV outlet and it looks reeeeeeally good and I doubt the store has gone in and fine tuned/calibrated the display model.
 

Warnen

Don't pass gaas, it is your Destiny!
Here's what I'm confused about... my LG set is a 4K HDR10 set, and after the 4.0 update yesterday the PS4 setting for HDR defaulted to "automatic", which is what it needs to be for HDR sets. But now do I need to go into my TV settings someplace and enable HDR, or is it just set to automatically recognize an HDR device and display accordingly?

1. You need to make sure your HDMI cable can do it; dunno if the pack-in cable can. 2. Most TVs make you enable it in the settings somewhere. I have a Sony and I had to before my Xbox One S would display HDR.
 
I picked up on HDR when the Xbox One S made a big deal of it at E3. I tend to highly favor some Xbox exclusives like Forza Horizon and Gears of War despite generally being more PlayStation brand loyal, and both games emphasized HDR compatibility during interviews.

Since the actual release of the Xbox One S, I have noticed an increase in conversation, but I'm kind of grateful because it does seem like a visual gamechanger to me. I don't own an HDR display and likely won't for a while, but having seen it in person at my local Best Buy's Magnolia home theater department, it was pretty stunning to see the added depth of black, the sheer brightness of white, and the detail afforded by the increased color range. I have yet to actually see a video game running in HDR, but the typical TV demo footage was enough to have me convinced that its application in video games is undoubtedly going to be awesome.

It's been said over and over: 4K on its own is a nice progression for image quality, but paired with HDR I think it does in fact provide a very substantial visual upgrade, with most of the actual gain coming from HDR contrast and color.

It has a few kinks to sort out -- I'd like to wait and see how the format battle between HDR10 (an open standard) and Dolby Vision goes first. I'd also like 4K+HDR sets to become just a little more refined when it comes to gaming -- it kind of feels like sets below the high end are still trading response time against image quality. I'm hoping the technology will allow for a better balance of both at a way more affordable level.

I'm not worried about content as far as video games go. E3 and PlayStation Meeting got me pretty convinced that it's going to be a relatively standard feature in games going forward, starting as early as October (not in terms of games that actually have it, just in terms of when you can start expecting its implementation in almost every high budget game coming out).

Hoping things reach a level I'm comfortable with in terms of price/quality between summer and fall 2018. By then, I might be able to see how well the PS4 Pro's 4K checkerboard upscaling technique is holding up compared to native content, I'll be able to see how Scorpio is holding its own in the 4K space, and perhaps I will have actually witnessed in person some sort of HDR-enabled video game to solidify my notions so far.
 
Because Sony knew what gamer buttons to push at that conference. Mention HDR a million times and the people who follow this stuff will suddenly feel inadequate and look to buy new tech.

I almost did it. I was on my way to buy that Samsung everyone fell all over themselves to talk about, until I saw what the stand looked like. I was like, I ain't settling for some TV that looks like a reject from the Carousel of Time.

Took a breath, stopped reading those TV threads and here I am..a week later playing a 10 year old game on my 1080p TV and it looks great!!
 
From what I read Dolby Vision can do HDR10, but not vice versa.

Most content won't support Dolby Vision though.
DV and HDR10 are standards, they can't display each other. "DV can do HDR10" is like saying "Gsync can do FreeSync" or "BluRay can do HD-DVD."

There are TVs that can do both though, and I would think that will get more common as newer HDR TVs come out.
 

Ushay

Member
Because it makes a huge difference to the image quality; if you actually see an image with HDR working you'll understand. There is NO way you will understand looking at an SDR set.

It isn't just PR, there is a reason it is being pushed to this extent.
 

Unknown?

Member
I would like to know this as well, as I have a true 10-bit panel. To my understanding, games are currently 8 bit. Maybe we can see HDR colors, just not the amazing contrast? Don't know.
My TV says 12 bit when it displays the input. Is my TV a 12-bit TV? Or is that something different?
 
I think it's a bunch of bullshit, honestly. Displays vary greatly when comparing them anyway, and when properly calibrated, normal monitors can look just as good as an HDR-equipped set based on what I've seen of the technology. I'm not saying that it doesn't make a difference, of course, but I think it's a lot more subjective than these tech companies want you to believe.
 

nOoblet16

Member
Everyone is talking about how HDR shows lots of colours and stuff, but almost no one in this thread actually answered OP's question about how the HDR in games that we've had for the past decade and HDR TVs are different.



The software HDR you describe as being used in the Source engine and other games is AN ENTIRELY DIFFERENT THING from HDR in modern TVs. They are both trying to show High Dynamic Range, but they are fundamentally different technologies.

HDR in games has referred to the "exposure" of a scene dynamically changing in a similar way that the human eye can adapt to different light levels, so you can see a wider range of colors overall, but not at the same time. If you look at a shadowed area, your "vision" will adapt and get brighter so you can see details in that darker area, but brighter parts of the screen will get "over exposed" and blown out, so you lose those details. Then when you look at a brighter area things will get dimmer, so the bright area is in detail but the dark area goes to black.

HDR as it's referred to on PS4, Xbox One S and 4K TVs is showing a High Dynamic Range all at once, because the actual hardware can physically display a wider gamut of colors and brightness than was possible on older TVs. Whereas the software technique used in games would brighten or darken the scene, this newer HDR would be capable of showing the darkest and brightest parts of the entire scene all at once, without loss of detail. It wouldn't need to be "faked" like the software technique used in games. Going beyond simple brightness, it can also show more gradations of color, which means it can show more detail in regular scenes, or, in the case of games, potentially reveal detail that has always been there in textures but was previously unable to be seen due to the limitations of the number of colors the TVs could actually display.
Eye adaptation and HDR are not the same thing. HDR in games effectively means you can have lots of varying levels of luminosity from light sources without having the textures surrounding those light sources affected by it. In essence this is full dynamic range (provided the HDR is high precision) and there is nothing "fake" about this HDR.

What is actually fake is the tone mapping process, which is used to convert this full HDR image into an image capable of being displayed on an SDR screen. The way they achieve this is similar to how DSLRs do HDR, where they combine three images of varying exposure to produce one image.

As for the tone mapping process itself, games did full HDR internally but then had to tone map this full range into a 256-level colour space (8-bit), since the TV's range was limited. In fact, even with the new HDR TVs, the full-range internal image still needs to be tone mapped. This is because HDR TVs are limited to a 1024-level colour space (10-bit), which, while a lot higher than 256, is still nowhere close to the range of what the human eye actually sees.
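To make that tone mapping step concrete, here's a rough sketch in Python of one classic operator, Reinhard (real engines use fancier curves), which squashes unbounded scene luminance into 0..1 before quantizing to the display's bit depth:

# Reinhard tone mapping: compress unbounded HDR luminance into 0..1,
# then quantize to however many code values the display accepts.
def reinhard(luminance):
    return luminance / (1.0 + luminance)          # 0 -> 0, very bright -> ~1

def code_value(luminance, bits):
    levels = (1 << bits) - 1                      # 255 (8-bit) or 1023 (10-bit)
    return round(reinhard(luminance) * levels)

for lum in (0.05, 1.0, 10.0, 1000.0):             # deep shadow .. blinding highlight
    print(lum, code_value(lum, 8), code_value(lum, 10))

Everything still has to fit the output range either way; the 10-bit output just keeps about 4x more steps, so highlight and shadow detail isn't mashed into the same handful of values.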
 
Here's what I'm confused about... my LG set is a 4K HDR10 set, and after the 4.0 update yesterday the PS4 setting for HDR defaulted to "automatic", which is what it needs to be for HDR sets. But now do I need to go into my TV settings someplace and enable HDR, or is it just set to automatically recognize an HDR device and display accordingly?

That depends on the tv. I've heard that some tvs will automatically play hdr content while other tvs need it enabled manually. Best way to know for sure is to download an hdr10 sample file and view it on your tv, or pay for the hdr netflix subscription.
 
tl;dr
Is 'HDR-enabled TV' just the non-technical term for a 10-bit panel?
If so, I should have had a pseudo 'true' HDR experience for a long time now, since my monitor was capable of processing 10 (or 12) bits of colour.
Were games previously not programmed for HDR, in lighting or colour palette usage (were developers using less than 10 bits of colour all this time)?

A monitor with 10-bit colors isn't an HDR monitor. HDR isn't just about 10-bit colors. The main idea behind HDR is higher brightness, or higher contrast. It's about displaying a wider brightness scale found in nature, while the old standard was to master the displayed content to 100 nits, which leaves content looking pretty flat. And if it wasn't obvious, HDR is not about amping up the average brightness of the image to unwatchable levels, just the highlights when it's appropriate. The same goes for deep blacks. 10-bit colors merely help deal with the increased contrast and gamut to create smoother gradients. Wider gamut is the other main ingredient besides brightness/contrast to create a more life-like image.

Personally I'd only call Ultra HD Premium certified panels true HDR, since only those offer the combination of deep blacks, high maximum brightness and a wide color space. I think it was Dolby that did the original research showing that we get benefits out of HDR up to 10K nits of brightness, but currently we're at around the 1K level with the brightest consumer models. HDR is still very much in the early adopter stage, and it'll take a few years for the industry to settle. And that's just for movies and games; broadcast content will take far longer. Switching over from the old 100 nits standard with a limited color space is a huge jump for the industry, and it's far bigger than just going to 4K.

Why now? The necessary tech wasn't really available in mass market form. The film industry saw it as the "next thing" to improve image quality, and TV manufacturers saw it as the next thing to sell new sets with. Gaming is simply following in tow, though I do think it's going to have a substantial impact towards the adoption of HDR, since it's relatively straightforward to include HDR support in games.
 

Cyborg

Member
If my TV supports HDR10 and let's say I have a PS4 Pro, could I enjoy HDR in its true form?

I've heard my receiver needs to be HDR-ready too. But what if I don't use my receiver and use an optical cable?
 
The standard "what is HDR" image:


[Image: hdr1-100656714-large.jpg]


Of course, this is displayed on your non HDR monitor, so they have to play down the "non HDR" photo to show the comparison. But given the content I've seen in HDR, this is a fair comparison.

Which one is supposed to be more realistic here?

I get that it's tough for them to show the differences when you have a non-HDR display (my monitor), so they have to fake it with some garbage, but the image on the left looks ridiculous. The end of that tunnel looks like a Martian landscape.
 

jmdajr

Member
DV and HDR10 are standards, they can't display each other. "DV can do HDR10" is like saying "Gsync can do FreeSync" or "BluRay can do HD-DVD."

There are TVs that can do both though, and I would think that will get more common as newer HDR TVs come out.

From what I understand Dolby Vision TVs can get a firmware update to display HDR10 content.
 

Anony

Member
thanks to some of you who actually read my post and clarified some stuff, i actually never read/heard of hdr10 and dolby vision until this thread surprisingly

so i think it's safe to say hdr tvs just mean 10-bit panels (or better)?
this leads to what some people have asked in the thread as well: if your current monitor/tv already has a 10-bit panel, does that mean it's hdr ready? or is there some software and/or standard bullshit that cockblocks you from getting all the colours as some of you said (gonna need some sort of reference for this one instead of word of mouth)

someone replied that developers are using less than 10 bits of colour for their games, is there a reference for that as well?
i know that in the old days using fewer colours meant less processing power and memory, is this the case for 'modern' 3d games as well? if so, wouldn't the old ps4 need extra horsepower to run hdr games? which in turn, if that is the case, means anyone wanting hdr on ps4 should upgrade to the slim
 
If my TV supports HDR10 and let's say I have a PS4 Pro, could I enjoy HDR in its true form?

I've heard my receiver needs to be HDR-ready too. But what if I don't use my receiver and use an optical cable?

If your tv is a true hdr10 set, yes, you will enjoy hdr in its true form.
 
If my TV supports HDR10 and let's say I have a PS4 Pro, could I enjoy HDR in its true form?

I've heard my receiver needs to be HDR-ready too. But what if I don't use my receiver and use an optical cable?

Yes, and you'll be fine if you connect the PS4 to the TV and the optical cable to your receiver. However you'll lose HD audio over optical.

Edit:

I should say the quality of HDR is dependent on your TV, so not everyone is going to have the same experience. Last year's models in particular weren't very good in this regard.
 

platocplx

Member
If my TV supports HDR10 and let's say I have a PS4 Pro, could I enjoy HDR in its true form?

I've heard my receiver needs to be HDR-ready too. But what if I don't use my receiver and use an optical cable?

HDR has no impact on audio; your receiver would just need to be able to pass through or help process the additional information.

And HDR's "true form", as I've learned, is kind of muddy right now. There are TVs that are compatible but don't hit the Premium HDR rating of 1000 nits (the level of brightness that's the standard). It's an interesting thing to see.

Like, I have an XBR850C; while it can accept HDR signals and max out its brightness, it's not as crazy as the highest-end sets. So HDR definitely sets a new gap in PQ for TVs, even more than just the jump from 1080p to 4K.
 
Here is why devs are getting excited over HDR. As mentioned in the OP, HDR has been around for a while now. The Source engine has had it since 2005 or 2006, and since then, most games with realistic lighting probably do HDR rendering.

HDR in this context means purely internal to the engine. This allows for more realistic lighting calculations, but there was the problem that display technologies of the time couldn't actually display this broader range. That is why games have a rendering stage called 'tonemapping', where they take the HDR-rendered image and compress it down to be viewable on a normal screen. Here's a talk on one of the more common techniques used: http://duikerresearch.com/2015/09/filmic-tonemapping-ea-2006/

So all this new talk coming out is about the fact that since HDR TVs can display a much wider range of brightnesses, the game engine can output a much broader range of colors than it could before. Remember, these engines are already HDR, and the colors were always there, just trapped behind the display technology. Devs had to pick and choose which colors to throw out so the image could be seen on a normal screen. With HDR, all the colors already rendered can be sent over. Devs will essentially be getting something awesome for almost nothing.
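The "filmic" curves in that talk have a shoulder that rolls highlights off gradually instead of clipping them. As a rough illustration (not necessarily the exact curve from the talk), here's John Hable's widely published Uncharted 2 operator in Python:

# Hable's "Uncharted 2" filmic tone map, with the commonly published constants.
A, B, C, D, E, F = 0.15, 0.50, 0.10, 0.20, 0.02, 0.30

def hable(x):
    return ((x * (A * x + C * B) + D * E) / (x * (A * x + B) + D * F)) - E / F

WHITE = 11.2                              # scene luminance that maps to display white

def filmic(x):
    return hable(x) / hable(WHITE)        # normalize so WHITE lands exactly at 1.0

for lum in (0.1, 1.0, 5.0, 11.2, 50.0):
    print(lum, round(filmic(lum), 3))     # values above 1.0 get clamped by the display

The shoulder is exactly the "pick and choose" step described above: the engine decides how the range the display can't show gets thrown away. With an HDR output target, far less has to be thrown away in the first place.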
 

jmdajr

Member
A monitor with 10-bit colors isn't an HDR monitor. HDR isn't just about 10-bit colors. The main idea behind HDR is higher brightness, or higher contrast. It's about displaying a wider brightness scale found in nature, while the old standard was to master the displayed content to 100 nits, which leaves content looking pretty flat. And if it wasn't obvious, HDR is not about amping up the average brightness of the image to unwatchable levels, just the highlights when it's appropriate. The same goes for deep blacks. 10-bit colors merely help deal with the increased contrast and gamut to create smoother gradients. Wider gamut is the other main ingredient besides brightness/contrast to create a more life-like image.

Personally I'd only call Ultra HD Premium certified panels true HDR, since only those offer the combination of deep blacks, high maximum brightness and a wide color space. I think it was Dolby that did the original research showing that we get benefits out of HDR up to 10K nits of brightness, but currently we're at around the 1K level with the brightest consumer models. HDR is still very much in the early adopter stage, and it'll take a few years for the industry to settle. And that's just for movies and games; broadcast content will take far longer. Switching over from the old 100 nits standard with a limited color space is a huge jump for the industry, and it's far bigger than just going to 4K.

Why now? The necessary tech wasn't really available in mass market form. The film industry saw it as the "next thing" to improve image quality, and TV manufacturers saw it as the next thing to sell new sets with. Gaming is simply following in tow, though I do think it's going to have a substantial impact towards the adoption of HDR, since it's relatively straightforward to include HDR support in games.

This is good stuff.

So the HDMI 2.0 info must be more for the brightness info than the color gamut. HDMI 1.3/1.4 is probably more than enough for that information.

I wonder if SONY is simply offering PS4 owners just more color range, and not true HDR.
 

Anony

Member
A monitor with 10-bit colors isn't an HDR monitor. HDR isn't just about 10-bit colors. The main idea behind HDR is higher brightness, or higher contrast. It's about displaying a wider brightness scale found in nature, while the old standard was to master the displayed content to 100 nits, which leaves content looking pretty flat. And if it wasn't obvious, HDR is not about amping up the average brightness of the image to unwatchable levels, just the highlights when it's appropriate. The same goes for deep blacks. 10-bit colors merely help deal with the increased contrast and gamut to create smoother gradients. Wider gamut is the other main ingredient besides brightness/contrast to create a more life-like image.

okay, going on this point, i purposely left out contrast because that just makes it more convoluted.
going by this, then you're never going to get 'true' hdr if you're using an lcd monitor. only plasma and oled, because they can turn off individual pixels. and since plasma is dead (or dying), oled is the option.
 
For software, this was introduced (or at least my first exposure to it was) with the Source engine; it was one of the biggest selling points/technological achievements for the engine.
at the end of the day, in software, you're manipulating how the lighting works and is displayed, thus giving the HDR effect.
This is a related but different technique called HDRR. It uses internal HDR buffers, but the result is tone-mapped down to SDR with a bloom effect to simulate overbrightness. An HDR display could display a much larger brightness range natively, without the use of bloom.
It's been so many years since that i assumed all game engines use HDR lighting to some degree, but the way sony and microsoft put it, this is some new thing that only game developers are programming for now and needs to be switched on? (this part confuses me)
Software developers need to change the output format and the way the tone-mapping works.
As a pc monitor user, I (believe) have never had this problem though. My monitor, the dell u2410, is an 8-bit panel with a 10-bit (or 12-bit) internal colour processor, which was a top tier colour reproducer for its time (and should still hold up today).
I know that most monitors, even some low tier ips panels are still 6-bit in colour.
The number of bits only tells you the number of brightness steps, not the range or distribution of those steps. HDR monitors have a higher peak brightness and use a transfer function called SMPTE.2084 (or HLG) instead of the "gamma" curves SDR monitors use.

Is 'HDR-enabled TV' just the non-technical term for a 10-bit panel?
HDR =
* 10-bit (or higher) panel
* High contrast panel (FALD LED LCD or OLED, some projectors)
* High peak brightness (>500 nits)
* SMPTE.2084 (for HDR10 and Dolby Vision) and/or HLG transfer function support.
* Metadata support for tone-mapping.
* Usually also a wide color gamut.

If so, I should have had a pseudo 'true' HDR experience for a long time now, since my monitor was capable of processing 10 (or 12) bits of colour.
Nope, there are no consumer HDR monitors available right now.
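And since SMPTE.2084 (PQ) keeps coming up: it's just a fixed curve that maps absolute luminance (up to 10,000 nits) onto code values, spending most of the codes on darker tones where the eye is most sensitive. A rough sketch in Python using the constants from the spec (full-range 10-bit codes for simplicity; real video signals are usually limited range):

# SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance in nits -> 10-bit code value.
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_code(nits, bits=10):
    y = min(max(nits / 10000.0, 0.0), 1.0)            # normalize to the 10,000-nit ceiling
    ym = y ** m1
    signal = ((c1 + c2 * ym) / (1 + c3 * ym)) ** m2   # 0..1 PQ signal
    return round(signal * ((1 << bits) - 1))

for nits in (0.1, 1, 100, 1000, 10000):
    print(nits, pq_code(nits))

Note that 100 nits, the entire old SDR range, only takes up roughly the bottom half of the codes; everything above that is headroom reserved for HDR highlights.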
 
It's now a thing because Sony declared it was a thing last week. Within 48 hours, everyone and their dog suddenly went HDR-gaga, demanding HDR *now* and immediately shopping for an HDR TV. News outlets contributed to the hype by describing HDR as 'life changing'.
Great marketing from Sony.
See also: gamers craving HDR in the Forza Horizon 3 demo release (hint: it doesn't have it, resulting in pissed gamers, because HDR ***NOW NOW NOW***).
 
Because 60fps doesn't sell.

Actually, because 3D didn't sell and TV manufacturers gotta get people to buy new sets.

3D TV - Dead
Smart TVs - Dead
Curved TVs - Never even had a chance

4K and HDR are the next two things that will be pushed to consumers in an effort to get them to buy new sets. And while 4K and HDR are both more worthwhile than the other technologies I listed, people are still not going to rush out like they did for HD and 16:9, since the normal content they consume will not be going 4K for a while, or HDR for even longer.

Broadcast / cable 4K and HDR is a long way off, if ever. At this point I'd settle for a 1080p feed that isn't compressed to hell by the networks / sat providers. If I could just get Blu-ray quality 1080p from Dish Network I'd be happy.
 

GlamFM

Banned
Big, good TVs have gotten pretty cheap pretty fast.

Compared to CRTs, the prices fell ridiculously quickly.


They've been trying to sell us new TVs for a while now.

They tried 3D - stupid
They tried curved - even more stupid

And now they are trying HDR.

I'm a fan of HDR personally, but it's all about us getting rid of our perfectly good panels to buy a new one.
 
From what I understand Dolby Vision TVs can get a firmware update to display HDR10 content.
I know that was the case with the Vizio P series: they supported Dolby Vision out of the box and got a firmware update that allowed them to play HDR10. I'm not sure that's true for every set that ships only able to do DV.
 
This is a related but different technique called HDRR. It uses internal HDR buffers, but the result is tone-mapped down to SDR with a bloom effect to simulate overbrightness. An HDR display could display a much larger brightness range natively, without the use of bloom.

Software developers need to change the output format and the way the tone-mapping works.

The number of bits only tells you the number of steps, not the range or distribution of those steps. HDR monitors have a higher peak brightness and use a transfer function called SMPTE.2084 (or HLG) instead of the "gamma" curves SDR monitors use.


HDR =
* 10-bit (or higher) panel
* High contrast panel (FALD LED LCD or OLED, some projectors)
* High peak brightness (>500 nits)
* SMPTE.2084 (for HDR10 and Dolby Vision) and/or HLG transfer function support.
* Metadata support for tone-mapping.
* Usually also a wide color gamut.


Nope, there are no consumer HDR monitors available right now.

OP needs to update his OP with this post right here. Less go.
 
The HDR 'effect' the OP mentioned is purely the artifact of taking an HDR image and turning it into a non-HDR image. There's always that scene in a game where you're looking out at bright sunlight from a dark cave and the whole screen is blown out right? That's because there's too much of a brightness difference between the lighting in the cave and the lighting outside, and the engine had to pick which part to emphasize and blow out the rest, throwing away all that detail.

On an HDR screen, you should be able to see detail in both the dark cave and bright sky.
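A toy version of that trade-off in Python (the luminance numbers are made up): with one exposure and an SDR output, you have to crush one end or blow out the other.

# Exposing for the cave vs. the sky: a single SDR exposure clips one end.
cave_wall, sky = 0.02, 50.0                   # made-up relative scene luminances

def sdr_pixel(luminance, exposure):
    v = min(luminance * exposure, 1.0)        # apply exposure, hard clip at display white
    return round(v * 255)                     # 8-bit code value

print(sdr_pixel(cave_wall, 20), sdr_pixel(sky, 20))        # cave readable (102), sky blown out (255)
print(sdr_pixel(cave_wall, 0.015), sdr_pixel(sky, 0.015))  # sky readable (191), cave crushed (0)

An HDR output with enough range wouldn't have to make that choice; both values could be sent through and shown with detail.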
 

jmdajr

Member
Wonder when true 120 fps TVs will be a thing. Probably never I am guessing.

This is still good progress though. I think it will do MUCH better than 3D. By a lot.
 

No_Style

Member
Why now? Because the TV manufacturers are ready to put this feature into mainstream audience's hands and the content is ramping up.
 
The standard "what is HDR" image:


[Image: hdr1-100656714-large.jpg]


Of course, this is displayed on your non HDR monitor, so they have to play down the "non HDR" photo to show the comparison. But given the content I've seen in HDR, this is a fair comparison.

Great, so everything is going to look like over-saturated photoshopped garbage.

I loathed this kind of shit.. Now gaming is telling me the next big thing is unrealistic colors and high contrast and over saturation? Jump off a cliff. I'll play things like Duck Game.
 

mattp

Member
Great, so everything is going to look like over-saturated photoshopped garbage.

I loathed this kind of shit.. Now gaming is telling me the next big thing is unrealistic colors and high contrast and over saturation? Jump off a cliff. I'll play things like Duck Game.

its not. ignore this image
the real simple explanation is: higher contrast. blacker blacks, brighter brights. and most important, more small details will be visible in everything. especially bright areas and shadows. it gives the impression of higher quality textures and stuff.
 

jmdajr

Member
Great, so everything is going to look like over-saturated photoshopped garbage.

I loathed this kind of shit.. Now gaming is telling me the next big thing is unrealistic colors and high contrast and over saturation? Jump off a cliff. I'll play things like Duck Game.
They probably need to over-exaggerate the example.
 